-
-
Title
-
Applications of Deep Learning Models for Traffic Prediction Problems.
-
Creator
-
Rahman, Rezaur, Hasan, Samiul, Abdel-Aty, Mohamed, Zaki Hussein, Mohamed, University of Central Florida
-
Abstract / Description
-
Deep learning, coupled with existing sensor-based multiresolution traffic data and future connected technologies, has immense potential to improve traffic operations and management. But to deal with complex transportation problems, we need efficient modeling frameworks for deep learning models. In this study, we propose two different modeling frameworks that use deep Long Short-Term Memory neural network (LSTM NN) models to predict future traffic states (speed and signal queue length). In the first problem, we present a modeling framework that uses a deep LSTM NN model to predict freeway traffic speeds under regular traffic conditions as well as under extreme traffic demand, such as a hurricane evacuation. The approach is tested using real-world traffic data collected during Hurricane Irma's evacuation on Interstate 75 (I-75), a major evacuation route in Florida. We perform several experiments predicting speeds 5, 10, and 15 minutes ahead of the current time. The results are compared against traditional prediction models such as K-Nearest Neighbors (KNN), Artificial Neural Network (ANN), and Auto-Regressive Integrated Moving Average (ARIMA). We find that the LSTM NN outperforms these parametric and non-parametric models. Beyond the improvement in traffic operations, the proposed method can be integrated with evacuation traffic management systems for better evacuation operations. In the second problem, we develop a data-driven real-time queue length prediction technique using a deep LSTM NN model. We consider a connected corridor where information from vehicle detectors (located at the intersections) is shared with consecutive intersections. We assume that the queue length at an intersection in the next cycle depends on the queue lengths of the target intersection and two upstream intersections in the current cycle.
We use InSync Adaptive Traffic Control System (ATCS) data to train a Long Short-Term Memory neural network model that captures the time-dependent patterns of the queue at a signal. To select the best combination of hyperparameters, we use a sequential model-based optimization (SMBO) technique. Our experimental results show that the proposed modeling framework performs very well in predicting queue length. Although we run our experiments predicting queue length for a single movement, the proposed method can be applied to other movements as well. Queue length prediction is a crucial part of an ATCS for optimizing control parameters, and this method can improve the existing signal optimization techniques used in ATCSs.
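The two LSTM frameworks above share the same supervised setup: slide a fixed window of past observations over the series and predict the next value. A minimal sketch of that windowing, plus a single numpy LSTM-cell step, is given below; the speed values, window length, and randomly initialized weights are illustrative assumptions, not the study's data or trained model.

```python
import numpy as np

def make_windows(series, n_lags):
    """Turn a 1-D series into supervised pairs: n_lags past values -> next value."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = np.array(series[n_lags:])
    return X, y

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gate order in the stacked weights is [input, forget, cell, output]."""
    z = W @ np.atleast_1d(x) + U @ h + b
    n = h.size
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    i, f, o = sig(z[:n]), sig(z[n:2 * n]), sig(z[3 * n:])
    g = np.tanh(z[2 * n:3 * n])
    c_new = f * c + i * g          # forget some old state, write some new
    h_new = o * np.tanh(c_new)     # exposed hidden state
    return h_new, c_new

# Hypothetical 5-min speed readings (mph); predict each value from the 3 before it.
speeds = [62, 60, 55, 41, 38, 45, 57, 63]
X, y = make_windows(speeds, n_lags=3)

rng = np.random.default_rng(0)
hidden = 4
W = 0.1 * rng.standard_normal((4 * hidden, 3))
U = 0.1 * rng.standard_normal((4 * hidden, hidden))
b = np.zeros(4 * hidden)
h, c = lstm_step(X[0], np.zeros(hidden), np.zeros(hidden), W, U, b)
```

In practice the cell would be trained (e.g., by backpropagation through time) and stacked into a deep network; the sketch only shows the data layout and one forward step.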
-
Date Issued
-
2019
-
Identifier
-
CFE0007516, ucf:52654
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007516
-
-
Title
-
A MODEL INTEGRATED MESHLESS SOLVER (MIMS) FOR FLUID FLOW AND HEAT TRANSFER.
-
Creator
-
Gerace, Salvadore, Kassab, Alain, University of Central Florida
-
Abstract / Description
-
Numerical methods for solving partial differential equations are commonplace in the engineering community, and their popularity can be attributed to the rapid performance improvement of modern workstations and desktop computers. The ubiquity of computer technology has given all areas of engineering access to detailed thermal, stress, and fluid flow analysis packages capable of performing complex studies of current and future designs. The rapid pace of computer development, however, has begun to outstrip efforts to reduce analysis overhead. As such, most commercially available software packages are now limited by the human effort required to prepare, develop, and initialize the necessary computational models. Primarily because of the mesh-based analysis methods these packages use, the dependence on model preparation greatly limits the accessibility of these analysis tools. In response, the so-called meshless or mesh-free methods have seen considerable interest, as they promise to greatly reduce the human interaction needed during model setup. However, despite the success of these methods in areas demanding high degrees of model adaptability (such as crack growth, multi-phase flow, and solid friction), meshless methods have yet to gain acceptance as a viable alternative to more traditional solution approaches in general solution domains. Although this may be due, at least in part, to the relative youth of the techniques, another potential cause is the lack of focus on developing robust methodologies. The failure to approach development from a practical perspective has prevented researchers from obtaining commercially relevant meshless methodologies that reach the full potential of the approach.
The primary goal of this research is to present a novel meshless approach called MIMS (Model Integrated Meshless Solver), which establishes the method as a generalized solution technique capable of competing with more traditional PDE methodologies (such as the finite element and finite volume methods). This was accomplished by developing a robust meshless technique as well as a comprehensive model generation procedure. By closely integrating the model generation process into the overall solution methodology, the presented techniques are able to fully exploit the strengths of the meshless approach to achieve levels of automation, stability, and accuracy currently unseen in the area of engineering analysis. Specifically, MIMS implements a blended meshless solution approach which utilizes a variety of shape functions to obtain a stable and accurate iteration process. This solution approach is then integrated with a newly developed, highly adaptive model generation process which employs a quaternary triangular surface discretization for the boundary, a binary-subdivision discretization for the interior, and a unique shadow layer discretization for near-boundary regions. Together, these discretization techniques are able to achieve directionally independent, automatic refinement of the underlying model, allowing the method to generate accurate solutions without the need for intermediate human involvement. In addition, by coupling the model generation with the solution process, the presented method is able to address the issue of ill-constructed geometric input (small features, poorly formed faces, etc.) to provide an intuitive yet powerful approach to solving modern engineering analysis problems.
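As a toy illustration of the meshless idea (scattered collocation nodes, no mesh connectivity), the sketch below solves a 1D Poisson problem with Gaussian RBF collocation. This is a generic Kansa-style scheme under assumed parameters (node count, shape parameter), not the blended MIMS shape functions or its model-generation pipeline.

```python
import numpy as np

# Scattered collocation nodes -- no mesh or connectivity information needed.
nodes = np.linspace(0.0, 1.0, 25)
eps = 8.0  # Gaussian shape parameter (assumed; choosing it well is part of the art)

def phi(x, xj):
    """Gaussian radial basis function."""
    return np.exp(-(eps * (x - xj)) ** 2)

def phi_xx(x, xj):
    """Second derivative of the Gaussian RBF with respect to x."""
    r = x - xj
    return (4.0 * eps**4 * r**2 - 2.0 * eps**2) * np.exp(-(eps * r) ** 2)

# Kansa collocation for u'' = -pi^2 sin(pi x), u(0) = u(1) = 0 (exact: u = sin(pi x)).
A = np.empty((nodes.size, nodes.size))
rhs = np.empty(nodes.size)
for i, x in enumerate(nodes):
    if i == 0 or i == nodes.size - 1:      # boundary rows enforce u = 0
        A[i], rhs[i] = phi(x, nodes), 0.0
    else:                                  # interior rows enforce the PDE
        A[i], rhs[i] = phi_xx(x, nodes), -np.pi**2 * np.sin(np.pi * x)

coef = np.linalg.solve(A, rhs)

def u(xs):
    """Meshless solution: a weighted sum of shape functions, evaluable anywhere."""
    return phi(np.asarray(xs)[:, None], nodes) @ coef

xs = np.linspace(0.0, 1.0, 101)
max_err = float(np.max(np.abs(u(xs) - np.sin(np.pi * xs))))
```

Because the solution is a sum of smooth basis functions centered on scattered points, refinement amounts to dropping in more nodes, which is the property that makes automated model generation attractive.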
-
Date Issued
-
2010
-
Identifier
-
CFE0003299, ucf:48489
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003299
-
-
Title
-
Characterization of a Spiking Neuron Model via a Linear Approach.
-
Creator
-
Jabalameli, Amirhossein, Behal, Aman, Hickman, James, Haralambous, Michael, University of Central Florida
-
Abstract / Description
-
In the past decade, characterizing spiking neuron models has been extensively researched as an essential issue in computational neuroscience. In this thesis, we examine the estimation problem for two different neuron models. In Chapter 2, we propose a modified Izhikevich model with an adaptive threshold. In our two-stage estimation approach, a linear least squares method and a linear model of the threshold are derived to predict the locations of neuronal spikes. However, the desired results are not obtained, and the predicted model is unsuccessful in duplicating the spike locations. Chapter 3 is focused on the parameter estimation problem for a multi-timescale adaptive threshold (MAT) neuronal model. Using the dynamics of a non-resetting leaky integrator equipped with an adaptive threshold, a constrained iterative linear least squares method is implemented to fit the model to the reference data. Through manipulation of the system dynamics, the threshold voltage can be obtained as a realizable model that is linear in the unknown parameters. This linearly parametrized realizable model is then utilized inside a prediction-error-based framework to identify the threshold parameters with the purpose of predicting a single neuron's precise firing times. This estimation scheme is evaluated using both synthetic data obtained from an exact model and experimental data obtained from in vitro rat somatosensory cortical neurons. Results show the ability of this approach to fit the MAT model to different types of reference data.
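The key trick in Chapter 3, that the threshold becomes linear in the unknown parameters once the timescales are fixed, can be sketched with ordinary least squares. The two-exponential threshold, the timescale values, and the synthetic data below are illustrative assumptions, not the thesis's actual MAT parameters or its constrained iterative scheme.

```python
import numpy as np

# Multi-timescale adaptive threshold: theta(t) = omega + a1*exp(-t/tau1) + a2*exp(-t/tau2).
# With the timescales tau1, tau2 FIXED, theta is linear in (omega, a1, a2),
# so ordinary least squares recovers them from reference data.
tau1, tau2 = 10.0, 200.0              # ms; illustrative fixed timescales
t = np.linspace(0.0, 500.0, 200)      # time since the last spike, ms

true_params = np.array([20.0, 15.0, 3.0])    # [omega, a1, a2] used to synthesize data
basis = np.column_stack([np.ones_like(t), np.exp(-t / tau1), np.exp(-t / tau2)])
theta = basis @ true_params + 0.01 * np.sin(t)   # synthetic "reference" threshold trace

est, *_ = np.linalg.lstsq(basis, theta, rcond=None)
```

Because the model is linear in the parameters, the fit is a single convex least-squares solve rather than a nonlinear search, which is what makes the approach tractable on experimental traces.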
-
Date Issued
-
2015
-
Identifier
-
CFE0005958, ucf:50803
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005958
-
-
Title
-
Field Evaluation of Insync Adaptive Traffic Signal Control System in Multiple Environments Using Multiple Approaches.
-
Creator
-
Shafik, Md Shafikul Islam, Radwan, Essam, Abou-Senna, Hatem, Eluru, Naveen, University of Central Florida
-
Abstract / Description
-
Since the beginning of the signalization of intersections, managing traffic congestion has been one of the most critical challenges, especially for cities and urbanized areas. Almost all municipal agencies struggle to manage the complexities associated with traffic congestion and signal control. The Adaptive Traffic Control System (ATCS), an advanced and major technological component of Intelligent Transportation Systems (ITS), is considered the most dynamic, real-time traffic management technology and has the potential to manage rapidly varying traffic flow more effectively than current state-of-the-art traffic management practices. InSync ATCS is deployed in multiple states throughout the US and is expanding on a large scale. Although several measure-of-effectiveness studies have been performed previously, the performance of InSync is not beyond question, especially because the previous studies failed to account for multiple environments, approaches, and variables. Most studies were carried out through a single approach, using a simple/naïve before-after method without any control group or control parameter. They also lacked adequate statistical analysis and did not control for history, maturation, and regression artifacts. An attempt to evaluate the InSync ATCS under varying conditions through multiple approaches was undertaken for the SR-434 and Lake Underhill corridors in Orange County, Florida. A before-after study with an adjacent corridor as a control group and volume as a control parameter was performed, in which data on multiple variables were collected by three distinct procedures. The average/floating-car method was used as a rudimentary data collection process, and the 'BlueMac' and 'InSync' system databases served as secondary data sources. Data were collected three times a day, on weekdays and weekends, before and after the InSync ATCS was deployed. Results show variation in both performance and scale.
InSync proved ineffective in some of the cases, especially for left turns, for total intersection queue/delay, and when intersection volumes approach capacity. The results are verified through appropriate statistical analysis.
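The role of the control corridor can be illustrated with a minimal before-after adjustment: the treated corridor's change is corrected by the change observed on the untouched control corridor. The travel-time numbers below are hypothetical, and this simple difference-in-differences is only a sketch of the study's fuller statistical analysis.

```python
# Naive before-after change on the treated corridor vs. the same change
# adjusted by an untouched control corridor (a simple difference-in-differences).
treated_before, treated_after = 210.0, 168.0   # mean travel time (s); hypothetical
control_before, control_after = 200.0, 190.0   # control corridor, same periods

naive_change = treated_after - treated_before  # -42 s: looks like a large gain
background = control_after - control_before    # -10 s happened anyway on the control
adjusted_effect = naive_change - background    # -32 s attributable to the treatment
```

A naive before-after method would report the full -42 s as the system's benefit; the control corridor reveals that part of the change would have occurred regardless, which is exactly the artifact the study criticizes in earlier evaluations.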
-
Date Issued
-
2017
-
Identifier
-
CFE0006915, ucf:51687
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006915
-
-
Title
-
Investigating The Relationship Between Adverse Events and Infrastructure Development in an Active War Theater Using Soft Computing Techniques.
-
Creator
-
Cakit, Erman, Karwowski, Waldemar, Lee, Gene, Thompson, William, Mikusinski, Piotr, University of Central Florida
-
Abstract / Description
-
The military has recently recognized the importance of taking sociocultural factors into consideration. Human Social Culture Behavior (HSCB) modeling has therefore been receiving much attention in current and future operational requirements, with the aim of successfully understanding the effects of social and cultural factors on human behavior. Different kinds of modeling approaches are used in this field, and so far none has been widely accepted. HSCB modeling needs the capability to represent complex, ill-defined, and imprecise concepts, and soft computing models can deal with such concepts. There has been no study to date on the use of any computational methodology for representing the relationship between adverse events and infrastructure development investments in an active war theater. This study investigates the relationship between adverse events and infrastructure development projects in an active war theater using soft computing techniques, including fuzzy inference systems (FIS), artificial neural networks (ANNs), and adaptive neuro-fuzzy inference systems (ANFIS), which benefit directly from their accuracy in prediction applications. Fourteen developmental and economic improvement project types were selected; allocated budget values and numbers of projects at different time periods, urban and rural population density, and the total number of adverse events in the previous month were selected as independent variables. A total of four outputs, reflecting adverse events in terms of the numbers of people killed, wounded, and hijacked and the total number of adverse events, were estimated. For each model, the data were grouped for training and testing as follows: the years 2004 through 2009 for training and the year 2010 for testing. Ninety-six different models were developed and investigated for Afghanistan, and the country was divided into seven regions for analysis purposes.
The performance of each model was investigated and compared with that of all other models using the calculated mean absolute error (MAE) values and the prediction accuracy within a ±1 error range (the difference between actual and predicted values). Furthermore, sensitivity analysis was performed to determine the effects of the input values on the dependent variables and to rank the top ten input parameters in order of importance. According to the results obtained, it was concluded that ANNs, FIS, and ANFIS are useful modeling techniques for predicting the number of adverse events based on historical development or economic project data. When model accuracy was calculated based on the MAE for each of the models, the ANN had better predictive accuracy than the FIS and ANFIS models in general, as demonstrated by the experimental results. The percentages of prediction accuracy within the ±1 error range were around 90%. The sensitivity analysis results show that the importance of economic development projects varies with region, population density, and the occurrence of adverse events in Afghanistan. For the purpose of allocating resources and developing regions, the results can be summarized as an examination of the relationship between adverse events and infrastructure development in an active war theater; the emphasis was on predicting the occurrence of events and assessing the potential impact of regional infrastructure development efforts on reducing the number of such events.
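The two evaluation metrics used above, MAE and the share of predictions falling within a ±1 error range, can be computed as follows; the event counts are made-up illustrative values, not the Afghanistan data.

```python
import numpy as np

def mae(actual, predicted):
    """Mean absolute error between observed and predicted counts."""
    return float(np.mean(np.abs(np.asarray(actual) - np.asarray(predicted))))

def within_one(actual, predicted):
    """Share of predictions within +/-1 of the actual value."""
    diff = np.abs(np.asarray(actual) - np.asarray(predicted))
    return float(np.mean(diff <= 1))

# Hypothetical monthly adverse-event counts for one region.
actual    = [3, 0, 5, 2, 1, 4]
predicted = [2, 0, 7, 2, 1, 5]
```

For count data like event tallies, the within-±1 share is often more interpretable than MAE alone, which is presumably why the study reports both.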
-
Date Issued
-
2013
-
Identifier
-
CFE0004826, ucf:49757
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004826
-
-
Title
-
High-efficiency Blue Phase Liquid Crystal Displays.
-
Creator
-
Li, Yan, Wu, Shintson, Saleh, Bahaa, Zeldovich, Boris, Wu, Xinzhang, University of Central Florida
-
Abstract / Description
-
Blue phase liquid crystals (BPLCs) have a delicate lattice structure existing between the chiral nematic and isotropic phases, with a stable temperature range of about 2 K. Because of their short coherence length, these self-assembled nano-structured BPLCs have a fast response time. In the past three decades, the application of BPLCs has been rather limited because of their narrow temperature range. In 2002, Kikuchi et al. developed a polymer stabilization method to extend the blue-phase temperature range to more than 60 K. This opened a new gateway for display and photonic applications. In this dissertation, I investigate the material properties of polymer-stabilized BPLCs. According to Gerber's model, the Kerr constant of a BPLC is linearly proportional to the dielectric anisotropy of the LC host. Therefore, in the frequency domain, the relaxation of the Kerr constant follows the same trend as the dielectric relaxation of the host LC. I have carried out experiments to validate the theoretical predictions and proposed a model, called the extended Cole-Cole model, to describe the relaxation of the Kerr constant. Moreover, because of this linear relationship, the Kerr constant should have the same sign as the dielectric anisotropy of the LC host; that is, a positive or negative Kerr constant results from a positive or negative host LC, respectively. BPLCs with a positive Kerr constant have been studied extensively, but there has been no study of negative polymer-stabilized BPLCs. Therefore, I have prepared a BPLC mixture using a negative-dielectric-anisotropy LC host and investigated its electro-optic properties. I have demonstrated that the induced birefringence and Kerr constant are indeed of negative sign. Owing to the fast response time of BPLCs, color-sequential display is made possible without color breakup. By removing the spatial color filters, the optical efficiency and resolution density are both tripled.
With other advantages, such as being alignment-free and having a wide viewing angle, polymer-stabilized BPLC is emerging as a promising candidate for next-generation displays. However, the optical efficiency of a BPLC cell is relatively low and the operating voltage is quite high with conventional in-plane-switching electrodes. I have proposed several device structures for improving the optical efficiency of transmissive BPLC cells. Significant improvement in transmittance is achieved by using enhanced protrusion electrodes, and 100% transmittance is achievable using a complementary enhanced protrusion electrode structure. A conventional transmissive blue-phase LCD has superb performance indoors, but when exposed to strong sunlight the displayed images can be washed out, leading to degraded contrast ratio and readability. To overcome the sunlight readability problem, a common approach is to adaptively boost the backlight intensity, but the tradeoff is increased power consumption. Here, I have proposed a transflective blue-phase LCD in which the backlight is turned on in dark surroundings while ambient light is used to illuminate the displayed images in bright surroundings. Therefore, a good contrast ratio is preserved even under strong ambient light. I have proposed two transflective blue-phase LCD structures, both of which have a single cell gap, single gamma driving, a reasonably wide viewing angle, low power consumption, and high optical efficiency. Among all 3D technologies, integral imaging is an attractive approach because of its high efficiency and real image depth. However, the optimum observation distance should be adjusted as the displayed image depth changes. This requires a fast focal-length change in an adaptive lens array. BPLC adaptive lenses are a good candidate because of their intrinsically fast response time.
I have proposed several BPLC lens structures that are polarization independent and exhibit a parabolic phase profile in addition to a fast response time. To meet the low-power-consumption requirement set by Energy Star, high optical efficiency is high on the list of priorities for next-generation LCDs. In this dissertation, I have demonstrated new device structures for improving the optical efficiency of a polymer-stabilized transmissive BPLC display, proposed sunlight-readable transflective blue-phase LCDs that utilize ambient light to reduce power consumption, and proposed several blue-phase LC adaptive lenses for high-efficiency 3D displays.
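The sign argument above follows directly from the Kerr relation Δn = λKE²: the induced birefringence inherits the sign of the Kerr constant. The sketch below simply evaluates that relation; the wavelength, field, and Kerr-constant values are illustrative assumptions, not measured properties of the mixtures studied.

```python
# Kerr relation: induced birefringence delta_n = lam * K * E^2, so delta_n
# takes the sign of the Kerr constant K (positive vs. negative host LC).
lam = 550e-9   # probe wavelength, m (assumed)
E = 5e6        # applied electric field, V/m (assumed)

def delta_n(K):
    """Kerr-induced birefringence for Kerr constant K (m/V^2)."""
    return lam * K * E**2

dn_pos = delta_n(10e-9)   # positive-host BPLC: illustrative K = 10 nm/V^2
dn_neg = delta_n(-5e-9)   # negative-host BPLC: illustrative negative K
```

Since the Kerr constant is proportional to the host's dielectric anisotropy, a negative-Δε host flips the sign of Δn, which is the effect the dissertation demonstrates experimentally.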
-
Date Issued
-
2012
-
Identifier
-
CFE0004787, ucf:49725
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004787
-
-
Title
-
Autonomous Recovery of Reconfigurable Logic Devices using Priority Escalation of Slack.
-
Creator
-
Imran, Syednaveed, DeMara, Ronald, Mikhael, Wasfy, Lin, Mingjie, Yuan, Jiann-Shiun, Geiger, Christopher, University of Central Florida
-
Abstract / Description
-
Field Programmable Gate Array (FPGA) devices offer a suitable platform for survivable hardware architectures in mission-critical systems. In this dissertation, active dynamic redundancy-based fault-handling techniques are proposed which exploit the dynamic partial reconfiguration capability of SRAM-based FPGAs. Self-adaptation is realized by employing reconfiguration in the detection, diagnosis, and recovery phases. To extend these concepts to semiconductor aging and process variation in the deep submicron era, resilient adaptable processing systems are sought to maintain quality and throughput requirements despite the vulnerabilities of the underlying computational devices. A new approach to autonomous fault-handling which addresses these goals is developed using only a uniplex hardware arrangement. It operates by observing a health metric to achieve Fault Demotion using Reconfigurable Slack (FaDReS). Here, an autonomous fault isolation scheme is employed which neither requires test vectors nor suspends the computational throughput, but instead observes the value of a health metric based on runtime input. The deterministic flow of the fault isolation scheme guarantees success in a bounded number of reconfigurations of the FPGA fabric. FaDReS is then extended to the Priority Using Resource Escalation (PURE) online redundancy scheme, which considers fault-isolation latency and throughput trade-offs under a dynamic spare arrangement. While deep-submicron designs introduce new challenges, the use of adaptive techniques is seen to provide several promising avenues for improving resilience. The scheme developed is demonstrated through the hardware design of various signal processing circuits and their implementation on a Xilinx Virtex-4 FPGA device.
These include a Discrete Cosine Transform (DCT) core, a Motion Estimation (ME) engine, a Finite Impulse Response (FIR) filter, a Support Vector Machine (SVM), and Advanced Encryption Standard (AES) blocks, in addition to MCNC benchmark circuits. A significant reduction in power consumption is achieved, ranging from 83% for low-motion-activity scenes to 12.5% for high-motion-activity video scenes, in a novel ME engine configuration. For a typical benchmark video sequence, PURE is shown to maintain a PSNR baseline near 32 dB. The diagnosability, reconfiguration latency, and resource overhead of each approach are analyzed. Compared to previous alternatives, PURE maintains a PSNR within 4.02 dB to 6.67 dB of the fault-free baseline by escalating healthy resources to higher-priority signal processing functions. The results indicate the benefits of priority-aware resiliency over conventional redundancy approaches in terms of fault recovery, power consumption, and resource-area requirements. Together, these provide a broad range of strategies to achieve autonomous recovery of reconfigurable logic devices under a variety of constraints, operating conditions, and optimization criteria.
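The priority-escalation idea behind PURE, in which healthy resources serve the most critical signal-processing functions first, can be sketched in a few lines. The region names, priorities, and greedy assignment below are a hypothetical simplification for illustration, not the actual FPGA reconfiguration controller.

```python
# Greedy priority escalation: healthy reconfigurable regions serve the
# highest-priority functions first; faulty regions absorb low-priority work.
def escalate(resources, functions):
    """resources: {name: healthy?}; functions: [(priority, name)], higher = more critical."""
    healthy = [r for r, ok in resources.items() if ok]
    faulty = [r for r, ok in resources.items() if not ok]
    assignment = {}
    for _, fn in sorted(functions, reverse=True):   # most critical first
        pool = healthy if healthy else faulty       # escalate healthy slack first
        if not pool:
            break
        assignment[fn] = pool.pop(0)
    return assignment

regions = {"R0": True, "R1": False, "R2": True}     # R1 has a diagnosed fault
tasks = [(3, "DCT"), (1, "FIR"), (2, "ME")]         # hypothetical priorities
plan = escalate(regions, tasks)
```

The point of the sketch is the ordering: unlike plain redundancy, a degraded system keeps its highest-priority functions on healthy fabric and demotes only the least critical work to suspect regions.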
-
Date Issued
-
2013
-
Identifier
-
CFE0005006, ucf:50005
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005006
-
-
Title
-
The Challenges of Young-Typed Jobs and How Older Workers Adapt.
-
Creator
-
Reeves, Michael, Fritzsche, Barbara, Dipboye, Robert, Matusitz, Jonathan, University of Central Florida
-
Abstract / Description
-
This study sought to explore the challenges faced by older workers who do not fit the age-type of their jobs and how older workers adapt to overcome those challenges. Specifically, I surveyed a national sample of 227 workers 50 years of age and older, in a wide variety of jobs, on measures of perceived age discrimination and adaptation behaviors. I found that fit, as determined by career timetables theory, but not prototype matching theory, successfully predicted perceived age discrimination. Specifically, more age discrimination was perceived when fewer older workers occupied a job. Additionally, multiple regression analysis showed that career timetables theory, prototype matching theory, and measures of perceived discrimination interacted to predict adaptation behaviors. That is, older workers made more efforts to appear younger at work when they perceived age discrimination in jobs occupied by fewer older workers, and older women expressed greater desires to appear younger at work when they perceived age discrimination in jobs viewed as more appropriate for younger workers. Although older workers made a wide variety of efforts to appear younger at work, from changing the way they dressed to undergoing surgical procedures, the adaptation efforts believed to be the most effective against age discrimination were oriented more toward enhancing job performance than toward enhancing one's appearance. It is especially troubling that greater perceived age discrimination was found in young-typed jobs than in old-typed jobs, given that the number of older workers occupying young-typed jobs is expected to grow rapidly in the near future and that perceived discrimination is associated with mental and physical consequences for older adults. Understanding effective adaptations to age discrimination is a valuable first step in helping older workers overcome the disadvantages they may face in the workplace, especially when they occupy young-typed jobs.
Implications for theory and research are discussed.
-
Date Issued
-
2013
-
Identifier
-
CFE0005050, ucf:49947
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005050
-
-
Title
-
Evolution and distribution of phenotypic diversity in the venom of Mojave Rattlesnakes (Crotalus scutulatus).
-
Creator
-
Strickland, Jason, Savage, Anna, Parkinson, Christopher, Hoffman, Eric, Rokyta, Darin, University of Central Florida
-
Abstract / Description
-
Show moreIntraspecific phenotype diversity allows for local adaption and the ability for species to respond to changing environmental conditions, enhancing survivability. Phenotypic variation could be stochastic, genetically based, and/or the result of different environmental conditions. Mojave Rattlesnakes, Crotalus scutulatus, are known to have high intraspecific venom variation, but the geographic extent of the variation and factors influencing venom evolution are poorly understood. Three primary venom types have been described in this species based on the presence (Type A) or absence (Type B) of a neurotoxic phospholipase A2 called Mojave toxin and an inverse relationship with the presence of snake venom metalloproteinases (SVMPs). Individuals that contain both Mojave toxin and SVMPs, although rare, are the third, and designated Type A + B. I sought to describe the proteomic and transcriptomic venom diversity of C. scutulatus across its range and test whether diversity was correlated with genetic or environmental differences. This study includes the highest geographic sampling of Mojave Rattlesnakes and includes the most venom-gland transcriptomes known for one species. Of the four mitochondrial lineages known, only one was monophyletic for venom type. Environmental variables poorly correlated with the phenotypes. Variability in toxin and toxin family composition of venom transcriptomes was largely due to differences in transcript expression. Four of 19 toxin families identified in C. scutulatus account for the majority of differences in toxin number and expression variation. I was able to determine that the toxins primarily responsible for venom types are inherited in a Mendelian fashion and that toxin expression is additive when comparing heterozygotes and homozygotes. Using the genetics to define venom type is more informative and the Type A + B phenotype is not unique, but rather heterozygous for the PLA2 and/or SVMP alleles. 
Intraspecific venom variation in C. scutulatus highlights the need for fine scale ecological and natural history information to understand how phenotypic diversity is generated and maintained geographically through time.
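The additive inheritance described in the abstract can be illustrated with a minimal sketch. This is not the dissertation's analysis; the function name and per-allele levels below are hypothetical, and the model simply assumes expression scales linearly with diploid allele copy number.

```python
def toxin_expression(copies: int, per_allele_level: float) -> float:
    """Expression under a purely additive model: the level is proportional
    to the number of toxin-allele copies (0 = absent, 1 = heterozygous,
    2 = homozygous)."""
    if copies not in (0, 1, 2):
        raise ValueError("diploid genotypes carry 0, 1, or 2 copies")
    return copies * per_allele_level

# Under this model a Type A + B heterozygote (one PLA2 allele, one SVMP
# allele) expresses half the homozygote level of each toxin.
homozygote = toxin_expression(2, per_allele_level=10.0)    # e.g. Type A
heterozygote = toxin_expression(1, per_allele_level=10.0)  # e.g. Type A + B
assert heterozygote == homozygote / 2
```

This is the sense in which "Type A + B" falls out of the genetics: it is not a distinct allele, just the heterozygous combination of the two.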
-
Date Issued
-
2018
-
Identifier
-
CFE0007252, ucf:52198
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007252
-
-
Title
-
UTILIZING EDGE IN IOT AND VIDEO STREAMING APPLICATIONS TO REDUCE BOTTLENECKS IN INTERNET TRAFFIC.
-
Creator
-
Akpinar, Kutalmis, Hua, Kien, Zou, Changchun, Turgut, Damla, Wang, Jun, University of Central Florida
-
Abstract / Description
-
Internet data traffic has surged due to the increasing demand for multimedia content. According to a recent study, 80% of Internet traffic will be video by 2022, and the number of IoT devices on the Internet will be double the human population. While IoT infrastructure standards are still nonexistent, enterprise solutions tend to encourage cloud-based designs, causing an additional surge of data over the Internet. This study proposes solutions that bring video traffic and IoT computation back to the edges of the network, so that costly Internet infrastructure upgrades are not necessary. An efficient way to prevent this surge for IoT is to push application-specific computation to the edge of the network, close to where the data is generated, so that large volumes of data can be eliminated before delivery to the cloud. In this study, an event query language and processing environment is provided to process events from various devices. The query processing environment brings application developers, sensor infrastructure providers, and end users together. It uses boolean events as the streaming and processing units, which addresses device heterogeneity and pushes data-intensive tasks to the edge of the network.
The second focus of the study is Video-on-Demand (VoD) applications. A characteristic of VoD traffic is its high redundancy: due to demand for popular content, the same video traffic flows through an Internet Service Provider's network as overlapping but separate streams. In previous studies on redundancy elimination, overlapping streams are merged at the link level by receiving each packet only for the first stream and re-using it for the subsequent duplicated streams. In this study, we significantly improve these techniques by introducing a merger-aware routing method.
Our final focus is increasing the utilization of Content Delivery Network (CDN) servers at the edge of the network to reduce long-distance traffic. The proposed system uses Software-Defined Networking (SDN) to route adaptive video streaming clients to the best available CDN servers in terms of network availability. While performing this network assistance, the system does not reveal the video request information to the network provider, thus enabling privacy protection for encrypted streams. Request routing is performed at the segment level for adaptive streaming, which makes it possible to re-route the client to the best available CDN server without interruption if network conditions change during the stream.
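The edge-filtering idea in the abstract, eliminating data before it reaches the cloud by processing boolean events close to the sensors, can be sketched as follows. This is an illustration only: the `Event` type, `edge_filter` function, and query shape are hypothetical stand-ins, not the dissertation's actual event query language or API.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Event:
    sensor_id: str
    fired: bool       # boolean event: the sensor's condition held at this instant
    timestamp: float

def edge_filter(stream: Iterable[Event], query_sensors: set[str]) -> Iterator[Event]:
    """Run at the network edge, next to the sensors: forward only the true
    boolean events that a registered query subscribes to, discarding the
    rest so the raw data volume never crosses into the cloud."""
    for ev in stream:
        if ev.sensor_id in query_sensors and ev.fired:
            yield ev

# A query such as "motion AND door_open" only needs those two sensors'
# true events; everything else is eliminated before leaving the edge.
raw = [Event("motion", True, 0.0), Event("temp", True, 0.1),
       Event("door_open", False, 0.2), Event("motion", True, 0.3)]
forwarded = list(edge_filter(raw, {"motion", "door_open"}))
assert [e.sensor_id for e in forwarded] == ["motion", "motion"]
```

The design point is that filtering happens where data is generated, so heterogeneous devices only need to agree on the boolean-event unit, not on a shared data format.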
-
Date Issued
-
2019
-
Identifier
-
CFE0007882, ucf:52774
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007882
-
-
Title
-
Defining Effective Teacher Practices among Students with Emotional Behavioral Disabilities.
-
Creator
-
Mayes, Zerek, Martin, Suzanne, Boote, David, Butler, S. Kent, Berrio, Gabriel, University of Central Florida
-
Abstract / Description
-
This phenomenological study examined the lived experiences of special education teachers who worked with students with emotional behavioral disabilities (EBD) across various urban settings and educative environments. Although the overall percentage of students receiving special education services has increased, the percentage of students with EBD served among all school-aged children and youth has remained below 1% (U.S. Department of Education, National Center for Education Statistics, 2018). The failure of current reform efforts to improve the academic achievement of students with EBD brings the roles, responsibilities, and practices of teachers, and their preparation, into view. This study examined the impact of culture on the attitudes, beliefs, and practices of special education teachers. Semi-structured interviews were conducted with eight participants (N = 8). A thematic analysis resulted in three overarching themes: (a) the essentials: keys to student engagement, (b) frustrations regarding effective program implementation, and (c) elements of an effective program. This study exposed multiple factors affecting the effectiveness of special educators' practices and offered recommendations for teachers, schools, districts, policies, and future research.
-
Date Issued
-
2019
-
Identifier
-
CFE0007682, ucf:52510
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007682
-
-
Title
-
Specialty Fiber Lasers and Novel Fiber Devices.
-
Creator
-
Jollivet, Clemence, Schulzgen, Axel, Moharam, Jim, Richardson, Martin, Mafi, Arash, University of Central Florida
-
Abstract / Description
-
At the dawn of the 21st century, the field of specialty optical fibers experienced a scientific revolution with the introduction of the stack-and-draw technique, an advanced, multi-step fiber fabrication method that enabled the creation of well-controlled micro-structured designs. Since then, an extremely wide variety of finely tuned fiber structures have been demonstrated, including novel materials and novel designs. As the complexity of fiber designs increased, highly controlled fabrication processes became critical. To determine the ability of a novel fiber design to deliver light with properties tailored to a specific application, several mode analysis techniques were reported, addressing the recurring need for in-depth fiber characterization. The first part of this dissertation details a novel experiment that achieves modal decomposition with capabilities reaching beyond the limits of existing mode analysis techniques. As a result, individual transverse modes carrying between ~0.01% and ~30% of the total light were resolved with unmatched accuracy. Furthermore, this approach was employed to decompose the light guided in Large-Mode-Area (LMA) fiber, Photonic Crystal Fiber (PCF), and Leakage Channel Fiber (LCF). The single-mode performances were evaluated and compared, experimentally determining the suitability of each specialty fiber design for power-scaling applications in fiber laser systems.
The second part of this dissertation is dedicated to novel specialty fiber laser systems. First, challenges related to the monolithic integration of novel and complex specialty fiber designs in all-fiber systems were addressed. Poor design and size compatibility between specialty fibers and conventional fiber-based components limits their monolithic integration due to high coupling loss and unstable performance. Here, novel all-fiber Mode-Field Adapter (MFA) devices made of selected segments of Graded-Index Multimode Fiber (GIMF) were implemented to mitigate the coupling losses between an LMA PCF and a conventional Single-Mode Fiber (SMF), presenting an initial 18-fold mode-field-area mismatch. It was experimentally demonstrated that the overall transmission in the mode-matched fiber chain was increased by more than 11 dB (the MFA was a 250 µm piece of 50 µm core diameter GIMF). This approach was further employed to assemble monolithic fiber laser cavities combining an active LMA PCF and fiber Bragg gratings (FBG) in conventional SMF. It was demonstrated that intra-cavity mode-matching results in efficient (60%) and narrow-linewidth (200 pm) laser emission at the FBG wavelength.
In the last section of this dissertation, monolithic Multi-Core Fiber (MCF) laser cavities were reported for the first time. Compared to existing MCF lasers, renowned for high-brightness beam delivery after selection of the in-phase supermode, this new generation of 7-coupled-core Yb-doped fiber laser uses the gain from several supermodes simultaneously. To uncover mode-competition mechanisms during amplification and the complex dynamics of multi-supermode lasing, novel diagnostic approaches were demonstrated. After characterizing the laser behavior, the first observations of self-mode-locking in linear MCF laser cavities were made.
-
Date Issued
-
2014
-
Identifier
-
CFE0005354, ucf:50491
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005354
-
-
Title
-
Dual Branding: An Investigative Look into Dual Branding's Position within the Concept of Brand Alliance in the Hotel Industry.
-
Creator
-
Ronzoni, Giulio, Fyall, Alan, Torres Areizaga, Edwin, Singh, Dipendra, Weinland, Jeffrey, Smith, Scott, University of Central Florida
-
Abstract / Description
-
The purpose of this dissertation was to investigate, in an exploratory way, the state of the art of the application of brand alliances, with a particular focus on the practice of dual branding, in the field of lodging. More precisely, this research aimed to identify and evaluate the determinants of industry adoption of, and customer satisfaction with, intra-company dual branding strategies in the US lodging industry. The primary purpose of this study was to determine the efficacy of dual branding in the lodging industry, a phenomenon that is still insufficiently explored in the literature. In fact, the scarcity of lodging-specific literature forced this study to consider research on other segments and industries where dual branding strategies have been studied. This study therefore intended to expand the existing body of knowledge, advancing the theory of brand alliance from industry and consumer perspectives, as well as to adapt, refine, and utilize a scale suitable for measuring customer satisfaction at dual-branded hotels. This dissertation used an exploratory sequential mixed-methods approach. In the first, qualitative phase, face-to-face and telephone interviews were conducted with operational hotel managers, corporate hotel managers, managers, owners, and presidents of real estate development and management companies, and hotel and lodging associations' professionals. Beyond the relevant findings obtained through these interviews, themes, constructs, and variables useful in the refinement and adaptation of a dual branding customer satisfaction scale were attained.
Consequently, the second, quantitative phase consisted of the online administration of a scenario-based questionnaire to customers of a dual-branded lodging property, aimed at identifying and evaluating the determinants of customer satisfaction. The ultimate purpose of this research has been to understand the main issues in implementing dual branding practices and strategies in the lodging context: in particular, to highlight and provide managerial, theoretical, methodological, and practical implications and recommendations for the US lodging industry in the adoption of intra-company dual branding strategies. The suggestions offered in the study are timely given current developments in the lodging industry, offering implications for both academia and industry.
-
Date Issued
-
2019
-
Identifier
-
CFE0007716, ucf:52411
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007716
-
-
Title
-
The Continuing Anglican Metamorphosis: Introducing the Adapted Integrated Model.
-
Creator
-
L'Hommedieu, John, Gay, David, Grauerholz, Elizabeth, Carter, Shannon, University of Central Florida
-
Abstract / Description
-
The purpose of this thesis is to develop and test the Advanced Integrated Model, a typological model in the tradition of Weber's interpretive sociology, as an asset in explaining recent transformations in American Episcopal-Anglican organizations. The study includes an assessment of the church-sect tradition in the sociology of religion and a summary overview of Weber's interpretive sociology, with special emphasis on the nature and construction of ideal-types and their use in analysis. To illustrate the effectiveness of the model, a number of institutional rivalries confronting contemporary Episcopal-Anglican organizations are identified and shown to be explainable only from a sociological perspective, and not simply as "in house" institutional problems. The present work sheds light on parent-child conflicts in religious organizations and reopens discussion about the theoretical value of ideal-types in general, and church-sect typologies in particular, when utilized from a comparative-historical perspective.
-
Date Issued
-
2012
-
Identifier
-
CFE0004565, ucf:49209
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004565
-
-
Title
-
Adaptive Architectural Strategies for Resilient Energy-Aware Computing.
-
Creator
-
Ashraf, Rizwan, DeMara, Ronald, Lin, Mingjie, Wang, Jun, Jha, Sumit, Johnson, Mark, University of Central Florida
-
Abstract / Description
-
Reconfigurable logic or Field-Programmable Gate Array (FPGA) devices have the ability to dynamically adapt their computational circuits based on user-specified or operating-condition requirements. Such hardware platforms are utilized in this dissertation to develop adaptive techniques for achieving reliable and sustainable operation while autonomously meeting these requirements. In particular, the properties of resource uniformity and in-field reconfiguration via on-chip processors are exploited to implement Evolvable Hardware (EHW). EHW utilizes genetic algorithms to realize logic circuits at runtime, as directed by an objective function. However, the size of problems solved using EHW has been limited to relatively compact circuits compared with traditional approaches, because the complexity of the genetic algorithm grows with circuit size. To address this scalability challenge, the Netlist-Driven Evolutionary Refurbishment (NDER) technique was designed and implemented herein to enable on-the-fly permanent fault mitigation in FPGA circuits. NDER has been shown to refurbish relatively large benchmark circuits compared to related works. Additionally, Design Diversity (DD) techniques, which aid such evolutionary refurbishment, are proposed, and the efficacy of various DD techniques is quantified and evaluated.
Similarly, there exists a growing need for adaptable logic datapaths in custom-designed nanometer-scale ICs to ensure operational reliability in the presence of Process, Voltage, and Temperature (PVT) and transistor-aging variations owing to decreased feature sizes for electronic devices. Without such adaptability, excessive design guardbands are required to maintain the desired integration and performance levels. To address these challenges, the circuit-level technique of Self-Recovery Enabled Logic (SREL) was designed herein. At design time, vulnerable portions of the circuit, identified using conventional Electronic Design Automation tools, are replicated to provide post-fabrication adaptability via intelligent techniques. In-situ timing sensors are utilized in a feedback loop to activate suitable datapaths based on current conditions, optimizing performance and energy consumption. Primarily, SREL mitigates the timing degradation caused by transistor-aging effects in sub-micron devices by using power-gating to reduce the stress induced on active elements. As a result, fewer guardbands need to be included to achieve comparable performance levels, which leads to considerable energy savings over the operational lifetime.
The need for energy-efficient operation in current computing systems has given rise to Near-Threshold Computing, as opposed to the conventional approach of operating devices at nominal voltage. In particular, the goal of the exascale computing initiative in High Performance Computing (HPC) is to achieve 1 EFLOPS under a power budget of 20 MW. However, this comes at the cost of increased reliability concerns, such as greater performance variation and more soft errors, giving rise to increased resiliency requirements for HPC applications: functionality must be ensured within given error thresholds while operating at lower voltages. My dissertation research devised techniques and tools to quantify the effects of radiation-induced transient faults on distributed applications running on large-scale systems. A combination of compiler-level code transformation and instrumentation is employed for runtime monitoring to assess the speed and depth of application state corruption resulting from fault injection. Finally, fault propagation models are derived for each HPC application that can be used to estimate the number of corrupted memory locations at runtime. Additionally, the tradeoffs between performance and vulnerability, and the causal relations between compiler optimization and application vulnerability, are investigated.
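The genetic-algorithm loop at the heart of EHW-style refurbishment can be sketched in miniature. This is not NDER itself: here a bitstring stands in for an FPGA circuit configuration, the fitness function is a toy (counting bits that match a fault-free reference), and all names and parameters are illustrative assumptions.

```python
import random

def fitness(config: list[int], target: list[int]) -> int:
    """Toy objective: how many configuration bits match the behavior a
    fault-free circuit would produce (higher is better)."""
    return sum(c == t for c, t in zip(config, target))

def evolve(target: list[int], pop_size: int = 20, generations: int = 200,
           seed: int = 1) -> list[int]:
    """Elitist GA: keep the top half each generation, fill the rest with
    one-point-crossover children that each receive one bit-flip mutation."""
    rng = random.Random(seed)
    n = len(target)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, target), reverse=True)
        if fitness(pop[0], target) == n:        # fully refurbished
            break
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1        # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda c: fitness(c, target))

# Because the best individual always survives selection, the best fitness
# never decreases across generations.
best = evolve(target=[1, 0, 1, 1, 0, 0, 1, 0])
```

The abstract's scalability point shows up even here: the search space doubles with every added bit, which is why NDER constrains evolution using the netlist rather than evolving the whole circuit freely.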
-
Date Issued
-
2015
-
Identifier
-
CFE0006206, ucf:52889
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006206