- Title
- Effect of Nonclassical Optical Turbulence on a Propagating Laser Beam.
- Creator
-
Beason, Melissa, Phillips, Ronald, Atia, George, Richardson, Martin, Andrews, Larry, Shivamoggi, Bhimsen, University of Central Florida
- Abstract / Description
-
Theory developed for the propagation of a laser beam through optical turbulence generally assumes that the turbulence is both homogeneous and isotropic and that the associated spectrum follows the classical Kolmogorov spectral power law of κ^(-11/3). If the atmosphere deviates from these assumptions, beam statistics such as mean intensity, correlation, and scintillation index could vary significantly from mathematical predictions. This work considers the effect of nonclassical turbulence on a propagated beam, namely anisotropy of the turbulence and a power law that deviates from -11/3. A mathematical model is developed for the scintillation index of a Gaussian beam propagated through nonclassical turbulence, and theory is extended for the covariance function of intensity of a plane wave propagated through nonclassical turbulence. Multiple experiments over a concrete runway and a grass range verify the presence of turbulence which varies between isotropy and anisotropy. Data are taken throughout the day and the evolution of optical turbulence is considered. Also, irradiance fluctuation data taken in May 2018 over a concrete runway and in July 2018 over a grass range indicate an additional beam-shaping effect. A simple mathematical model was formulated which reproduced the measured behavior of contours of equal mean intensity and scintillation index.
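For context, the classical weak-turbulence baseline against which such nonclassical effects are measured can be sketched numerically. The snippet below computes the standard plane-wave Rytov variance under the Kolmogorov -11/3 spectrum, which approximates the scintillation index in the weak regime; the parameter values are illustrative and are not taken from this dissertation.

```python
import math

def rytov_variance(Cn2, wavelength_m, path_m):
    """Weak-turbulence Rytov variance for a plane wave under the classical
    Kolmogorov spectrum: sigma_R^2 = 1.23 * Cn2 * k^(7/6) * L^(11/6)."""
    k = 2.0 * math.pi / wavelength_m          # optical wavenumber
    return 1.23 * Cn2 * k ** (7.0 / 6.0) * path_m ** (11.0 / 6.0)

# Illustrative values: moderate turbulence, 1550 nm beam, 1 km path
sigma2 = rytov_variance(Cn2=1e-14, wavelength_m=1550e-9, path_m=1000.0)
```

In the weak-fluctuation regime (sigma2 well below 1) this quantity approximates the scintillation index directly; nonclassical spectra replace the 11/3 slope with a general power law, which changes both the coefficient and the exponents.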
- Date Issued
- 2018
- Identifier
- CFE0007310, ucf:52646
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007310
- Title
- Kindergarten is Not Child's Play: An Exploration of Pedagogical Approaches Related to Learning in a Play-Based and a Contemporary Classroom at a Title I Elementary School.
- Creator
-
Allee-Herndon, Karyn, Roberts, Sherron, Lue, Martha, Clark, M. H., Garcia, Jeanette, Hu, Bi Ying, University of Central Florida
- Abstract / Description
-
This dissertation is divided into three separate, related, naturalistic, quasi-experimental research studies, all using data from two kindergarten classes at Gator Elementary, a public Title I elementary school in Sunshine District in Central Florida. Each of these studies tested hypotheses that kindergarten children, especially those from low socioeconomic backgrounds, will show greater gains in receptive vocabulary, executive function, and academic achievement when purposeful play is used as a pedagogical approach than similar children in typical, contemporary kindergarten classrooms. The first study explored the effects of play-based and contemporary pedagogical approaches on students' receptive vocabulary using the PPVT-4, the second explored students' executive functions using the BRIEF2, and the third explored students' movements using Actigraph GT9X Link accelerometers. All three studies analyzed these data in relation to students' academic achievement as measured by i-Ready Diagnostic assessments. Statistically significant differences were detected in students' receptive vocabulary and reading growth, as well as in students' executive function health as reported by teachers and in reading and math academic growth, by classroom condition. A strong association between receptive vocabulary and reading performance was revealed, alongside strong negative correlations between levels of executive function concern and reading performance. No statistical differences in math growth between classrooms were found, although there was a moderate effect size, and less of an association between math performance and executive function was present. While strong correlations between academic achievement and total movement by day or movement type were revealed, these associations were inconsistent. There were also no significant differences in movement by classroom condition, although a moderate effect size suggested some differences in movement by condition. The findings from this dissertation, while limited, point to a burgeoning area of research connecting neuroscientific findings with developmentally appropriate practices to explore effective interventions to increase educational equity for vulnerable students.
- Date Issued
- 2019
- Identifier
- CFE0007596, ucf:52556
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007596
- Title
- The Effects of Phosphatidylserine on Reaction Time and Cognitive Function Following an Exercise Stress.
- Creator
-
Wells, Adam, Hoffman, Jay, Fragala, Maren, Stout, Jeffrey, University of Central Florida
- Abstract / Description
-
Phosphatidylserine (PS) is an endogenously occurring phospholipid that has been shown to have cognition- and mood-enhancing properties in humans, possibly through its role as an enzyme co-factor in cellular signal transduction. Specifically, PS has been identified as an activator of classical isoforms of protein kinase C, an enzyme known to be involved in the growth and differentiation of neural cells, and is therefore thought to play a role in the protection of neurons. The purpose of this study was to examine the effects of supplementation with PS and caffeine on measures of cognition, reaction time, and mood prior to and following an exercise stress. Twenty healthy, resistance-trained males (17) and females (3) (mean ± SD; age: 22.75 ± 3.27 yrs; height: 177.03 ± 8.44 cm; weight: 78.98 ± 11.24 kg; body fat: 14.28 ± 6.6%) volunteered to participate in this randomized, double-blind, placebo-controlled study. Participants were assigned to a PS group (400 mg/day PS; 100 mg/day caffeine, N=9) or a placebo group (PL; 16 g/day carbohydrate, N=11), delivered in the form of 4 candy chews identical in size, shape, and color. Subjects performed an acute bout of full-body resistance exercise prior to (T1) and following 14 days of supplementation (T2). Measures of reaction time (Dynavision® D2 Visuomotor Training Device), cognition (Serial Subtraction Test, SST), and mood (Profile of Mood States, POMS) were assessed immediately before and following resistance exercise in both T1 and T2. Data were analyzed using two-way ANCOVA and repeated-measures ANOVA. Supplementation with 400 mg PS and 100 mg caffeine did not have a significant impact upon measures of reaction time or cognition between groups at baseline or following acute resistance exercise. However, there was a non-significant trend toward attenuation of fatigue between groups following acute resistance exercise (p = 0.071). Interestingly, our data suggest that acute resistance exercise alone may improve cognitive function. Although more research is necessary regarding optimal dosage and supplementation duration, the current findings suggest that supplementation with 400 mg/day PS and 100 mg/day caffeine may attenuate fatigue following acute resistance exercise. It is possible that the lack of significance may be the result of both an inhibition of the PS-activated pathway and a withdrawal effect from caffeine.
- Date Issued
- 2012
- Identifier
- CFE0004457, ucf:49325
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004457
- Title
- A MODEL INTEGRATED MESHLESS SOLVER (MIMS) FOR FLUID FLOW AND HEAT TRANSFER.
- Creator
-
Gerace, Salvadore, Kassab, Alain, University of Central Florida
- Abstract / Description
-
Numerical methods for solving partial differential equations are commonplace in the engineering community, and their popularity can be attributed to the rapid performance improvement of modern workstations and desktop computers. The ubiquity of computer technology has allowed all areas of engineering to have access to detailed thermal, stress, and fluid flow analysis packages capable of performing complex studies of current and future designs. The rapid pace of computer development, however, has begun to outstrip efforts to reduce analysis overhead. As such, most commercially available software packages are now limited by the human effort required to prepare, develop, and initialize the necessary computational models. Primarily due to the mesh-based analysis methods utilized in these software packages, the dependence on model preparation greatly limits the accessibility of these analysis tools. In response, the so-called meshless or mesh-free methods have seen considerable interest, as they promise to greatly reduce the necessary human interaction during model setup. However, despite the success of these methods in areas demanding high degrees of model adaptability (such as crack growth, multi-phase flow, and solid friction), meshless methods have yet to gain acceptance as a viable alternative to more traditional solution approaches in general solution domains. Although this may be due (at least in part) to the relative youth of the techniques, another potential cause is the lack of focus on developing robust methodologies. The failure to approach development from a practical perspective has prevented researchers from obtaining commercially relevant meshless methodologies which reach the full potential of the approach.
The primary goal of this research is to present a novel meshless approach called MIMS (Model Integrated Meshless Solver) which establishes the method as a generalized solution technique capable of competing with more traditional PDE methodologies (such as the finite element and finite volume methods). This was accomplished by developing a robust meshless technique as well as a comprehensive model generation procedure. By closely integrating the model generation process into the overall solution methodology, the presented techniques are able to fully exploit the strengths of the meshless approach to achieve levels of automation, stability, and accuracy currently unseen in the area of engineering analysis. Specifically, MIMS implements a blended meshless solution approach which utilizes a variety of shape functions to obtain a stable and accurate iteration process. This solution approach is then integrated with a newly developed, highly adaptive model generation process which employs a quaternary triangular surface discretization for the boundary, a binary-subdivision discretization for the interior, and a unique shadow layer discretization for near-boundary regions. Together, these discretization techniques are able to achieve directionally independent, automatic refinement of the underlying model, allowing the method to generate accurate solutions without need for intermediate human involvement. In addition, by coupling the model generation with the solution process, the presented method is able to address the issue of ill-constructed geometric input (small features, poorly formed faces, etc.) to provide an intuitive, yet powerful approach to solving modern engineering analysis problems.
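The core idea of a meshless collocation solver, stripped of the blended shape functions and adaptive model generation described above, can be illustrated in a few lines: scattered nodes, a radial basis function and its derivative, and a collocation system, with no element connectivity at all. This is a generic Kansa-style sketch on a 1D Poisson problem, not the MIMS implementation; the multiquadric shape parameter and test problem are arbitrary choices.

```python
import numpy as np

def mq(r, c):       # multiquadric radial basis function
    return np.sqrt(r**2 + c**2)

def mq_dxx(r, c):   # its second derivative w.r.t. x (r = x - x_j)
    return c**2 / (r**2 + c**2) ** 1.5

# Scattered nodes on [0, 1] -- only point locations, no mesh
x = np.linspace(0.0, 1.0, 21)
c = 0.2                                   # shape parameter (assumed)
R = x[:, None] - x[None, :]

# Solve u'' = f with u(0) = u(1) = 0; exact solution sin(pi x)
f = -np.pi**2 * np.sin(np.pi * x)

A = mq_dxx(R, c)                          # interior rows: enforce u'' = f
A[0, :] = mq(R[0, :], c)                  # boundary rows: enforce u = 0
A[-1, :] = mq(R[-1, :], c)
b = f.copy()
b[0] = b[-1] = 0.0

u = mq(R, c) @ np.linalg.solve(A, b)      # evaluate the RBF expansion
err = np.max(np.abs(u - np.sin(np.pi * x)))
```

The same structure generalizes to 2D/3D scattered points, which is why node placement (the model generation step emphasized above) becomes the dominant practical concern.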
- Date Issued
- 2010
- Identifier
- CFE0003299, ucf:48489
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003299
- Title
- COMPUTATIONAL STUDY OF THE NEAR FIELD SPONTANEOUS CREATION OF PHOTONIC STATES COUPLED TO FEW LEVEL SYSTEMS.
- Creator
-
Tafur, Sergio, Leuenberger, Michael, University of Central Florida
- Abstract / Description
-
Models of the spontaneous emission and absorption of photons coupled to the electronic states of quantum dots, molecules, and N-V (single nitrogen vacancy) centers in diamond, which can be modeled as artificial few-level atoms, are important to the development of quantum computers and quantum networks. A quantum source modeled after an effective few-level system is strongly dependent on the type and coupling strength of the allowed transitions. These selection rules are subject to the Wigner-Eckart theorem, which specifies the possible transitions during the spontaneous creation of a photonic state and its subsequent emission. The model presented in this dissertation describes the spatio-temporal evolution of photonic states by means of a Dirac-like equation for the photonic wave function within the region of interaction of a quantum source. As part of this aim, we describe the possibility to shift from traditional electrodynamics and quantum electrodynamics, in terms of electric and magnetic fields, to a description in terms of a photonic wave function and its operators. The mapping between these will also be presented herein. It is further shown that the results of this model can be experimentally verified. The suggested method of verification relies on the direct comparison of the calculated density matrix or Wigner function, associated with the quantum state of a photon, to ones that are experimentally reconstructed through optical homodyne tomography techniques. In this non-perturbative model we describe the spontaneous creation of a photonic state in a non-Markovian limit which does not implement the Weisskopf-Wigner approximation. We further show that this limit is important for the description of how a single photonic mode is created from the possibly infinite set of photonic frequencies $\nu_k$ that can be excited in a dielectric cavity from the vacuum state.
We use discretized central-difference approximations to the space and time partial derivatives, similar to finite-difference time-domain models, to compute these results. The results presented herein show that near-field effects need to be considered when describing adjacent quantum sources that are separated by distances that are small with respect to the wavelength of their spontaneously created photonic states. Additionally, within the future scope of this model, we seek results in the Purcell and Rabi regimes to describe enhanced spontaneous emission events from these few-level systems, as embedded in dielectric cavities. A final goal of this dissertation is to create novel computational and theoretical models that describe single and multiple photon states via single-photon creation and annihilation operators.
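The central-difference discretization style mentioned above can be illustrated on the simplest possible case: a 1D scalar wave equation advanced with second-order central differences in both space and time (the same leapfrog structure FDTD schemes use). This is a generic numerical sketch, not the photonic wave function model itself; grid sizes and the CFL number are arbitrary choices.

```python
import numpy as np

# 1D wave equation u_tt = c^2 u_xx, fixed (u = 0) ends,
# advanced with second-order central differences (leapfrog).
nx, c = 101, 1.0
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.5 * dx / c                      # CFL number 0.5, stable
r2 = (c * dt / dx) ** 2

u_prev = np.sin(np.pi * x)             # standing-wave initial condition
u = u_prev.copy()                      # first step from u_t(x, 0) = 0
u[1:-1] = u_prev[1:-1] + 0.5 * r2 * (u_prev[2:] - 2*u_prev[1:-1] + u_prev[:-2])

steps = 400
for _ in range(steps):
    u_next = np.zeros_like(u)          # boundaries stay at zero
    u_next[1:-1] = (2*u[1:-1] - u_prev[1:-1]
                    + r2 * (u[2:] - 2*u[1:-1] + u[:-2]))
    u_prev, u = u, u_next

t = (steps + 1) * dt
exact = np.sin(np.pi * x) * np.cos(np.pi * c * t)
err = np.max(np.abs(u - exact))
```

The scheme tracks the exact standing-wave solution to second-order accuracy, which is the behavior the non-Markovian photonic simulations rely on at much larger scale.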
- Date Issued
- 2011
- Identifier
- CFE0003881, ucf:48739
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003881
- Title
- Exploration and development of crash modification factors and functions for single and multiple treatments.
- Creator
-
Park, Juneyoung, Abdel-Aty, Mohamed, Radwan, Essam, Eluru, Naveen, Wang, Chung-Ching, Lee, JaeYoung, University of Central Florida
- Abstract / Description
-
Traffic safety is a major concern for the public, and it is an important component of the roadway management strategy. In order to improve highway safety, extensive efforts have been made by researchers, transportation engineers, and Federal, State, and local government officials. With these consistent efforts, both fatality and injury rates from road traffic crashes in the United States declined steadily over six years (2006~2011). However, according to the National Highway Traffic Safety Administration (NHTSA, 2013), 33,561 people died in motor vehicle traffic crashes in the United States in 2012, compared to 32,479 in 2011, the first increase in fatalities since 2005. Moreover, in 2012, an estimated 2.36 million people were injured in motor vehicle traffic crashes, compared to 2.22 million in 2011. Due to the demand for highway safety improvements through systematic analysis of specific roadway cross-section elements and treatments, the Highway Safety Manual (HSM) (AASHTO, 2010) was developed by the Transportation Research Board (TRB) to introduce a science-based technical approach for safety analysis. One of the main parts of the HSM, Part D, contains crash modification factors (CMFs) for various treatments on roadway segments and at intersections. A CMF is a factor that can estimate the potential change in crash frequency as a result of implementing a specific treatment (or countermeasure). CMFs in Part D have been developed using high-quality observational before-after studies that account for the regression-to-the-mean threat. Observational before-after studies are the most common methods for evaluating safety effectiveness and calculating CMFs of specific roadway treatments.
Moreover, the cross-sectional method has commonly been used to derive CMFs, since the data are easier to collect than for before-after methods. Although various CMFs have been calculated and introduced in the HSM, critical limitations remain to be investigated. First, the HSM provides various CMFs for single treatments, but not CMFs for multiple treatments to roadway segments. The HSM suggests that CMFs be multiplied to estimate the combined safety effects of single treatments. However, the HSM cautions that the multiplication of CMFs may over- or under-estimate the combined effects of multiple treatments. In this dissertation, several methodologies are proposed to estimate more reliable combined safety effects in both observational before-after studies and the cross-sectional method. Averaging the two best combining methods is suggested to account for the effects of over- or under-estimation. Moreover, it is recommended to develop adjustment factors and functions (i.e., weighting factors and functions) to estimate safety performance more accurately when assessing the safety effects of multiple treatments. Multivariate adaptive regression splines (MARS) modeling is proposed in this dissertation to avoid the over-estimation problem through consideration of interaction effects between variables. Second, the variation of CMFs with different roadway characteristics among treated sites over time is ignored, because a CMF is a fixed value that represents the overall safety effect of the treatment for all treated sites over a specific time period. Recently, a few studies have developed crash modification functions (CMFunctions) to overcome this limitation. However, although previous studies assessed the effect of a specific single variable, such as AADT, on the CMFs, there is a lack of prior studies on the variation in the safety effects of treated sites with different multiple roadway characteristics over time.
In this study, various multivariate linear and nonlinear modeling techniques are adopted to develop CMFunctions. Multiple linear regression modeling can be utilized to consider multiple different roadway characteristics. To reflect nonlinearity of predictors, a regression model with a nonlinearizing link function needs to be developed. The Bayesian approach can also be adopted due to its strength in avoiding the problem of overfitting that occurs when the number of observations is limited and the number of variables is large. Moreover, two data mining techniques (i.e., gradient boosting and MARS) are suggested 1) to achieve better performance of CMFunctions with consideration of variable importance, and 2) to reflect both nonlinear trends of predictors and interaction effects between variables at the same time. Third, the nonlinearity of variables in the cross-sectional method is not discussed in the HSM. Generally, the cross-sectional method is also known as safety performance functions (SPFs), and a generalized linear model (GLM) is applied to estimate SPFs. However, the CMFs estimated from a GLM cannot account for the nonlinear effect of the treatment, since the coefficients in the GLM are assumed to be fixed. In this dissertation, applications of the generalized nonlinear model (GNM) and MARS in the cross-sectional method are proposed. In GNMs, the nonlinear effects of independent variables on crash frequency can be captured through the development of a nonlinearizing link function. Moreover, MARS accommodates nonlinearity of independent variables and interaction effects for complex data structures. In this dissertation, the CMFs and CMFunctions are estimated for various single treatments and combinations of treatments for different roadway types (e.g. rural two-lane and rural multi-lane roadways, urban arterials, freeways, etc.)
as below:
1) Treatments for the mainline of the roadway: adding a thru lane, conversion of 4-lane undivided roadways to 3-lane with a two-way left turn lane (TWLTL)
2) Treatments for the roadway shoulder: installing shoulder rumble strips, widening shoulder width, adding bike lanes, changing bike lane width, installing roadside barriers
3) Treatments related to roadside features: decreasing the density of driveways, decreasing the density of roadside poles, increasing the distance to roadside poles, increasing the distance to trees
Expected contributions of this study are to 1) suggest approaches to estimate more reliable safety effects of multiple treatments, 2) propose methodologies to develop CMFunctions to assess the variation of CMFs with different characteristics among treated sites, and 3) recommend applications of GNM and MARS to simultaneously consider interaction effects between variables and nonlinearity of predictors. Finally, potential relevant applications beyond the scope of this research, but worth investigating in the future, are discussed in this dissertation.
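The over- and under-estimation issue with multiplied CMFs is easy to make concrete. The sketch below contrasts the HSM multiplicative assumption with a simple additive-reduction alternative and then averages the two; the specific pair of combining methods is chosen for illustration only, since this abstract does not name the dissertation's "two best" methods.

```python
def combined_multiplicative(cmfs):
    """HSM convention: effects of independent treatments multiply."""
    out = 1.0
    for c in cmfs:
        out *= c
    return out

def combined_additive(cmfs):
    """Alternative: sum the individual crash reductions (1 - CMF_i)."""
    return 1.0 - sum(1.0 - c for c in cmfs)

def combined_averaged(cmfs):
    """Average two combining methods, as the abstract suggests, to
    temper over- or under-estimation of the combined effect."""
    return 0.5 * (combined_multiplicative(cmfs) + combined_additive(cmfs))

# Two treatments giving 10% and 15% crash reductions individually
cmfs = [0.90, 0.85]
# multiplicative: 0.765, additive: 0.75, averaged: 0.7575
```

A CMF below 1.0 means fewer predicted crashes, so the multiplicative rule claims a slightly smaller combined benefit than the additive rule here; with more treatments the gap widens, which is the estimation risk the dissertation addresses.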
- Date Issued
- 2015
- Identifier
- CFE0005861, ucf:50914
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005861
- Title
- Predictive Modeling of Functional Materials for Catalytic and Sensor Applications.
- Creator
-
Rawal, Takat, Rahman, Talat, Chang, Zenghu, Leuenberger, Michael, Zou, Shengli, University of Central Florida
- Abstract / Description
-
The research conducted in my dissertation focuses on theoretical and computational studies of the electronic and geometrical structures, and the catalytic and optical properties, of functional materials in the form of nanostructures, extended surfaces, two-dimensional systems, and hybrid structures. The fundamental aspect of my research is to predict nanomaterial properties through ab-initio calculations using methods such as quantum mechanical density functional theory (DFT) and kinetic Monte Carlo (kMC) simulation, which help rationalize experimental observations and ultimately lead to the rational design of materials for electronic and energy-related applications. Focusing on the popular single-layer MoS2, I first show how its hybrid structure with 29-atom transition metal nanoparticles (M29, where M = Cu, Ag, and Au) can lead to composite catalysts suitable for oxidation reactions. Interestingly, the effect is found to be most pronounced for Au29 when MoS2 is defect-laden (S vacancy row). Second, I show that defect-laden MoS2 can be functionalized either by deposited Au nanoparticles or when supported on Cu(111) to serve as a cost-effective catalyst for methanol synthesis via CO hydrogenation reactions. The charge transfer and electronic structural changes in these subsystems lead to the presence of 'frontier' states near the Fermi level, making the systems catalytically active. Next, in the emerging area of single metal atom catalysis, I provide a rationale for the viability of single Pd sites stabilized on ZnO(10-10) as the active sites for methanol partial oxidation, an important reaction for the production of H2. We trace its excellent activity to the modified electronic structure of the single Pd site as well as neighboring Zn cationic sites. With the DFT-calculated activation energy barriers for a large set of reactions, we perform ab-initio kMC simulations to determine the selectivity of the products (CO2 and H2).
These findings offer an opportunity for maximizing the efficiency of precious metal atoms and optimizing their activity and selectivity (for desired products). In related work on extended surfaces, while trying to explain the Scanning Tunneling Microscopy images observed by our experimental collaborators, I discovered a new mechanism involved in the process of Ag vacancy formation on Ag(110) in the presence of O atoms, which leads to the reconstruction and eventually oxidation of the Ag surface. In a similar vein, I was able to propose a mechanism for the orange photoluminescence (PL), observed by our experimental collaborators, of a coupled system of a benzylpiperazine (BZP) molecule and iodine on a copper surface. Our results show that the adsorbed BZP and iodine play complementary roles in producing the PL in the visible range. Upon photo-excitation of the BZP-I/CuI(111) system, excited electrons are transferred into the conduction band (CB) of CuI, and holes are trapped by the adatoms. The relaxation of holes into the BZP HOMO is facilitated by its realignment. Relaxed holes subsequently recombine with excited electrons in the CB of the CuI film, thus producing a luminescence peak at ~2.1 eV. These results can be useful for forensic applications in detecting illicit substances.
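The DFT-plus-kMC workflow described above can be sketched generically: DFT activation barriers enter Arrhenius rates, and a kinetic Monte Carlo step selects the next event with probability proportional to its rate while advancing the clock by an exponential waiting time. The barriers, temperature, and prefactor below are placeholder values for illustration, not the dissertation's calculated numbers.

```python
import math
import random

KB = 8.617333262e-5  # Boltzmann constant, eV/K

def arrhenius(Ea_eV, T=500.0, prefactor=1e13):
    """Rate of an elementary step from an activation barrier (eV)."""
    return prefactor * math.exp(-Ea_eV / (KB * T))

def kmc_select(rates, rng):
    """One kMC event: pick a step with probability proportional to its
    rate, and draw the exponential waiting time for the clock."""
    total = sum(rates.values())
    r, acc = rng.random() * total, 0.0
    chosen = None
    for name, rate in rates.items():
        acc += rate
        if r <= acc:
            chosen = name
            break
    if chosen is None:                 # guard against float round-off
        chosen = name
    dt = -math.log(1.0 - rng.random()) / total
    return chosen, dt

# Hypothetical barriers for two competing product channels
rates = {"CO2_path": arrhenius(0.70), "H2_path": arrhenius(0.75)}
```

With these placeholder barriers, the lower-barrier channel is selected roughly three times as often at 500 K, which is how kMC turns a table of barriers into a product selectivity.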
- Date Issued
- 2017
- Identifier
- CFE0006783, ucf:51813
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006783
- Title
- Examining Multiple Approaches for the Transferability of Safety Performance Functions.
- Creator
-
Farid, Ahmed Tarek Ahmed, Abdel-Aty, Mohamed, Lee, JaeYoung, Eluru, Naveen, University of Central Florida
- Abstract / Description
-
Safety performance functions (SPFs) are essential in road safety, since they are used to predict crash frequencies. They are commonly applied for detecting hot spots in network screening and for assessing whether road safety countermeasures are effective. In the Highway Safety Manual (HSM), SPFs are provided for several crash classifications for several types of roadway facilities. The SPFs of the HSM are developed using data from multiple states. In regions where jurisdiction-specific SPFs are not available, it is customary to adopt nationwide SPFs for crash predictions and then apply a calibration factor. Yet the research is limited regarding the application of national SPFs to local jurisdictions. In this study, the topic of transferability is explored by examining rural multilane highway SPFs from Florida, Ohio, and California, for both divided segments and intersections. Traffic, road geometric, and crash data from the three states are collected to develop one-state, two-state, and three-state SPFs. The SPFs are negative binomial models taking the form of those of the HSM. Evaluation of the transferability of models is undertaken by calculating a measure known as the transfer index, which is used to explain which SPFs may be transferred tolerably to other jurisdictions. According to the results, the transferability of rural divided segments' SPFs of Florida to California, and vice versa, is superior to that of Ohio's SPFs. For four-leg signalized intersections, neither state's models are transferable to any other state. Also, the transfer index indicates improved transferability when using pooled data from multiple states. Furthermore, a modified version of the Empirical Bayes method that incorporates segment-specific adjustment factors is proposed as an alternative to the HSM calibration method. It is used to adjust crash frequencies predicted by the SPFs being transferred to the jurisdiction of interest. The proposed modified method outperforms the HSM calibration method as per the analysis results.
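The Empirical Bayes logic referenced above combines an SPF prediction with the site's observed crash count, weighted by the negative binomial overdispersion parameter. The sketch below shows the standard HSM-style form; the SPF coefficients and the input values are placeholders for illustration, not the calibrated values from this study.

```python
import math

def spf_predict(aadt, length_mi, a=-9.653, b=1.176):
    """HSM-style negative binomial SPF for a rural divided segment:
    N = exp(a) * AADT^b * L.  Coefficients here are placeholders."""
    return math.exp(a) * aadt ** b * length_mi

def eb_expected(n_predicted, n_observed, overdispersion_k):
    """Empirical Bayes estimate: weight the SPF prediction against the
    observed count; more overdispersion shifts weight to observation."""
    w = 1.0 / (1.0 + overdispersion_k * n_predicted)
    return w * n_predicted + (1.0 - w) * n_observed

# Hypothetical segment: 12,000 AADT, 1.5 mi, 4 observed crashes
n_pred = spf_predict(aadt=12000, length_mi=1.5)
n_eb = eb_expected(n_pred, n_observed=4, overdispersion_k=0.5)
```

The EB estimate always lies between the SPF prediction and the observed count, which is what makes it a natural vehicle for the segment-specific adjustment factors proposed as an alternative to a single jurisdiction-wide calibration factor.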
- Date Issued
- 2015
- Identifier
- CFE0006298, ucf:51604
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006298
- Title
- Secondary and Postsecondary Calculus Instructors' Expectations of Student Knowledge of Functions: A Multiple-case Study.
- Creator
-
Avila, Cheryl, Ortiz, Enrique, Dixon, Juli, Hynes, Michael, Andreasen, Janet, Mohapatra, Ram, University of Central Florida
- Abstract / Description
-
This multiple-case study examines the explicit and implicit assumptions of six veteran calculus instructors from three types of educational institutions, comparing and contrasting their views on the iteration of conceptual understanding and procedural fluency of pre-calculus topics. There were three components to the research data recording process. The first component was a written survey; the second component was a "think-aloud" activity in which the instructors analyzed the results of a function diagnostic instrument administered to a calculus class; and for the third component, the instructors responded to two quotations. As a result of this activity, themes were found between and among instructors at the three types of educational institutions related to their expectations of their incoming students' prior knowledge of pre-calculus topics related to functions. Differences between instructors at the three types of educational institutions included two identifiable areas: (1) the teachers' expectations of their incoming students and (2) the methods for planning instruction. In spite of these differences, the veteran instructors were in agreement with other studies' findings that an iterative approach to conceptual understanding and procedural fluency is necessary for student understanding of pre-calculus concepts.
- Date Issued
- 2013
- Identifier
- CFE0004809, ucf:49758
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004809
- Title
- Functional Scaffolding for Musical Composition: A New Approach in Computer-Assisted Music Composition.
- Creator
-
Hoover, Amy, Stanley, Kenneth, Wu, Annie, Laviola II, Joseph, Anderson, Thaddeus, University of Central Florida
- Abstract / Description
-
While it is important for systems intended to enhance musical creativity to define and explore musical ideas conceived by individual users, many limit musical freedom by focusing on maintaining musical structure, thereby impeding the user's freedom to explore his or her individual style. This dissertation presents a comprehensive body of work that introduces a new musical representation allowing users to explore a space of musical rules created from their own melodies. This representation, called functional scaffolding for musical composition (FSMC), exploits a simple yet powerful property of multipart compositions: the patterns of notes and rhythms in different instrumental parts of the same song are functionally related. That is, in principle, one part can be expressed as a function of another. Music in FSMC is represented accordingly as a functional relationship between an existing human composition, or scaffold, and an additional generated voice. This relationship is encoded by a type of artificial neural network called a compositional pattern producing network (CPPN). A human user without any musical expertise can then explore how these additional generated voices should relate to the scaffold through an interactive evolutionary process akin to animal breeding. The utility of this insight is validated by two implementations of FSMC, called NEAT Drummer and MaestroGenesis, which respectively help users tailor drum patterns and complete multipart arrangements from as little as a single original monophonic track. The five major contributions of this work address the overarching hypothesis of this dissertation: that functional relationships alone, rather than specialized music theory, are sufficient for generating plausible additional voices.
First, to validate FSMC and determine whether plausible generated voices result from the human-composed scaffold or intrinsic properties of the CPPN, drum patterns are created with NEAT Drummer to accompany several different polyphonic pieces. Extending the FSMC approach to generate pitched voices, the second contribution reinforces the importance of functional transformations through quality assessments that indicate that some partially FSMC-generated pieces are indistinguishable from those that are fully human. While the third contribution focuses on constructing and exploring a space of plausible voices with MaestroGenesis, the fourth presents results from a two-year study where students discuss their creative experience with the program. Finally, the fifth contribution is a plugin for MaestroGenesis called MaestroGenesis Voice (MG-V) that provides users a more natural way to incorporate MaestroGenesis in their creative endeavors by allowing scaffold creation through the human voice. Together, the chapters in this dissertation constitute a comprehensive approach to assisted music generation, enabling creativity without the need for musical expertise.
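The core FSMC idea, that one voice can be expressed as a function of another, can be illustrated with a toy transformation. Here a hand-picked sine function stands in for the evolved CPPN, and the melody and parameters are invented for illustration:

```python
import math

def generated_voice(scaffold, phase=0.5, scale=4):
    """Toy functional transformation in the spirit of FSMC: each output
    note is a fixed function of the scaffold note at the same time step.
    A real CPPN would be an interactively evolved network, not this
    hand-picked sine."""
    out = []
    for t, pitch in enumerate(scaffold):
        out.append(round(pitch + scale * math.sin(t + phase)))
    return out

melody = [60, 62, 64, 65, 67, 65, 64, 62]  # MIDI pitches, C major run
harmony = generated_voice(melody)           # same length, functionally tied
```

The generated part inherits the scaffold's timing and contour because every output note is computed from the corresponding input note, which is the functional relationship the representation encodes.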
- Date Issued
- 2014
- Identifier
- CFE0005350, ucf:50495
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005350
- Title
- Transferability and Calibration of the Highway Safety Manual Performance Functions and Development of New Models for Urban four-lane Divided Roads.
- Creator
-
Al Kaaf, Khalid, Abdel-Aty, Mohamed, Oloufa, Amr, Tatari, Omer, Lee, JaeYoung, University of Central Florida
- Abstract / Description
-
Many developing countries have witnessed rapid growth in the last two decades due to the high rate of economic development in these countries, and many transportation projects have been constructed. At the same time, both population growth and vehicle ownership rates have increased, resulting in rising levels of road crashes. Road traffic crashes in the Gulf Cooperation Council (GCC) are considered a serious problem with deep effects on the GCC's population as well as on national productivity, through the loss of lives, injuries, property damage, and the loss of valuable resources. A recent statistical study of traffic crashes in Oman found that in 2013 there were 7,829 crashes for a total of 1,082,996 registered vehicles. These included 913 fatal, 5,591 injury, and 1,481 property-damage-only crashes (Directorate General of Traffic, 2014), which are high rates of fatalities and injuries compared to more developed countries. This illustrates the seriousness of the safety situation in the GCC countries and in Oman particularly. Thus, there is an urgent need to alleviate the severity of the traffic safety problem in the GCC, which in turn will set a prime example for other developing countries that face similar problems. Two main data sources, from Riyadh, the capital city of the Kingdom of Saudi Arabia (KSA), and Muscat, the capital city of the Sultanate of Oman, have been obtained, processed, and utilized in this study. The Riyadh collision and traffic data were obtained in the form of a crash database and GIS maps from two main sources: the Higher Commission for the Development of Riyadh (HCDR) and the Riyadh Traffic Department (RTD). The Muscat collision and traffic data were obtained from two main sources: the Muscat Municipality (MM) and the Royal Oman Police, Directorate General of Traffic (DGC).
Since ArcGIS is not yet used for traffic crash geocoding in Oman, the crash data used in the analysis were extracted manually from the filing system at the DGC. Because not all highway agencies in developing countries possess sufficient crash data to enable the development of robust models, there is considerable interest in the transferability of models and tools developed in the US and other developed nations. The Highway Safety Manual (HSM) is a prime and comprehensive resource recently developed in the US that would have substantial impact if researchers were able to transfer its models to similar environments in the GCC; doing so would save time, effort, and money. The first edition of the HSM provides a number of safety performance functions (SPFs), which can be used to predict collisions on a roadway network. This dissertation examined the transferability of the HSM SPFs and developed new local models for Riyadh and Muscat. First, calibration of the HSM SPFs for urban four-lane divided roadway segments (U4D) with angle parking in Riyadh and the development of new SPFs were examined. The study calibrates the HSM SPFs using the HSM default crash modification factors (CMFs); new local CMFs are then proposed using the cross-sectional method, which estimates calibration factors using fatal and injury data. In addition, new forms for specific SPFs are evaluated to identify the best model using the Poisson-Gamma regression technique. To investigate how well each safety performance model fits the data set, several performance measures were examined; these summarize the differences between the observed values and the values predicted by the related SPFs. Results indicate that the jurisdiction-specific SPFs provided the best fit to the data used in this study and would be the best SPFs for predicting severe collisions in the City of Riyadh.
The study finds that HSM calibration using Riyadh local CMFs outperforms the calibration method using the HSM default values. The HSM calibration application for Riyadh crash conditions highlights the importance of addressing variability in reporting thresholds. One finding of this research is that, while the medians in this study have oversized widths ranging from 16 ft to 70 ft, median width has an insignificant effect on fatal and injury crashes. At the same time, the frequent angle parking on Riyadh urban road networks appears to increase fatal and injury collisions by 52 percent. This dissertation also examined the calibration of the HSM SPFs for urban intersections in Riyadh, Kingdom of Saudi Arabia (KSA), and the development of a new set of models using three years of collision data (2004-2006) from the city of Riyadh. Three intersection categories were investigated: 3-leg signalized, 4-leg signalized, and 3-leg unsignalized. In addition, new forms for specific SPFs are evaluated to identify the best model using the Poisson-Gamma regression technique. Results indicate that the newly developed local SPFs provided the best fit to the data used in this study and would be the best SPFs for predicting severe crashes at urban intersections in the City of Riyadh. Moreover, this study examined the calibration of the HSM SPFs for fatal and injury (FI), property damage only (PDO), and total crashes for urban four-lane divided roadway segments (U4D) in Muscat, Sultanate of Oman, and the development of new SPFs. This study first calibrates the HSM SPFs using the HSM methodology, and then new forms for specific SPFs are evaluated for Muscat's urban roads to identify the best model. Finally, the Riyadh fatal and injury model was validated using the Muscat FI dataset. Comparisons across the models indicate that the HSM calibrated models are superior, with a better model fit, and would be the best SPFs for predicting collisions in the City of Muscat.
The best developed collision model describes the mean crash frequency as a function of the natural logarithm of the annual average daily traffic, segment length, and speed limit. The study finds that the differences in road geometric design features and FI collision characteristics between Riyadh and Muscat made the Riyadh crash prediction model untransferable. Overall, this study lays an important foundation towards the implementation of HSM methods in multiple cities (Riyadh and Muscat) and could help their transportation officials make informed decisions regarding road safety programs. The implications of the results are extendible to other cities and countries in the region, and perhaps to other developing countries as well.
- Date Issued
- 2014
- Identifier
- CFE0005452, ucf:50378
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005452
- Title
- APPLICATION OF THE EMPIRICAL LIKELIHOOD METHOD IN PROPORTIONAL HAZARDS MODEL.
- Creator
-
HE, BIN, Ren, Jian-Jian, University of Central Florida
- Abstract / Description
-
In survival analysis, the proportional hazards model is the most commonly used, and the Cox model is the most popular. These models were developed to facilitate statistical analyses frequently encountered in medical research or reliability studies. In analyzing real data sets, checking the validity of the model assumptions is a key component. However, the presence of complicated types of censoring, such as double censoring and partly interval-censoring, makes model assessment difficult, and the existing goodness-of-fit tests do not extend directly to these complicated types of censored data. In this work, we use the empirical likelihood approach (Owen, 1988) to construct goodness-of-fit tests and provide estimates for the Cox model with various types of censored data. Specifically, the problems under consideration are the two-sample Cox model and the stratified Cox model with right-censored data, doubly censored data, and partly interval-censored data. Related computational issues are discussed, and some simulation results are presented. The procedures developed in this work are applied to several real data sets with some discussion.
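As a point of reference for the Cox model discussed in this abstract, the log partial likelihood for right-censored data can be written out directly. This is a generic textbook sketch on invented data, not the empirical likelihood machinery of the dissertation:

```python
import math

def cox_log_partial_likelihood(beta, times, events, x):
    """Log partial likelihood of a one-covariate Cox model
    h(t|x) = h0(t) * exp(beta * x), with right censoring:
    events[i] = 1 if observation i is an event, 0 if censored.
    Each event contributes beta*x_i minus the log-sum-exp over
    the risk set (subjects still under observation at t_i)."""
    ll = 0.0
    for i in range(len(times)):
        if events[i]:
            risk = [j for j in range(len(times)) if times[j] >= times[i]]
            ll += beta * x[i] - math.log(sum(math.exp(beta * x[j]) for j in risk))
    return ll

# tiny synthetic data set; x is a binary treatment covariate
times  = [5.0, 8.0, 12.0, 3.0, 9.0]
events = [1, 1, 0, 1, 1]
x      = [1, 0, 1, 1, 0]
# crude grid search for the maximizing beta (illustration only)
best = max((cox_log_partial_likelihood(b / 10, times, events, x), b / 10)
           for b in range(-30, 31))
```

In practice one would maximize this by Newton's method rather than a grid, and goodness-of-fit checks of the kind the dissertation develops ask whether the proportional hazards form itself is adequate.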
- Date Issued
- 2006
- Identifier
- CFE0001099, ucf:46780
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001099
- Title
- Evolution and distribution of phenotypic diversity in the venom of Mojave Rattlesnakes (Crotalus scutulatus).
- Creator
-
Strickland, Jason, Savage, Anna, Parkinson, Christopher, Hoffman, Eric, Rokyta, Darin, University of Central Florida
- Abstract / Description
-
Intraspecific phenotypic diversity allows for local adaptation and the ability of species to respond to changing environmental conditions, enhancing survivability. Phenotypic variation could be stochastic, genetically based, and/or the result of different environmental conditions. Mojave Rattlesnakes, Crotalus scutulatus, are known to have high intraspecific venom variation, but the geographic extent of the variation and the factors influencing venom evolution are poorly understood. Three primary venom types have been described in this species based on the presence (Type A) or absence (Type B) of a neurotoxic phospholipase A2 called Mojave toxin and an inverse relationship with the presence of snake venom metalloproteinases (SVMPs). Individuals that contain both Mojave toxin and SVMPs, although rare, constitute the third type, designated Type A + B. I sought to describe the proteomic and transcriptomic venom diversity of C. scutulatus across its range and to test whether this diversity was correlated with genetic or environmental differences. This study includes the densest geographic sampling of Mojave Rattlesnakes to date and the most venom-gland transcriptomes known for any one species. Of the four known mitochondrial lineages, only one was monophyletic for venom type. Environmental variables correlated poorly with the phenotypes. Variability in the toxin and toxin-family composition of the venom transcriptomes was largely due to differences in transcript expression. Four of the 19 toxin families identified in C. scutulatus account for the majority of the differences in toxin number and expression variation. I determined that the toxins primarily responsible for the venom types are inherited in a Mendelian fashion and that toxin expression is additive when comparing heterozygotes and homozygotes. Using genetics to define venom type is therefore more informative, and the Type A + B phenotype is not unique but rather heterozygous for the PLA2 and/or SVMP alleles.
Intraspecific venom variation in C. scutulatus highlights the need for fine scale ecological and natural history information to understand how phenotypic diversity is generated and maintained geographically through time.
- Date Issued
- 2018
- Identifier
- CFE0007252, ucf:52198
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007252
- Title
- ENTERPRISE BUSINESS ALIGNMENT USING QUALITY FUNCTION DEPLOYMENT, MULTIVARIATE DATA ANALYSIS AND BUSINESS MODELING TOOLS.
- Creator
-
GAMMOH, DIALA, Elshennawy, Ahmad, University of Central Florida
- Abstract / Description
-
This dissertation proposes two novel ideas to enhance the alignment of business strategy to customer needs. The proposed business alignment clock is a new illustration of the relationships between customer requirements, business strategies, capabilities, and processes. To line up the clock and reach the needed alignment for the enterprise, a clock mechanism is introduced. The mechanism integrates the Enterprise Business Architecture (EBA) with the House of Quality (HoQ). The relationship matrix inside the body of the house is defined using multivariate data analysis techniques, which measure the strength of the relationships quantitatively rather than defining them subjectively; this statistical approach overcomes the ambiguity in quantifying the relationships in the house of quality matrix. The framework is proposed in the basic conceptual-model context of the EBA, showing different levels of the enterprise architecture: the goals, the capabilities, and the value stream architecture components. In the proposed framework, the goals and the capabilities are inputs to two houses of quality, in which the alignment between customer needs and business goals, and the alignment between business goals and capabilities, are checked in the first and second houses, respectively. The alignment between the business capabilities and the architecture components (workflows, events, and environment) is checked in a third HoQ using the performance indicators of the value stream architecture components, which may result in infrastructure expansion, software development, or process improvement to reach the alignment needed by the enterprise. The value of the model was demonstrated using the Accreditation Board for Engineering and Technology (ABET) process at the Industrial Engineering and Management Systems department at the University of Central Florida.
The assessment of ABET criteria involves evaluating the extent to which the program outcomes are being achieved and results in decisions and actions to improve the Industrial Engineering program at the University of Central Florida. The proposed framework increases the accuracy of measuring the extent to which the program learning outcomes have been achieved at the department. The process of continuously aligning the educational objectives with customer needs is made more vital by the rapid change of customer requirements obtained from both internal and external constituents (primarily students, faculty, alumni, and employers).
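The roll-up performed by a house of quality relationship matrix can be sketched as follows. The needs, strategies, and matrix entries below are invented; the dissertation's contribution is precisely to replace such subjective 1/3/9 strengths with values derived from multivariate data analysis:

```python
# Toy House of Quality roll-up: customer-need weights are propagated
# through a relationship matrix to rank the "hows" (strategies).
needs_weights = {"fast service": 0.5, "low cost": 0.3, "reliability": 0.2}
strategies = ["automate workflow", "expand infrastructure", "retrain staff"]
relationship = {  # rows: needs; columns: strategies; strengths 0/1/3/9
    "fast service": [9, 3, 3],
    "low cost":     [3, 1, 9],
    "reliability":  [3, 9, 3],
}
# each strategy's score = sum over needs of weight * relationship strength
scores = [sum(needs_weights[n] * relationship[n][j] for n in needs_weights)
          for j in range(len(strategies))]
ranked = sorted(zip(strategies, scores), key=lambda p: -p[1])
```

Chaining three such houses, as the framework does, simply feeds the prioritized outputs of one house in as the weighted inputs of the next.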
- Date Issued
- 2010
- Identifier
- CFE0003298, ucf:48506
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003298
- Title
- Automatic Detection of Brain Functional Disorder Using Imaging Data.
- Creator
-
Dey, Soumyabrata, Shah, Mubarak, Jha, Sumit, Hu, Haiyan, Weeks, Arthur, Rao, Ravishankar, University of Central Florida
- Abstract / Description
-
Attention Deficit Hyperactivity Disorder (ADHD) has recently been getting a lot of attention, mainly for two reasons. First, it is one of the most commonly found childhood behavioral disorders: around 5-10% of children all over the world are diagnosed with ADHD. Second, the root cause of the problem is still unknown, and therefore no biological measure exists to diagnose ADHD; instead, doctors must diagnose it based on clinical symptoms, such as inattention, impulsivity, and hyperactivity, which are all subjective. Functional Magnetic Resonance Imaging (fMRI) data has become a popular tool for understanding the functioning of the brain, such as identifying the brain regions responsible for different cognitive tasks or analyzing the statistical differences in brain functioning between diseased and control subjects, and ADHD is also being studied using fMRI data. In this dissertation we aim to solve the problem of automatic diagnosis of ADHD subjects using their resting-state fMRI (rs-fMRI) data. As a core step of our approach, we model the functions of a brain as a connectivity network, which is expected to capture information about how synchronous different brain regions are in terms of their functional activities. The network is constructed by representing different brain regions as nodes, where any two nodes are connected by an edge if the correlation of the activity patterns of the two nodes is higher than some threshold. The brain regions represented as the nodes of the network can be selected at different granularities, e.g., single voxels or clusters of functionally homogeneous voxels. The topological differences between the constructed networks of the ADHD and control groups of subjects are then exploited in the classification approach. We have developed a simple method employing the Bag-of-Words (BoW) framework for the classification of ADHD subjects.
We represent each node in the network by a 4-D feature vector: node degree and 3-D location. The 4-D vectors of all the network nodes in the training data are then grouped into a number of clusters using K-means, where each such cluster is termed a word. Finally, each subject is represented by a histogram (bag) of such words. A Support Vector Machine (SVM) classifier is used to detect ADHD subjects from their histogram representations. The method achieves 64% classification accuracy. This simple approach has several shortcomings. First, there is a loss of spatial information in constructing the histogram, because it only counts the occurrences of words while ignoring their spatial positions. Second, features from the whole brain are used for classification, but some brain regions may not contain any useful information and may only increase the feature dimensionality and the noise of the system. Third, in this study we used only one network feature, the degree of a node, which measures the connectivity of the node, while other, more complex network features may be useful for solving the proposed problem. To address these shortcomings, we hypothesize that only a subset of the nodes of the network possesses important information for the classification of ADHD subjects. To identify the important nodes of the network, we have developed a novel algorithm. The algorithm repeatedly generates a random subset of nodes, extracts features from that subset to compute a feature vector, and performs classification. The subsets are then ranked by classification accuracy, and the occurrences of each node in the top-ranked subsets are counted. Our algorithm selects the most frequently occurring nodes for the final classification. Furthermore, along with the node degree, we employ three more node features: network cycles, the varying-distance degree, and the edge weight sum.
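The network-construction step described above (regions as nodes, edges where activity correlation exceeds a threshold, node degree as the feature) can be sketched as follows. The time series are toy data, and the downstream K-means/histogram/SVM stages are omitted:

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length time series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def degree_features(series, threshold=0.5):
    """Connectivity network: nodes are regions, and an edge joins two
    nodes whose activity correlation exceeds the threshold; the feature
    of each node is its degree (number of incident edges)."""
    n = len(series)
    return [sum(1 for j in range(n)
                if j != i and pearson(series[i], series[j]) > threshold)
            for i in range(n)]

# toy "regions": three positively correlated series, one anti-correlated
series = [[1, 2, 3, 4, 5],
          [2, 4, 6, 8, 10],
          [1, 2, 2, 4, 5],
          [5, 4, 3, 2, 1]]
degrees = degree_features(series)  # the anti-correlated region is isolated
```

In the full pipeline each node's degree is concatenated with its 3-D location to form the 4-D vector that K-means quantizes into words.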
We concatenate the features of the selected nodes in a fixed order to preserve their relative spatial information. Experimental validation suggests that using the features of the nodes selected by our algorithm does indeed help to improve classification accuracy. Our finding is also in concordance with the existing literature, as the brain regions identified by our algorithm have been independently found by many other studies of ADHD. We achieved a classification accuracy of 69.59% using this approach. However, this method represents each voxel as a node of the network, which makes the number of nodes several thousand; as a result, the network construction step becomes computationally very expensive. Another limitation of the approach is that the network features, which are computed for each node, capture only local structure while ignoring the global structure of the network. Next, to capture the global structure of the networks, we use the Multi-Dimensional Scaling (MDS) technique to project all the subjects from an unknown network-space to a low-dimensional space based on their inter-network distance measures. To compute the distance between two networks, we represent each node by a set of attributes such as the node degree, the average power, the physical location, the neighbor node degrees, and the average powers of the neighbor nodes. The nodes of the two networks are then mapped in such a way that, over all pairs of nodes, the sum of the attribute distances, which is the inter-network distance, is minimized. To reduce the network computation cost, we ensure that the maximum relevant information is preserved with minimum redundancy. To achieve this, the nodes of the network are constructed from clusters of highly active voxels, where the activity levels of the voxels are measured by the average power of their corresponding fMRI time series.
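The projection into a low-dimensional space can be illustrated with classical MDS applied to a given distance matrix. This sketch assumes a precomputed inter-network distance matrix and omits the attribute-based node matching that produces it:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical MDS: embed items with pairwise distances D into k
    dimensions. Double-center the squared distances to recover a Gram
    matrix, then scale the top-k eigenvectors by sqrt(eigenvalue)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # Gram matrix of centered points
    w, v = np.linalg.eigh(B)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]         # take the k largest
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# four hypothetical "subjects" whose inter-network distances fit on a line
D = np.array([[0., 1., 2., 3.],
              [1., 0., 1., 2.],
              [2., 1., 0., 1.],
              [3., 2., 1., 0.]])
coords = classical_mds(D, k=2)  # pairwise distances of coords reproduce D
```

Once the subjects are embedded, any standard classifier can operate on the low-dimensional coordinates, which is what makes the projection step useful.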
Our method shows promise, as we achieve impressive classification accuracies (73.55%) on the ADHD-200 data set. Our results also reveal that the detection rates are higher when classification is performed separately on the male and female groups of subjects. So far, we have only used the fMRI data for solving the ADHD diagnosis problem. Finally, we investigated the following questions: Do structural brain images contain useful information related to the ADHD diagnosis problem? Can the classification accuracy of the automatic diagnosis system be improved by combining the information of the structural and functional brain data? Towards that end, we developed a new method to combine the information of structural and functional brain images in a late fusion framework. For the structural data, we input the gray matter (GM) brain images to a Convolutional Neural Network (CNN); the output of the CNN is a feature vector per subject, which is used to train the SVM classifier. For the functional data, we compute the average power of each voxel based on its fMRI time series; this average power measures the activity level of the voxel. We found significant differences in the voxel power distribution patterns of the ADHD and control groups of subjects, and the local binary pattern (LBP) texture feature is applied to the voxel power map to capture these differences. We achieved 74.23% accuracy using GM features, 77.30% using LBP features, and 79.14% using the combined information. In summary, this dissertation demonstrates that structural and functional brain imaging data are useful for the automatic detection of ADHD subjects, as we achieve impressive classification accuracies on the ADHD-200 data set. Our study also helps to identify the brain regions which are useful for ADHD subject classification; these findings can help in understanding the pathophysiology of the problem.
Finally, we expect that our approaches will contribute towards the development of a biological measure for the diagnosis of the ADHD subjects.
- Date Issued
- 2014
- Identifier
- CFE0005786, ucf:50060
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005786
- Title
- Investigating the universality and comprehensive ability of measures to assess the state of workload.
- Creator
-
Abich, Julian, Reinerman, Lauren, Lackey, Stephanie, Szalma, James, Taylor, Grant, University of Central Florida
- Abstract / Description
-
Measures of workload have been developed on the basis of various definitions: some are designed to capture the multi-dimensional aspects of a unitary resource pool (Kahneman, 1973), while others are developed on the basis of multiple resource theory (Wickens, 2002). Although many theory-based workload measures exist, others have often been constructed to serve the purpose of specific experimental tasks. As a result, it is likely that not every workload measure is reliable and valid for all tasks, much less for each domain. To date, no single measure, systematically tested across experimental tasks, domains, and other measures, is considered a universal measure of workload. Most researchers would argue that multiple measures from various categories should be applied to a given task to comprehensively assess workload. The goal of Study 1, to establish task load manipulations for two theoretically different tasks that induce distinct levels of workload as assessed by both subjective and performance measures, was achieved; the results of the subjective responses support the standardization and validation of the tasks, and of the demands of each task, for investigating workload. After investigating the use of subjective and objective measures of workload to identify a universal and comprehensive measure or set of measures, it can only be concluded from Study 2 that no such measure or set of measures exists. This is not to say that one will never be conceived and developed, but at this time, none resides in the psychometric catalog. Instead, it appears that a more suitable approach is to customize a set of workload measures based on the task. The novel approach of assessing the sensitivity and comprehensive ability of conjointly utilizing subjective, performance, and physiological workload measures for theoretically different tasks within the same domain contributes to theory by laying the foundation for improved methodology for researching workload.
The applied contribution of this project is a stepping-stone toward developing complex profiles of workload for use in closed-loop systems, such as human-robot team interaction. Identifying the best combination of workload measures enables human factors practitioners, trainers, and task designers to improve the methodology and evaluation of system designs, training requirements, and personnel selection.
- Date Issued
- 2013
- Identifier
- CFE0005119, ucf:50675
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005119
- Title
- Development of Traffic Safety Zones and Integrating Macroscopic and Microscopic Safety Data Analytics for Novel Hot Zone Identification.
- Creator
-
Lee, JaeYoung, Abdel-Aty, Mohamed, Radwan, Ahmed, Nam, Boo Hyun, Kuo, Pei-Fen, Choi, Keechoo, University of Central Florida
- Abstract / Description
-
Traffic safety has been considered one of the most important issues in the transportation field. With the consistent efforts of transportation engineers and Federal, State, and local government officials, both fatalities and fatality rates from road traffic crashes in the United States steadily declined from 2006 to 2011. Nevertheless, fatalities from traffic crashes slightly increased in 2012 (NHTSA, 2013). We lost 33,561 lives to road traffic crashes in 2012, and road traffic crashes remain one of the leading causes of death, according to the Centers for Disease Control and Prevention (CDC). In recent years, efforts have been made to incorporate traffic safety into transportation planning, an approach termed transportation safety planning (TSP). The Safe, Accountable, Flexible, Efficient Transportation Equity Act: A Legacy for Users (SAFETEA-LU), which is codified in the United States Code, compels the United States Department of Transportation to consider traffic safety in the long-term transportation planning process. Although considerable macro-level studies have been conducted to facilitate the implementation of TSP, critical limitations in macroscopic safety studies remain to be investigated and remedied. First, the TAZ (Traffic Analysis Zone), the zonal system most widely used in travel demand forecasting, has crucial shortcomings for macro-level safety modeling. Moreover, macro-level safety models have an accuracy problem: the low predictive power of these models may be caused by crashes that occur near the boundaries of zones, high-level aggregation, and neglect of spatial autocorrelation. In this dissertation, several methodologies are proposed to alleviate these limitations in macro-level safety research. The TSAZ (Traffic Safety Analysis Zone) is developed as a new zonal system for macroscopic safety analysis, and a nested structured modeling method is suggested to improve model performance.
Also, a multivariate statistical modeling method for multiple crash types is proposed in this dissertation. In addition, a novel screening methodology integrating the two levels is suggested to overcome the shortcomings of zonal-level screening, which cannot take specific high-risk sites into consideration. It is expected that the integrated screening approach can provide a comprehensive perspective by balancing the macroscopic and microscopic approaches.
- Date Issued
- 2014
- Identifier
- CFE0005195, ucf:50653
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005195
- Title
- Measuring and Modeling NMR and Emission Spectra to Gain New Insight into Challenging Organic Compounds.
- Creator
-
Powell, Jacob, Harper, James, Campiglia, Andres, Beazley, Melanie, Richardson, David, Blair, Richard, University of Central Florida
- Abstract / Description
-
The advancement of theoretical methods in recent years has allowed the calculation of highly accurate spectroscopic parameters. Comparing these values to the corresponding experimental data can allow molecular structures to be elucidated. This dissertation details the use of experimental and theoretical data from nuclear magnetic resonance (NMR) and fluorescence spectroscopy to determine structure. Herein the NMR focus is on measuring and modeling chemical shift anisotropy and one-bond carbon-carbon J-coupling constants (1JCC). The fluorescence analysis models vibrationally resolved fluorescence spectra. Chemical shift anisotropy techniques were used to study two conflicting crystal structures of the n-alkyl fatty acid lauric acid. These two crystal structures differ only in their COOH conformation. Lattice-including density functional theory (DFT) refinements of each crystal structure failed to match experimental data, leading to the proposal of a third crystal structure with a hydrogen-disordered COOH moiety. This disorder strengthens the hydrogen bond, providing a new rationalization of the long-observed non-monotonic melting behavior of fatty acids having even and odd numbers of carbons. INADEQUATE is an NMR experiment that directly establishes the carbon skeleton of organic compounds by measuring the 1JCC throughout a molecule. The low natural occurrence of 13C-13C pairs (about 1 in 10,000) and breaks in connectivity due to the presence of heteroatoms pose challenges for INADEQUATE analysis. Here, the insensitivity problem is overcome using analysis software that automatically processes data and identifies signals, even when they are comparable in magnitude to noise.
When combined with DFT 1JCC predictions, the configurations and conformations of the natural products 5-methylmellein and hydroheptelidic acid are elucidated. Vibrationally resolved fluorescence spectra of high-molecular-weight PAHs can be accurately calculated through time-dependent density functional theory (TD-DFT) methods. Here, the theoretical spectral profiles of certain PAHs are shown to match experimental high-resolution fluorescence spectra acquired at cryogenic temperatures. However, in all cases, theoretical spectra were systematically offset from the experimental spectra. To reduce these offsets, the spectra were empirically corrected and an automated scheme was employed to match each theoretical spectrum against all possible experimental spectra. In all cases the theoretical spectra were correctly matched to the experimental spectra.
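The automated matching step described above can be illustrated with a minimal sketch. The abstract does not name the similarity metric or software used, so the cosine-similarity scoring, function names, and spectra below are hypothetical, assuming all spectra are sampled as intensity vectors on a common wavelength grid.

```python
# Illustrative matching of theoretical to experimental spectra (hypothetical
# metric and data; the dissertation's actual scheme is not specified here).

def cosine_similarity(a, b):
    """Cosine similarity between two intensity vectors on the same grid."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

def match_spectra(theoretical, experimental):
    """Assign each theoretical spectrum to its best-matching experimental one."""
    return {
        name: max(experimental, key=lambda e: cosine_similarity(spec, experimental[e]))
        for name, spec in theoretical.items()
    }

# Hypothetical spectra: each has one dominant feature at a different grid position.
theo = {"pah_A": [1.0, 0.1, 0.0, 0.2], "pah_B": [0.0, 0.2, 1.0, 0.1]}
expt = {"exp_1": [0.9, 0.2, 0.1, 0.2], "exp_2": [0.1, 0.1, 0.8, 0.2]}
print(match_spectra(theo, expt))  # {'pah_A': 'exp_1', 'pah_B': 'exp_2'}
```

In practice the empirical offset correction would be applied to each theoretical spectrum before scoring; the one-to-one best-match assignment above is the simplest version of an exhaustive comparison against all candidate experimental spectra.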
- Date Issued
- 2017
- Identifier
- CFE0006953, ucf:51680
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006953
- Title
- TRAFFIC SAFETY ASSESSMENT OF DIFFERENT TOLL COLLECTION SYSTEMS ON EXPRESSWAYS USING MULTIPLE ANALYTICAL TECHNIQUES.
- Creator
-
Abuzwidah, Muamer, Abdel-Aty, Mohamed, Radwan, Essam, Uddin, Nizam, University of Central Florida
- Abstract / Description
-
Traffic safety has been considered one of the most important issues in the transportation field. Crashes have caused extensive human and economic losses. With the objective of reducing crash occurrence and alleviating crash injury severity, major efforts have been dedicated to revealing the hazardous factors that affect crash occurrence. With these consistent efforts, both fatalities and fatality rates from road traffic crashes in many countries have steadily declined over the last ten years. Nevertheless, according to the World Health Organization, the world still lost 1.24 million lives to road traffic crashes in the year 2013. Without action, road traffic crashes are predicted to result in around 1.9 million deaths annually by the year 2020, with up to 50 million more people suffering non-fatal injuries, many incurring a disability as a result of their injury. To meet transportation needs, the use of expressways (toll roads) has risen dramatically in many countries in the past decade. In fact, freeways and expressways are considered an important part of any successful transportation system; these facilities carry the majority of daily trips on the transportation network. Although expressways offer a high level of service and are considered the safest type of road, traditional toll collection systems may pose both safety and operational challenges. Traditional toll plazas still experience many crashes, many of which are severe. Therefore, it becomes ever more important to evaluate the traffic safety impacts of different tolling systems. The main focus of the research in this dissertation is to provide an up-to-date assessment of the safety impact of different toll collection systems, as well as safety guidelines for these facilities, to promote safety and enhance mobility on expressways.
In this study, an extensive data collection effort was conducted that included one hundred mainline toll plazas located on approximately 750 miles of expressways in Florida. Multiple online data sources maintained by the Florida Department of Transportation were utilized to identify the traffic, geometric, and geographic characteristics of the locations, and to investigate and determine the most complete and accurate data. Different observational Before-After and Cross-Sectional techniques were used to evaluate the safety effectiveness of applying different treatments on expressways. The Before-After methods include the Naïve Before-After, Before-After with Comparison Group, and Before-After with Empirical Bayes (EB) approaches. A set of Safety Performance Functions (SPFs), which predict crash frequency as a function of explanatory variables, was developed at the aggregate level using crash data and the corresponding exposure and risk factors. Results of the aggregate traffic safety analysis can be used to identify hazardous locations (hot spots) such as traditional toll plazas, to predict crash frequency for untreated sites in the after period in the Before-After with EB method, or to derive Crash Modification Factors (CMFs) for a treatment using the Cross-Sectional method. This type of analysis is usually used to improve geometric characteristics and mainly focuses on discovering the risk factors related to total crash frequency, specific crash types, and/or different crash severity levels.
Both simple SPFs (with traffic volume as the only explanatory variable) and full SPFs (with traffic volume and additional explanatory variables) were used to estimate the CMFs, and only CMFs with lower standard errors were recommended. The results of this study showed that safety was significantly improved across all locations that were upgraded from Traditional Mainline Toll Plazas (TMTP) to the Hybrid Mainline Toll Plaza (HMTP) system. This treatment significantly reduced total, Fatal-and-Injury (F+I), and Rear-End crashes by 47, 46, and 65 percent, respectively. Moreover, this study examined the traffic safety impact of different designs and of the diverge-and-merge areas of the HMTP. This design combines either express Open Road Tolling (ORT) lanes on the mainline with separate traditional toll collection to the side (design-1), or traditional toll collection on the mainline with separate ORT lanes to the side (design-2). It was also shown that there is a significant difference between these designs, with an indication that design-1 is safer, and that the majority of crashes occurred at the diverge-and-merge areas before and after these facilities. However, design-2 could be a good temporary design at locations with a low percentage of prepaid-transponder (Electronic Toll Collection (ETC)) users. In other words, its safety depends on the percentage of ETC users: as this percentage increases, more traffic must diverge and merge, and the design becomes riskier. In addition, the results indicated significant relationships between crash frequency and toll plaza type, annual average daily traffic, and drivers' age.
The analysis showed that conversion from TMTP to the All-Electronic Toll Collection (AETC) system resulted in average reductions of 77, 76, and 67 percent for total, F+I, and Property Damage Only (PDO) crashes, respectively; for rear-end and Lane Change Related (LCR) crashes, the average reductions were 81 and 75 percent, respectively. Conversion from HMTP to the AETC system enhanced traffic safety by reducing total, F+I, and PDO crashes by an average of 23, 29, and 19 percent, respectively; for rear-end and LCR crashes, the average reductions were 15 and 21 percent. Based on these results, the AETC system changed toll plazas from the highest-risk sections on expressways into sections similar to regular segments. Therefore, it can be concluded that the AETC system is an excellent solution to several traffic-operations, environmental, and economic problems. For agencies that cannot adopt the HMTP or AETC systems, improving traffic safety at traditional toll plazas should take priority. This study also evaluated the safety effectiveness of implementing High-Occupancy Toll (HOT) lanes and of adding roadway lighting to expressways. The results showed no significant impact of HOT-lane implementation on the roadway segment as a whole (HOT and regular lanes combined), but there was a significant difference between the regular lanes and the HOT lanes on the same roadway segment: crash counts increased in the regular lanes and decreased in the HOT lanes. It was found that total and F+I crashes were reduced in the HOT lanes by an average of 25 and 45 percent, respectively. This may be attributable to the fact that the HOT lanes became a highway within a highway.
Moreover, adding roadway lighting significantly improved traffic safety on the expressways by reducing night crashes by approximately 35 percent. Overall, the proposed analyses of the safety effectiveness of different toll collection systems are useful in providing expressway authorities with detailed information on where countermeasures must be implemented. This study provided, for the first time, an up-to-date assessment of the safety impact of different toll collection systems, and also developed safety guidelines for these systems that will be useful for practitioners and roadway users.
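The SPF and Empirical Bayes machinery described in this abstract can be sketched in a few lines. The coefficient values, overdispersion parameter, and crash counts below are hypothetical placeholders, not figures from the dissertation; only the functional form (expected crashes as a power function of AADT, with the EB estimate as a weighted average of prediction and observation) follows the standard predictive approach.

```python
import math

def spf_predict(aadt, b0=-5.0, b1=0.65):
    """Simple SPF: expected annual crashes = exp(b0) * AADT^b1.
    b0 and b1 are hypothetical coefficients for illustration only."""
    return math.exp(b0) * aadt ** b1

def eb_expected(observed, predicted, overdispersion=0.8):
    """Empirical Bayes estimate: weighted average of the SPF prediction and the
    observed crash count, with weight w = 1 / (1 + k * predicted)."""
    w = 1.0 / (1.0 + overdispersion * predicted)
    return w * predicted + (1.0 - w) * observed

# Hypothetical site: 30,000 AADT, 10 observed crashes in the before period,
# 6 crashes observed after treatment.
pred = spf_predict(30000)   # SPF prediction for a comparable untreated site
eb = eb_expected(10, pred)  # long-run expected crashes without treatment
cmf = 6 / eb                # CMF < 1 indicates the treatment reduced crashes
print(round(pred, 2), round(eb, 2), round(cmf, 2))
```

In the dissertation's Before-After with EB method, the SPF would be fitted to reference sites and the EB estimate would replace the naive before-period count when judging a treatment's effect, which guards against regression-to-the-mean bias.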
- Date Issued
- 2014
- Identifier
- CFE0005751, ucf:50100
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005751