Current Search: Simulation
- Title
- The Experience of Physical and Social Presence in a Virtual Learning Environment as Impacted by the Affordance of Movement Enabled by Motion Tracking.
- Creator
-
Hayes, Aleshia, Hughes, Charles, Dieker, Lisa, Marino, Matthew, Bailenson, Jeremy, University of Central Florida
- Abstract / Description
-
This research synthesizes existing research findings that social presence (sense of connection with others) and physical presence (sense of being there) increase learning outcomes in Virtual Learning Environments (VLEs) with findings that traditional motion tracking of participants wearing head-mounted displays in virtual reality increases both physical and social presence. This information suggests that motion tracking in mixed reality VLEs has a positive impact on both social and physical presence. For this study, the affordance of free movement among virtual objects is enabled by Microsoft Kinect tracking of the user's position, which is translated into movement of the virtual camera to simulate user movement and proximity to elements of the virtual environment. This study used a mixed-method, multimodal approach including qualitative, subjective, objective, and physiological data to measure social and physical presence. The testbed for this research was TLE TeachLivE™, a mixed reality classroom populated with virtual students. The subjective measures are 1) a modified Witmer and Singer Questionnaire and 2) the Social Presence Instrument (Bailenson, 2002b). The objective measure is a literature-based Social Presence Behavioral Coding sheet used to record the frequency of occurrences of factors of social presence. Finally, the physiological measure is heart rate as recorded by the MIO Alpha. The primary contribution of this study was support for the hypotheses that the affordance of movement in a mixed reality classroom has a positive impact on user perception and experience of a) physical presence and b) social presence in a VLE. These hypotheses were supported by all three measures. The secondary contribution of this research is the literature-based Social Presence Behavioral Coding sheet. The final contribution is a research framework that integrates subjective, objective, and physiological measures of social presence in one study.
This approach can be applied to user experience research across a range of VLEs. Finally, in addition to the general alignment of the physiological, objective, and subjective measures, there were anecdotal instances of factors of social presence occurring simultaneously with increased heart rate.
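The position-to-camera translation described in this abstract can be sketched in a few lines. This is only a toy illustration, not the TLE TeachLivE implementation: the function name, calibration origin, and scale factor are all hypothetical, and the Kinect SDK is replaced by a plain coordinate tuple.

```python
# Toy sketch: map a tracked head position (sensor frame, metres) to a
# virtual-camera position so that physical movement is mirrored in the
# virtual classroom. All names and constants are hypothetical.

def track_to_camera(head_pos, origin=(0.0, 0.0, 2.0), scale=1.0):
    """Translate a tracked head position into a virtual-camera position
    relative to a calibrated origin in the virtual scene."""
    x, y, z = head_pos
    ox, oy, oz = origin
    # Lateral and vertical motion map directly; depth (z) moves the
    # camera toward or away from the virtual students.
    return (ox + scale * x, oy + scale * y, oz - scale * z)

# Stepping 0.5 m toward the sensor moves the camera 0.5 m into the scene.
print(track_to_camera((0.0, 0.0, 0.5)))  # -> (0.0, 0.0, 1.5)
```

Because the mapping is a pure function of tracked position, proximity to individual virtual students falls out of the same transform.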
- Date Issued
- 2015
- Identifier
- CFE0006220, ucf:51061
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006220
- Title
- An SoS Conceptual Model, LVC Simulation Framework, and a Prototypical Implementation of Unmanned System Interventions for Nuclear Power Plant Disaster Preparedness, Response, and Mitigation.
- Creator
-
Davis, Matthew, Proctor, Michael, O'Neal, Thomas, Reilly, Charles, Sulfredge, C., Smith, Roger, University of Central Florida
- Abstract / Description
-
Nuclear power plant disasters can have severe and far-reaching consequences, so emergency managers and first responders, from utility owners to the DoD, must be prepared to respond and to mitigate effects, protecting the public and environment from further damage. Rapidly emerging unmanned systems promise significant improvement in the response to and mitigation of nuclear disasters. Models and simulations (M&S) may play a significant role in improving readiness and reducing risks through their use in planning, analysis, preparation training, and mitigation rehearsal for a wide spectrum of derivative scenarios. Legacy nuclear reactor M&S lack interoperability between themselves and avatar- or agent-based simulations of emergent unmanned systems. Bridging the gap between the past and the evolving future, we propose a conceptual model (CM) using a System of Systems (SoS) approach and a simulation federation framework capable of supporting concurrent and interoperating live, virtual, and constructive (LVC) simulation, and we demonstrate a prototypical implementation of an unmanned system intervention for a nuclear power plant disaster using the constructive simulation component. The SoS CM, LVC simulation framework, and prototypical implementation are generalizable to other preparedness, response, and mitigation scenarios. The SoS CM broadens the current stovepipe reactor-based simulations to a system-of-systems perspective. The framework enables distributed interoperating simulations with a network of legacy and emergent avatar and agent simulations. The unmanned system implementation demonstrates the feasibility of the SoS CM and LVC framework through replication of selected Fukushima events. Further, the system-of-systems approach advances life cycle stages including concept exploration, system design, engineering, training, and mission rehearsal. Live, virtual, and constructive component subsystems of the CM are described along with an explanation of input/output requirements.
Finally, applications to analysis and training, an evaluation of the SoS CM based on recently proposed criteria found in the literature, and suggestions for future research are discussed.
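The federation idea in this abstract, separate live, virtual, and constructive components exchanging state through a common interconnect, can be illustrated with a minimal publish/subscribe bus. This is a toy stand-in, not an HLA or DIS implementation; all topic names and payloads are hypothetical.

```python
# Minimal publish/subscribe bus illustrating how LVC federation
# components might interoperate. Toy sketch only; real federations use
# standards such as HLA (IEEE 1516) rather than an in-process bus.

from collections import defaultdict

class FederationBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every federate subscribed to the topic.
        for handler in self.subscribers[topic]:
            handler(event)

bus = FederationBus()
log = []
# A constructive reactor model publishes radiation readings; a virtual
# unmanned-system simulator consumes them to plan an intervention.
bus.subscribe("radiation", lambda e: log.append(("uas", e["mSv"])))
bus.publish("radiation", {"mSv": 12.0})
print(log)  # -> [('uas', 12.0)]
```

The design point is decoupling: legacy reactor simulations and emergent agent simulations only need to agree on topics and payloads, not on each other's internals.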
- Date Issued
- 2017
- Identifier
- CFE0006732, ucf:51879
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006732
- Title
- Calibrating a System Dynamic Model Within an Integrative Framework to Test Foreign Policy Choices.
- Creator
-
Kavetsky, Carlos, Morrow, Patricia Bockelman, Wiegand, Rudolf, Wu, Annie, Akbas, Ilhan, University of Central Florida
- Abstract / Description
-
Political science uses international relations (IR) theory to explain state-actor political behavior. Research suggests that this theoretical framework can inform a predictive model incorporating features of system dynamics (SD) and agent-based (AB) modeling. The Foreign Policy Model (ForPol) herein applies Alexander Y. Lubyansky's (2014) SD model for macro-political behavior to represent behaviors between real systems and mental models. While verifying and validating the resulting SD/AB/IR holistic model requires an extensive and comprehensive research agenda, the present work takes a closer look at input-parameter calibration and at conducting typical runs of the SD portion of the model as a first step in the testing, verification, and validation process of the proposed integrative model. This thesis proposes incorporating an AB paradigm drawn from work by Claudio Cioffi-Revilla (2009), Edward P. MacKerrow (2003), David L. Rousseau (2006), and Joshua M. Epstein and Robert Axtell (1996) as a future hybrid extension. The model applies an SD approach to the modeling of macro-political aggregate behavior. Therefore, the deep analysis of the SD portion of ForPol is modeled and calibrated in Vensim, using empirical data from the 1967 Arab-Israeli Six Day War as a pilot. Interactions within the model actualize Choucri et al.'s (2006) definition of state stability and the agent behavior aspects of Cioffi-Revilla's (2009) SimPol polity model. Following a discussion of the calibration results, the present work closes with consideration of future research directions.
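The core SD workflow this abstract describes, integrating a stock-and-flow model forward in time and calibrating a parameter against observed data, can be sketched without Vensim. Everything here is hypothetical: a single "stability" stock with a gap-closing flow, synthetic observations, and a grid-search calibration standing in for Vensim's optimizer.

```python
# Toy system-dynamics sketch: Euler integration of one stock, plus a
# one-parameter calibration by grid search against (synthetic) data.

def simulate(adjust_rate, stability0=0.5, target=1.0, steps=5, dt=1.0):
    stock = stability0
    trajectory = [stock]
    for _ in range(steps):
        flow = adjust_rate * (target - stock)   # gap-closing flow
        stock += dt * flow                      # Euler step
        trajectory.append(stock)
    return trajectory

# Synthetic "observed" trajectory (generated with rate = 0.3).
observed = [0.5, 0.65, 0.755, 0.8285, 0.87995, 0.915965]

def sse(rate):
    # Sum of squared errors between simulation and observations.
    return sum((s - o) ** 2 for s, o in zip(simulate(rate), observed))

best = min((round(r * 0.01, 2) for r in range(1, 100)), key=sse)
print(best)  # -> 0.3
```

Real SD calibration replaces the grid search with gradient-based or Powell-style optimization, but the loop structure, simulate, score, adjust, is the same.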
- Date Issued
- 2017
- Identifier
- CFE0006750, ucf:51848
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006750
- Title
- An Empirical Evaluation of an Instrument to Determine the Relationship Between Second-Year Medical Students' Perceptions of NERVE VP Design Effectiveness and Students' Ability to Learn and Transfer Skills from NERVE.
- Creator
-
Reyes, Ramsamooj, Hirumi, Atsusi, Sivo, Stephen, Campbell, Laurie, Cendan, Juan, University of Central Florida
- Abstract / Description
-
Meta-analyses and systematic reviews of literature comparing the use of virtual patients (VPs) to traditional educational methods support the efficacy of VPs (Cook, Erwin, & Triola, 2010; Cook & Triola, 2009; McGaghie, Issenberg, Cohen, Barsuk, & Wayne, 2011). However, VP design research has produced a variety of design features (Bateman, Allen, Samani, Kidd, & Davies, 2013; Botezatu, Hult, & Fors, 2010a; Huwendiek & De Leng, 2010), frameworks (Huwendiek et al., 2009b), and principles (Huwendiek et al., 2009a) that are similar in nature but appear to lack consensus. Consequently, researchers are unsure which VP design principles to apply, and few validated guidelines are available. To address this situation, Huwendiek et al. (2014) validated an instrument to evaluate the design of VP simulations that focus on fostering clinical reasoning. This dissertation examines the predictive validity of the instrument proposed by Huwendiek et al. (2014) that examines VP design features. Empirical research provides evidence for the reliability and validity of the VP design effectiveness measure. However, the relationship between the design features evaluated by the instrument and criterion-referenced measures of student learning and performance remains to be examined. This study examines the predictive validity of Huwendiek et al.'s (2014) VP design effectiveness measurement instrument by determining whether the design factors evaluated by the instrument are correlated with medical students' performance in: (a) quizzes and VP cases embedded in the Neurological Examination Rehearsal Virtual Environment (NERVE), and (b) NERVE-assisted virtual patient/standardized patient (VP/SP) differential diagnoses and SP checklists.
It was hypothesized that students' perceptions of the effectiveness of NERVE VP design are significantly correlated with the achievement of higher student learning and transfer outcomes in NERVE. The confirmatory factor analyses revealed that the effectiveness of NERVE VP design was significantly correlated with student learning and transfer. Significant correlations were found between key design features evaluated by the instrument and students' performance on quizzes and VP cases embedded in NERVE. In addition, significant correlations were found between the NERVE VP design factors evaluated by Huwendiek et al.'s (2014) instrument and students' performance on SP checklists. Findings provided empirical evidence supporting the reliability and predictive validity of Huwendiek et al.'s (2014) instrument. Future research should examine additional sources of validity for the instrument using larger samples drawn from other socio-cultural backgrounds, and continue to examine its predictive validity at Level 2 (Learning) and Level 3 (Application) of Kirkpatrick's (1975) four-level model of training evaluation.
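The predictive-validity check at the heart of this study is, computationally, a correlation between instrument scores and performance scores. The sketch below uses a hand-rolled Pearson correlation on synthetic numbers; the ratings and quiz scores are invented for illustration and do not come from the NERVE study.

```python
# Sketch: correlate perceived VP design-effectiveness ratings with a
# performance measure. Data are synthetic; the real study used the
# Huwendiek et al. (2014) instrument and NERVE quiz/checklist scores.

from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

design_ratings = [3.2, 4.1, 2.8, 4.5, 3.9]   # instrument scores (hypothetical)
quiz_scores    = [70,  85,  60,  92,  80]    # embedded-quiz performance

r = pearson_r(design_ratings, quiz_scores)
print(round(r, 3))  # -> 0.996
```

In practice one would also report the significance of r against the sample size, which is where the study's larger-sample recommendation matters.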
- Date Issued
- 2016
- Identifier
- CFE0006166, ucf:51150
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006166
- Title
- A Compiler-based Framework for Automatic Extraction of Program Skeletons for Exascale Hardware/Software Co-design.
- Creator
-
Rudraiah Dakshinamurthy, Amruth, Dechev, Damian, Heinrich, Mark, Deo, Narsingh, University of Central Florida
- Abstract / Description
-
The design of high-performance computing architectures requires performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a "program skeleton" that we discuss in this paper is an abstracted program derived from a larger program by removing source code that is determined to be irrelevant for the purposes of the skeleton. Extracting such a program skeleton from a large-scale parallel program by hand requires a substantial amount of manual effort and often introduces human errors. In this work, we develop a semi-automatic approach for extracting program skeletons based on compiler program analysis that reduces cost and eliminates errors inherent in manual approaches. We demonstrate the correctness of our skeleton extraction process by comparing details from communication traces, and we show the performance speedup of using skeletons by running simulations in the SST/macro simulator. Our skeleton generation approach is based on the extensible and open-source ROSE compiler infrastructure, which allows us to perform flow and dependency analysis on larger programs in order to determine what code can be removed from the program to generate a skeleton.
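Skeleton extraction of the kind described here, keeping communication-relevant statements and dropping local computation, can be illustrated with a toy AST transform. The real work uses ROSE flow and dependency analysis on C/C++ MPI codes; this Python `ast` version is only a sketch, and the `mpi_`-prefix rule and function names in `SOURCE` are hypothetical.

```python
# Toy skeleton extraction: keep only statements a coarse-grained
# performance study cares about (here, calls whose name starts with
# "mpi_") and drop everything else.

import ast

SOURCE = """
def step():
    x = heavy_local_compute()
    mpi_send(x)
    log_to_disk(x)
    mpi_recv()
"""

class Skeletonizer(ast.NodeTransformer):
    def visit_Expr(self, node):
        call = node.value
        if (isinstance(call, ast.Call)
                and isinstance(call.func, ast.Name)
                and call.func.id.startswith("mpi_")):
            return node          # keep communication calls
        return None              # drop irrelevant statement

    def visit_Assign(self, node):
        return None              # local computation is irrelevant here

tree = Skeletonizer().visit(ast.parse(SOURCE))
ast.fix_missing_locations(tree)
print(ast.unparse(tree))
```

The printed skeleton keeps only `mpi_send(x)` and `mpi_recv()` inside `step`; a real tool must additionally keep whatever the retained calls depend on, which is exactly what the dependency analysis provides.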
- Date Issued
- 2013
- Identifier
- CFE0004743, ucf:49795
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004743
- Title
- Combustion Instability Mechanism of a Reacting Jet in Cross Flow at Gas Turbine Operating Conditions.
- Creator
-
Pent, Jared, Kapat, Jayanta, Deng, Weiwei, Gordon, Ali, Vasu Sumathi, Subith, Martin, Scott, University of Central Florida
- Abstract / Description
-
Modern gas turbine designs often include lean premixed combustion for its emissions benefits; however, this type of combustion process is susceptible to self-excited combustion instabilities that can lead to damaging heat loads and system vibrations. This study focuses on identifying a mechanism of combustion instability of a reacting jet in cross flow, a flow feature that is widely used in the design of gas turbine combustion systems. Experimental results from a related study are used to validate and complement three numerical tools applied in this study: self-excited Large Eddy Simulations, 3D thermoacoustic modeling, and 1D instability modeling. Based on the experimental and numerical results, a mechanism was identified that includes a contribution from the jet-in-cross-flow impedance as well as an overall jet flame time lag. The jet impedance is simply a function of the acoustic properties of the geometry, while the flame time lag can be separated into jet velocity, equivalence ratio, and strain fluctuations, depending on the operating conditions and setup. For the specific application investigated in this study, it was found that the jet velocity and equivalence ratio fluctuations are important; however, the effect of the strain fluctuations on the heat release is minimal due to the high operating pressure. A mathematical heat release model was derived based on the proposed mechanism and implemented into a 3D thermoacoustic tool as well as a 1D instability tool. A three-point stability trend observed in the experimental data was correctly captured by the 3D thermoacoustic tool using the derived heat release model. Stability maps were generated with the 1D instability tool to demonstrate regions of stable operation that can be achieved as a function of the proposed mechanism parameters. The relative effect of the reacting jet in cross flow on the two dominant unstable modes was correctly captured in the stability maps.
While additional mechanisms for a reacting jet in cross flow are possible at differing flow conditions, the mechanism proposed in this study was shown to correctly replicate the stability trends observed in the experimental tests and provides a fundamental understanding that can be applied for combustion system design.
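The role of the flame time lag in this abstract can be illustrated with the classical time-lag ("n-tau") picture: a mode tends to be driven when the delayed heat-release fluctuation is in phase with pressure (Rayleigh criterion). This toy sketch is a generic illustration, not the dissertation's derived heat release model; the mode frequency and time-lag values are invented.

```python
# Toy time-lag (n-tau) stability indicator: the heat-release term drives
# an acoustic mode when n * cos(omega * tau) > 0 (in-phase coupling).

from math import cos, pi

def rayleigh_index(n_gain, tau_s, freq_hz):
    """Positive -> heat release feeds the mode (tendency to instability)."""
    omega = 2.0 * pi * freq_hz
    return n_gain * cos(omega * tau_s)

mode_hz = 200.0            # hypothetical combustor mode
for tau_ms in (0.5, 2.5):  # two candidate jet flame time lags (ms)
    idx = rayleigh_index(1.0, tau_ms * 1e-3, mode_hz)
    print(tau_ms, "unstable" if idx > 0 else "stable")
```

Sweeping tau (or the operating condition behind it) against frequency is, in miniature, how stability maps like those in the study are assembled.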
- Date Issued
- 2014
- Identifier
- CFE0005687, ucf:50154
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005687
- Title
- Mitigation of Motion Sickness Symptoms in 360° Indirect Vision Systems.
- Creator
-
Quinn, Stephanie, Rinalducci, Edward, Hancock, Peter, Mouloua, Mustapha, French, Jonathan, Chen, Jessie, Kennedy, Robert, University of Central Florida
- Abstract / Description
-
The present research attempted to use display design as a means to mitigate the occurrence and severity of symptoms of motion sickness and to increase performance through reduced "general effects" in an uncoupled motion environment. Specifically, several visual display manipulations of a 360° indirect vision system were implemented during a target detection task while participants were concurrently immersed in a motion simulator that mimicked off-road terrain completely separate from the target detection route. Results of a multiple regression analysis determined that the Dual Banners display incorporating an artificial horizon (i.e., AH Dual Banners) and perceived attentional control significantly contributed to the outcome of total severity of motion sickness, as measured by the Simulator Sickness Questionnaire (SSQ). Altogether, 33.6% (adjusted) of the variability in Total Severity was predicted by the variables used in the model. Objective measures were assessed prior to, during, and after uncoupled motion. These tests involved performance while immersed in the environment (i.e., target detection and situation awareness), as well as postural stability and cognitive and visual assessment tests (i.e., Grammatical Reasoning and Manikin) both before and after immersion. Response time for Grammatical Reasoning actually decreased after uncoupled motion; however, this was the only significant difference among all the performance measures. Assessment of subjective workload (as measured by the NASA-TLX) determined that participants in Dual Banners display conditions had a significantly lower level of perceived physical demand than those with Completely Separated display designs. Further, perceived temporal demand was lower for participants exposed to conditions incorporating an artificial horizon.
Subjective sickness (SSQ Total Severity, Nausea, Oculomotor, and Disorientation) was evaluated using non-parametric tests, which confirmed that the AH Dual Banners display had significantly lower Total Severity scores than the Completely Separated display with no artificial horizon (i.e., NoAH Completely Separated). Oculomotor scores were also significantly different for these two conditions, with lower scores associated with AH Dual Banners. The NoAH Completely Separated condition also had marginally higher Oculomotor scores when compared to the Completely Separated display incorporating the artificial horizon (AH Completely Separated). There were no significant differences in sickness symptoms or severity (measured by self-assessment, postural stability, and cognitive and visual tests) between display designs 30 and 60 minutes post-exposure. Further, the 30- and 60-minute post measures were not significantly different from baseline scores, suggesting that aftereffects were not present up to 60 minutes post-exposure. It was concluded that incorporating an artificial horizon into the Dual Banners display will be beneficial in mitigating symptoms of motion sickness in manned ground vehicles using 360° indirect vision systems. Screening for perceived attentional control will also be advantageous in situations where selection is possible. However, caution must be taken in generalizing these results to missions with terrain or vehicle speeds different from those used in this study, as well as to missions with longer immersion times.
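The SSQ subscale and Total Severity scores referenced throughout this abstract are weighted sums of item responses. The sketch below applies what are, to the best of my knowledge, the standard Kennedy et al. (1993) weights; the item sums fed in are invented for illustration.

```python
# Sketch of SSQ scoring: each subscale is a weighted sum of its raw item
# scores, and Total Severity weights the combined raw sums.
# Weights believed to follow Kennedy et al. (1993); verify before reuse.

def ssq_scores(nausea_sum, oculomotor_sum, disorientation_sum):
    return {
        "Nausea": 9.54 * nausea_sum,
        "Oculomotor": 7.58 * oculomotor_sum,
        "Disorientation": 13.92 * disorientation_sum,
        "Total Severity": 3.74 * (nausea_sum + oculomotor_sum
                                  + disorientation_sum),
    }

# Hypothetical raw item sums for one participant.
scores = ssq_scores(nausea_sum=3, oculomotor_sum=4, disorientation_sum=2)
print(round(scores["Total Severity"], 2))  # -> 33.66
```

Because the subscales share items and weights differ, a display condition can lower Oculomotor scores without moving Total Severity by the same margin, which matches the pattern of results reported above.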
- Date Issued
- 2013
- Identifier
- CFE0005047, ucf:49972
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005047
- Title
- Quantitative Scanning Transmission Electron Microscopy of Thick Samples and of Gold and Silver Nanoparticles on Polymeric Surfaces.
- Creator
-
Dutta, Aniruddha, Heinrich, Helge, Del Barco, Enrique, Chow, Lee, Chen, Bo, Kuebler, Stephen, University of Central Florida
- Abstract / Description
-
Transmission Electron Microscopy (TEM) is a reliable tool for chemical and structural studies of nanostructured systems. The shape, size, and volume of nanoparticles on surfaces play an important role in surface chemistry. As nanostructured surfaces become increasingly important for catalysis, protective coatings, optical properties, detection of specific molecules, and many other applications, different TEM techniques can be used to characterize the properties of nanoparticles on surfaces, providing a path toward predictability and control of these systems. This dissertation aims to provide a fundamental understanding of the surface chemistry of Electroless Metallization onto Polymeric Surfaces (EMPS) through characterization with TEM. The research focuses on a single EMPS system: deposition of Ag onto the cross-linked epoxide "SU8", where Au nanoparticles act as nucleation sites for the growth of Ag nanoparticles on the polymer surface. TEM cross sections were analyzed to investigate the morphology of the Au nanoparticles and to determine the thicknesses of the Ag nanoparticles and of the Ag layers. A method for the direct measurement of the volume and thickness of nanomaterials was developed in this project using High-Angle Annular Dark-Field (HAADF) Scanning Transmission Electron Microscopy (STEM). The morphology of the Au and Ag nanoparticles was studied to provide reliable statistics for 3-D characterization. Deposition rates were obtained as a function of metallization conditions by measuring the composition and thickness of the metal for EMPS. In the present work, a calibration method was used to quantify the sensitivity of the HAADF detector. For thin samples, a linear relationship between the HAADF signal and the thickness of a material is found.
Cross sections of multilayered samples provided by Triquint Semiconductors, FL, were analyzed as calibration standards with known composition in a TECNAI F30 transmission electron microscope to study the dependence of the HAADF detector signal on sample thickness and temperature. Dynamical diffraction processes play an important role in electron scattering at larger sample thicknesses, and the HAADF detector intensity is no longer linearly dependent on thickness for thick samples. This phenomenon involves several excitation processes, including Thermal Diffuse Scattering (TDS), which depends on temperature-dependent absorption coefficients. Multislice simulations were carried out in Python using the scattering parameters available in the literature and were compared with experimental results. Wedge-shaped Focused Ion Beam (FIB) samples were prepared for quantitative HAADF-STEM intensity measurements on several samples and compared with these simulations. The discrepancies between the simulated and experimental results were explained, and new sets of absorptive parameters were calculated that correctly account for the HAADF-STEM contrasts. A database of several pure elements was compiled to document the absorption coefficients and the fractions of scattered electrons per nanometer of sample. In addition, the wedge-shaped FIB samples were used to study the HAADF-STEM contrasts at the interface of a high-density and a low-density material. The use of thick samples reveals an increased signal at such interfaces. This effect can be explained by the transfer of scattered electrons from the high-density material across the interface into the less-absorbing low-density material. A ballistic scattering model, implemented in Python, is proposed here for the HAADF-STEM contrasts at interfaces in thick materials. The simulated HAADF-STEM signal is compared with experimental data to illustrate this phenomenon.
A detailed understanding of atomic-number contrast in thick samples is developed based on the combination of quantitative experimental HAADF-STEM and simulated scattering. This approach is used to describe the observed features of Ag deposition on SU-8 polymers.
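The thin-sample calibration step described above amounts to fitting a line to HAADF intensity versus known thickness and then inverting it. The sketch below does exactly that with synthetic numbers; the data points and units are hypothetical, not measurements from the TECNAI F30 standards.

```python
# Sketch of HAADF thin-sample calibration: fit I = a*t + b by least
# squares, then invert to read thickness from a measured intensity.

def fit_line(ts, Is):
    n = len(ts)
    mt, mi = sum(ts) / n, sum(Is) / n
    a = sum((t - mt) * (i - mi) for t, i in zip(ts, Is)) / \
        sum((t - mt) ** 2 for t in ts)
    return a, mi - a * mt               # slope, intercept

thickness_nm = [20, 40, 60, 80]         # calibration wedge steps (synthetic)
intensity    = [0.11, 0.21, 0.31, 0.41] # detector signal (arb. units)

a, b = fit_line(thickness_nm, intensity)
unknown = (0.26 - b) / a                # invert: intensity -> thickness
print(round(unknown, 1))  # -> 50.0
```

The dissertation's point is precisely where this breaks down: for thick samples the linear model fails, and absorptive multislice parameters must replace the simple inversion.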
- Date Issued
- 2014
- Identifier
- CFE0005485, ucf:50333
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005485
- Title
- A New Multidimensional Psycho-Physical Framework for Modeling Car-Following in a Freeway Work Zone.
- Creator
-
Lochrane, Taylor, Al-Deek, Haitham, Radwan, Essam, Oloufa, Amr, Harb, Rami, Uddin, Nizam, University of Central Florida
- Abstract / Description
-
As the United States continues to build and repair its aging highway infrastructure, freeway work zones will continue to impact capacity. To predict the capacity of a freeway work zone, engineers have several evaluation tools available, but only microsimulation has the ability to simulate driver behavior. One limitation of current car-following models is that they account for only one overall behavioral condition. This dissertation hypothesizes that drivers change their driving behavior as they drive through a freeway work zone compared to normal freeway conditions, which has the potential to impact the traffic operations and capacity of work zones. Psycho-physical car-following models are widely used in practice for simulating car-following; however, current simulation models may not fully capture car-following driver behavior specific to freeway work zones. This dissertation presents a new multidimensional psycho-physical framework for modeling car-following based on statistical evaluation of work zone and non-work zone driver behavior. The new framework is close in character to the Wiedemann model used in popular traffic simulation software such as VISSIM. This dissertation used two methodologies for collecting data: (1) a questionnaire to collect demographics and work zone behavior data, and (2) real-time vehicle data from a field experiment involving human participants. It is hypothesized that the parameters needed to calibrate the multidimensional framework for work zone driver behavior can be derived statistically using data collected from runs of an Instrumented Research Vehicle (IRV) in a Living Laboratory (LL) along a roadway. The design of this LL included the development of an IRV to capture the natural car-following response of a driver when entering and passing through a freeway work zone.
The development of a Connected Mobile Traffic Sensing (CMTS) system, which included state-of-the-art ITS technologies, supports the LL environment by providing the connectivity, interoperability, and data processing of a natural, real-life setting. The IRV and CMTS system are tools designed to support the concept of an LL, which provides the experimental environment to capture and calibrate natural driver behavior. The objective was to have participants drive the instrumented vehicle while collecting the relative distance and relative velocity between the instrumented vehicle and the vehicle in front of it. A Phase I pilot test was conducted with 10 participants to evaluate the experiment and make adjustments prior to the full Phase II driver test. The Phase II driver test recruited a group of 64 participants to drive the IRV through an LL set up along a work zone on I-95 near Washington, D.C. in order to validate this hypothesis. In this dissertation, a new framework was applied, demonstrating that there are four different categories of car-following behavior models, each with different parameter distributions. The four categories are divided by traffic condition (congested vs. uncongested) and by roadway condition (work zone vs. non-work zone). The calibrated threshold values are presented for each of these four categories. By applying this new multidimensional framework, modeling of car-following behavior can enhance vehicle behavior in microsimulation modeling. This dissertation also explored driver behavior by combining vehicle data and survey techniques to augment the model calibrations and improve the understanding of car-following behavior in freeway work zones. The results identify a set of survey questions that can potentially guide the selection of parameters for car-following models.
The findings presented in this dissertation can be used to improve the performance of driver behavior models specific to work zones. This in turn will more accurately forecast the impact a work zone design has on capacity during congestion.
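A psycho-physical car-following model of the kind described above switches between driving regimes when perceptual thresholds on relative distance and relative speed are crossed. The sketch below illustrates that regime-switching idea only; the threshold names and values are illustrative placeholders, not the calibrated work-zone parameters reported in the dissertation.

```python
# Minimal sketch of a psycho-physical (Wiedemann-style) car-following
# regime classifier. All threshold values are hypothetical placeholders.

def classify_regime(dx, dv, ax_free=150.0, sdv_slope=0.05, sdx_c=25.0):
    """Classify the following regime from relative distance dx (m) and
    relative velocity dv (m/s, positive when closing on the leader)."""
    if dx < sdx_c:
        return "emergency"        # below the minimum safe following distance
    if dx > ax_free:
        return "free_driving"     # leader too far away to influence the driver
    sdv = sdv_slope * dx          # perception threshold grows with distance
    if dv > sdv:
        return "approaching"      # driver perceives closing speed and reacts
    return "following"            # unconscious following within thresholds
```

A calibrated model would fit distinct threshold parameters for each of the four categories (congested/uncongested crossed with work zone/non-work zone) from the IRV trajectory data.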
- Date Issued
- 2014
- Identifier
- CFE0005521, ucf:50326
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005521
- Title
- Modeling of Socio-Economic Factors and Adverse Events In an Active War Theater By Using a Cellular Automata Simulation Approach.
- Creator
-
Bozkurt, Halil, Karwowski, Waldemar, Lee, Gene, Thompson, William, Mikusinski, Piotr, University of Central Florida
- Abstract / Description
-
The Department of Defense (DoD) implemented the Human Social Cultural and Behavior (HSCB) program to meet the need to develop the capability to understand, predict, and shape human behavior among different cultures by developing a knowledge base, building models, and creating training capacity. This capability will allow decision makers to subordinate kinetic operations and promote non-kinetic operations to better govern economic programs in order to initiate efforts and development that address the grievances of those affected by adverse events. These non-kinetic operations include rebuilding indigenous institutions, fostering bottom-up economic activity, and constructing necessary infrastructure, since success in non-kinetic operations depends on understanding and using the social and cultural landscape. This study aims to support decision makers by building a computational model to understand economic factors and their effect on adverse events. In this dissertation, the analysis demonstrates that the use of cellular automata makes several significant contributions: supporting decision makers in allocating development funds to stabilize regions with higher adverse event risks, and improving understanding of the complex socio-economic interactions with adverse events. The analysis was performed on a set of spatial data representing social and economic factors. In studying behavior using cellular automata, cells in the same neighborhood synchronously interact with each other to determine their next states, and small changes in one iteration may yield complex formations of adverse event risk after several iterations. The cellular automata modeling methodology for social and economic analysis in this research was designed at two major implementation levels: macro and micro. At the macro level, the modeling framework integrates population, social, and economic sub-systems.
The macro level allows the model to use regionalized representations, while the micro-level analyses help to explain why the events have occurred. Macro-level subsystems support cellular automata rules to generate accurate predictions. The prediction capability of cellular automata is used to model the micro-level interactions between individual actors, which are represented by adverse events. The results of this dissertation demonstrate that the cellular automata model is capable of evaluating socio-economic influences that result in changes in adverse events and of identifying the location, time, and impact of these events. Secondly, this research indicates that the socio-economic influences have different levels of impact on adverse events, defined by the number of people killed, wounded, or hijacked. Thirdly, this research shows that the socio-economic influences and adverse events that occurred in a given district have impacts on adverse events that occur in neighboring districts. The cellular automata modeling approach can be used to enhance the capability to understand and use human, social, and behavioral factors by generating what-if scenarios to determine the impact of different infrastructure development projects and to predict adverse events. Lastly, adverse events that could occur in upcoming years can be predicted, allowing decision makers to deter these events or plan accordingly if they do occur.
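The synchronous neighborhood update at the heart of a cellular automaton can be sketched as follows. The update rule here (risk drifting toward the Moore-neighborhood average) is a made-up example for illustration, not the calibrated socio-economic rules developed in the dissertation.

```python
# Illustrative synchronous cellular-automata update on a 2-D district grid.
# Each cell holds an adverse-event risk level; every cell's next state is
# computed from the same snapshot of the grid (synchronous update).

def step(grid):
    """One CA iteration over a list-of-lists grid of floats."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Moore neighborhood: all adjacent cells, clipped at the borders
            neigh = [grid[rr][cc]
                     for rr in range(max(0, r - 1), min(rows, r + 2))
                     for cc in range(max(0, c - 1), min(cols, c + 2))
                     if (rr, cc) != (r, c)]
            avg = sum(neigh) / len(neigh)
            # example rule: risk drifts halfway toward the neighborhood average
            nxt[r][c] = 0.5 * grid[r][c] + 0.5 * avg
    return nxt
```

Because all cells read the pre-update snapshot, a local risk spike spreads to neighboring districts over successive iterations, which is the mechanism the abstract describes for cross-district influence.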
- Date Issued
- 2013
- Identifier
- CFE0004820, ucf:49719
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004820
- Title
- Real-time traffic safety evaluation models and their application for variable speed limits.
- Creator
-
Yu, Rongjie, Abdel-Aty, Mohamed, Radwan, Ahmed, Madani Larijani, Kaveh, Ahmed, Mohamed, Wang, Xuesong, University of Central Florida
- Abstract / Description
-
Traffic safety has become a primary concern in the transportation area. Crashes have caused extensive human and economic losses. With the objective of reducing crash occurrence and alleviating crash injury severity, major efforts have been dedicated to revealing the hazardous factors that affect crash occurrence at both the aggregate level (targeting crash frequency per segment, intersection, etc.) and the disaggregate level (analyzing each crash event). Aggregate traffic safety studies, mainly developing safety performance functions (SPFs), are conducted for the purpose of unveiling crash contributing factors at locations of interest. Results of aggregate traffic safety studies can be used to identify crash hot spots, calculate crash modification factors (CMFs), and improve geometric characteristics. Aggregate analyses mainly focus on discovering the hazardous factors related to the frequency of total crashes, of a specific crash type, or of each crash severity level. Disaggregate studies, in contrast, benefit from reliable surveillance systems that provide detailed real-time traffic and weather data. This information can help capture the micro-level influences of the hazardous factors that might lead to a crash. Disaggregate traffic safety models, also called real-time crash risk evaluation models, can be used to monitor crash hazardousness as real-time field data are fed in. One potential use of real-time crash risk evaluation models is to develop Variable Speed Limits (VSL) as part of a freeway management system. Models have been developed to predict crash occurrence in order to proactively improve traffic safety and prevent crashes. In this study, first, aggregate safety performance functions were estimated to unveil the different risk factors affecting crash occurrence for a mountainous freeway section.
Then, disaggregate real-time crash risk evaluation models were developed for total crashes with both machine learning and hierarchical Bayesian models. Considering the need to analyze both aggregate and disaggregate aspects of traffic safety, systematic multi-level traffic safety studies were conducted for single- and multi-vehicle crashes, and for weekday and weekend crashes. Finally, the feasibility of utilizing a VSL system to improve traffic safety on freeways was investigated. This research was conducted based on data obtained from a 15-mile mountainous freeway section on I-70 in Colorado. The data contain historical crash data, roadway geometric characteristics, real-time weather data, and real-time traffic data. Real-time weather data were recorded by 6 weather stations installed along the freeway section, while the real-time traffic data were obtained from Remote Traffic Microwave Sensor (RTMS) radars and Automatic Vehicle Identification (AVI) systems. Different datasets were formulated from the various data sources and prepared for the multi-level traffic safety studies. In the aggregate traffic safety investigation, safety performance functions were developed to identify hazardous factors for crash occurrence. For the first time, real-time weather and traffic data were used in SPFs. An ordinary Poisson model and random effects Poisson models with a Bayesian inference approach were employed to reveal the effects of weather- and traffic-related variables on crash occurrence. Two scenarios were considered: one seasonal-based case and one crash-type-based case. The Deviance Information Criterion (DIC) was utilized as the comparison criterion, and the correlated random effects Poisson models outperformed the others. Results indicate that weather condition variables, especially precipitation, play a key role in the safety performance functions.
Moreover, in order to compare with the correlated random effects Poisson model, a Multivariate Poisson model and a Multivariate Poisson-lognormal model were estimated. Conclusions indicate that, instead of assuming identical random effects for homogeneous segments, considering the correlation effects between the two count variables results in better model fit. Results from the aggregate analyses shed light on policy implications for reducing crash frequencies. For the studied roadway segment, crash occurrence in the snow season has clear trends associated with adverse weather situations (poor visibility and large amounts of precipitation); weather warning systems can be employed to improve road safety during the snow season. Furthermore, different traffic management strategies should be developed according to the distinct seasonal influence factors. In particular, sites with steep slopes need more attention from the traffic management center and operators, especially during snow seasons, to control the excess crash occurrence. Moreover, distinct freeway management strategies should be designed to address the differences between single- and multi-vehicle crash characteristics. In addition to developing safety performance functions with various modeling techniques, this study also investigated four different approaches for developing informative priors for the independent variables. The Bayesian inference framework provides a complete and coherent way to balance empirical data and prior expectations; the merits of these informative priors were tested along with two types of Bayesian hierarchical models (Poisson-gamma and Poisson-lognormal models). The Deviance Information Criterion, R-square values, and coefficients of variance for the estimations were utilized as evaluation measures to select the best model(s). Comparisons across the models indicate that the Poisson-gamma model is superior, with a better model fit, and is much more robust with the informative priors.
Moreover, the two-stage Bayesian updating informative priors provided the best goodness-of-fit and coefficient estimation accuracies. In addition to the aggregate analyses, real-time crash risk evaluation models were developed to identify crash contributing factors at the disaggregate level. Support Vector Machine (SVM), a recently proposed statistical learning model, and hierarchical Bayesian logistic regression models were introduced to evaluate real-time crash risk. A classification and regression tree (CART) model was developed to select the most important explanatory variables. Based on the variable selection results, Bayesian logistic regression models and SVM models with different kernel functions were developed. Model comparisons based on receiver operating characteristic (ROC) curves demonstrate that the SVM model with a radial basis kernel function outperforms the others. Results from the models demonstrate that crashes are likely to happen during congestion periods (especially when the queuing area has propagated from the downstream segment); high variation of occupancy and/or volume increases the probability of crash occurrence. Moreover, the effects of microscopic traffic, weather, and roadway geometric factors on the occurrence of specific crash types were investigated. Crashes were categorized as rear-end, sideswipe, and single-vehicle crashes. AVI segment average speed, real-time weather data, and roadway geometric characteristics data were utilized as explanatory variables. Conclusions from this study imply that different active traffic management (ATM) strategies should be designed for three- and two-lane roadway sections, also considering the seasonal effects. Based on the abovementioned results, real-time crash risk evaluation models were developed separately for multi-vehicle and single-vehicle crashes, and for weekday and weekend crashes.
Hierarchical Bayesian logistic regression models (random effects and random parameter logistic regression models) were introduced to address seasonal variations, crash-unit-level diversity, and unobserved heterogeneity caused by geometric characteristics. For multi-vehicle crashes: congested conditions downstream contribute to an increase in the likelihood of multi-vehicle crashes; multi-vehicle crashes are more likely to occur during poor visibility conditions and when a turbulent area exists downstream. Drivers who are unable to reduce their speeds in time are prone to causing rear-end crashes. For single-vehicle crashes: slow-moving traffic platoons at the downstream detector of the crash occurrence location increase the probability of single-vehicle crashes; large variations of occupancy downstream also increase the likelihood of single-vehicle crash occurrence. Substantial efforts have been dedicated in this study to revealing the hazardous factors that affect crash occurrence at both the aggregate and disaggregate levels; however, findings and conclusions from this research need to be transferred into applications for roadway design and freeway management. This study further investigated the feasibility of utilizing a Variable Speed Limits (VSL) system, one key part of ATM, to improve traffic safety on freeways. A proactive traffic safety improvement VSL control algorithm is proposed. First, an extension of the traffic flow model METANET was employed to predict traffic flow while considering VSL's impacts on the flow-density diagram; a real-time crash risk evaluation model was then estimated for the purpose of quantifying crash risk; finally, the optimal VSL control strategies were achieved by employing an optimization technique that minimizes the total predicted crash risk along the VSL implementation area.
Constraints were set up to limit the increase of the average travel time and the differences between posted speed limits temporally and spatially. The proposed VSL control strategy was tested for a mountainous freeway bottleneck area in the microscopic simulation software VISSIM. Safety impacts of the VSL system were quantified as crash risk improvements and speed homogeneity improvements. Moreover, three different driver compliance levels were modeled in VISSIM to monitor the sensitivity of VSL's safety impacts to driver compliance levels. Conclusions demonstrate that the proposed VSL system can effectively improve traffic safety by decreasing crash risk, enhancing speed homogeneity, and reducing travel time under both high and moderate driver compliance levels, while the VSL system does not have significant effects on traffic safety enhancement under the low compliance scenario. Future implementations of VSL control strategies and related research topics are also discussed.
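A real-time crash risk evaluation model of the logistic kind described above maps traffic and weather conditions to a crash probability through a logit link. The sketch below shows that structure only; the predictor names and coefficient values are hypothetical placeholders, not the estimates reported in the dissertation.

```python
import math

# Illustrative real-time crash-risk score with a logistic link.
# Coefficients b0..b3 are made-up placeholders for illustration.

def crash_risk(speed_var, occupancy, precipitation,
               b0=-4.0, b1=0.08, b2=0.05, b3=0.9):
    """Return an illustrative P(crash-prone condition) given speed
    variation, detector occupancy (%), and precipitation intensity."""
    z = b0 + b1 * speed_var + b2 * occupancy + b3 * precipitation
    return 1.0 / (1.0 + math.exp(-z))
```

In a VSL setting, a controller would evaluate such a score over the predicted traffic state for each candidate speed limit and pick the limits that minimize total predicted risk, subject to travel-time and speed-difference constraints.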
- Date Issued
- 2013
- Identifier
- CFE0005283, ucf:50556
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005283
- Title
- Instructional Strategies for Scenario-Based Training of Human Behavior Cue Analysis with Robot-Aided Intelligence, Surveillance, Reconnaissance.
- Creator
-
Salcedo, Julie, Lackey, Stephanie, Reinerman, Lauren, Barber, Daniel, Kincaid, John, Matthews, Gerald, University of Central Florida
- Abstract / Description
-
The U.S. Army desires to improve safety during Intelligence, Surveillance, Reconnaissance (ISR) operations by removing Warfighters from direct line-of-fire through enhancing ISR operational capabilities with unmanned systems, also known as Robot-Aided ISR (RAISR) (DOD, 2013). Additionally, RAISR presents an opportunity to fulfill ISR capability requirements of modern combat environments, including detection of High-Value Individuals (HVI) from safer distances, identification of baseline behavior, and interpretation of adversarial intent (U.S. Army, 2008). Along with the demand and projected acquisition of RAISR technology, there is the added need to design training requirements for system operation and task execution instruction. While documentation identifying specific training standards and objectives for ISR tasks utilizing unmanned systems is limited (DOD, 2013), simulation-based training has been identified as a critical training medium for RAISR (U.S. Army, 2008). ISR analysts will primarily conduct RAISR tasks via Indirect Vision Displays (IVD), which transition well into multimodal simulations (Salcedo, Lackey, & Maraj, 2014). However, simulation alone may not fulfill the complex training needs of RAISR tasks; therefore, incorporating instructional support may improve the effectiveness of training (Oser, Gualtieri, Cannon-Bowers, & Salas, 1999). One method to accomplish this is to utilize a Scenario-Based Training (SBT) framework enhanced with instructional strategies to target specific training objectives. The purpose of the present experiment was to assess the effectiveness of SBT enhanced with selected instructional strategies for a PC-based RAISR training simulation. The specific task type was the identification of HVIs within a group through behavior cue analysis.
The instructional strategies assessed in this experiment, Highlighting and Massed Exposure, have been shown to improve attentional weighting, visual search, and pattern recognition skills, which are critical for successful behavior cue analysis. Training effectiveness was evaluated by analyzing the impact of the instructional strategies on performance outcomes, including detection accuracy, classification accuracy, and median response time, and on perceptions of the level of engagement, immersion, and presence during training exercises. Performance results revealed that the Massed Exposure strategy produced significantly faster response times for one subtle and one familiar target behavior cue. Perception results indicated that Highlighting was the least challenging instructional strategy and that the Control offered the preferred level of challenge. The relationships between performance and perception measures revealed that higher levels of engagement, immersion, and presence were associated with better performance in the Control condition, but this trend did not always hold for Massed Exposure and Highlighting. Furthermore, presence emerged as the primary predictor of performance for select target behavior cues in the Control and Massed Exposure conditions, while immersion and engagement predicted performance for select cues in the Highlighting condition. The findings of the present experiment point to the potential benefit of SBT instructional strategies for improving the effectiveness of simulation-based training for behavior cue analysis during RAISR operations. Specifically, the findings suggest that the Massed Exposure strategy has the potential to improve response time when detecting both familiar and novel targets. The results also highlight directions for future research to investigate methods of altering instructional strategy design and delivery in order to improve trainee perceptions of the instruction.
- Date Issued
- 2014
- Identifier
- CFE0005705, ucf:50151
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005705
- Title
- A Methodology on Weapon Combat Effectiveness Analytics using Big Data and Live, Virtual, or/and Constructive Simulations.
- Creator
-
Jung, Won Il, Lee, Gene, Rabelo, Luis, Elshennawy, Ahmad, Ahmad, Ali, University of Central Florida
- Abstract / Description
-
Weapon Combat Effectiveness (WCE) analytics is very expensive, time-consuming, and dangerous in the real world, because data must be created from real operations with many people and weapons in the actual environment. Modeling and Simulation (M&S) techniques are used to overcome these limitations. Although the era of big data has emerged and achieved a great deal of success in a variety of fields, most WCE research using Defense Modeling and Simulation (DM&S) techniques was conducted without the help of big data technologies and techniques. The existing research has not considered the various factors affecting WCE, because it has been restricted to constructive simulation, a single weapon system, and limited scenarios. Therefore, WCE analytics using existing methodologies incorporates the same limitations and cannot help but produce biased results. To solve this problem, this dissertation first reviews and composes the basic knowledge for a new WCE analytics methodology using big data and DM&S, to serve as a stepping-stone for future research by interested researchers. This dissertation then presents a new methodology for WCE analytics using big data generated by Live, Virtual, or/and Constructive (LVC) simulations. This methodology can increase the fidelity of WCE analytics results by considering various factors. It can create opportunities for application in weapon acquisition, operations analytics and planning, and the development of objective levels for each training factor for weapon operators, according to the selection of Measures of Effectiveness (MOEs) and Measures of Performance (MOPs), or impact factors, based on the analytics goal.
- Date Issued
- 2018
- Identifier
- CFE0007025, ucf:52870
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007025
- Title
- An Agile Roadmap for Live, Virtual and Constructive-Integrating Training Architecture (LVC-ITA): A Case Study Using a Component based Integrated Simulation Engine.
- Creator
-
Park, Tae Woong, Lee, Gene, Rabelo, Luis, Elshennawy, Ahmad, Kincaid, John, University of Central Florida
- Abstract / Description
-
Conducting seamless Live Virtual Constructive (LVC) simulation remains the most challenging issue in Modeling and Simulation (M&S). There is a lack of interoperability, limited reuse, and loose integration between the Live, Virtual, and/or Constructive assets across multiple Standard Simulation Architectures (SSAs). There have been various theoretical research endeavors aimed at solving these problems, but their solutions resulted in complex and inflexible integration, long user-usage time, and high cost for LVC simulation. The goal of this research is to provide an Agile Roadmap for the Live Virtual Constructive-Integrating Training Architecture (LVC-ITA) that addresses the above problems and introduces interoperable LVC simulation. Therefore, this research describes how the newest M&S technologies can be utilized for LVC simulation interoperability and integration. We then examine the optimal procedure for developing an agile roadmap for the LVC-ITA. In addition, this research illustrates a case study using the Adaptive distributed parallel Simulation environment for Interoperable and reusable Model (AddSIM), a component-based integrated simulation engine. The agile roadmap of the LVC-ITA, which reflects the lessons learned from the case study, will help guide M&S communities along an efficient path to increase the interaction of M&S simulation across systems.
- Date Issued
- 2015
- Identifier
- CFE0005682, ucf:52867
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005682
- Title
- Effluent Water Quality Improvement Using Silt Fences and Stormwater Harvesting.
- Creator
-
Gogo-Abite, Ikiensinma, Chopra, Manoj, Wanielista, Martin, Nam, Boo Hyun, Weishampel, John, University of Central Florida
- Abstract / Description
-
Construction sites are among the most common areas to experience soil erosion and sediment transport due to mandatory foundation tasks such as excavation and land grubbing. Thus, temporary sediment barriers are installed along the perimeter to prevent sediment transport from the site. Erosion and sediment transport control measures may include, but are not limited to, physical and chemical processes such as the use of a silt fence and a polyacrylamide product. Runoff from construction sites and other impervious surfaces is routinely discharged into ponds for treatment before being released into a receiving water body. Stormwater harvesting from a pond for irrigation of adjacent lands is promoted as one approach to reducing pond discharge while supplementing valuable potable water used for irrigation. The reduction of pond discharge reduces the mass of pollutants in the discharge. This dissertation presents an investigation of the effectiveness of temporary sediment barriers and then develops a modeling approach for a stormwater harvesting pond, providing a comprehensive stormwater management pollution reduction assessment tool. The first part of the research investigates the performance efficiencies of silt fence fabrics in turbidity and sediment concentration removal, and determines flow-through rates on simulated construction sites in real time. Two silt fence fabrics, one woven and one nonwoven, were subjected to material index property tests and a series of field-scale tests with different rainfall intensities and events for different embankment slopes on a tilting test bed. Collected influent and effluent samples were analyzed for sediment concentration and turbidity, and the flow-through rate for each fabric was evaluated. Test results revealed that the woven and nonwoven silt fences achieved 11 and 56 percent average turbidity reduction efficiency, respectively.
Each fabric also achieved 20 and 56 percent average sediment concentration removal efficiency, respectively. Fabric flow-through rates were functions of the rainfall intensity and embankment slope. The nonwoven fabric exhibited higher flow-through rates than the woven fabric in both field-scale and laboratory tests. In the second part of the study, a Stormwater Harvesting and Assessment for Reduction of Pollution (SHARP) model was developed to predict the operation of a wet pond used for stormwater harvesting. The model integrates the interaction of surface water and groundwater in a catchment area. The SHARP model was calibrated and validated with actual pond water elevation data from a stormwater pond at Miramar Lakes, Miramar, Florida. Model evaluation showed adequate prediction of pond water elevation, with root mean square error between 0.07 and 0.12 m, mean absolute error between 0.018 and 0.07 m, and relative index of agreement between 0.74 and 0.98 for both calibration and validation periods. The SHARP model is capable of assessing harvesting safe yield and discharge from a pond, including predicting the percentage of runoff into a harvesting pond that is not discharged. The combination of a silt fence and/or polyacrylamide (PAM) before a stormwater harvesting pond in a treatment train for the reduction of pollutants from construction sites has the potential to significantly exceed the performance standard of 85 percent reduction typically required by local authorities. In fact, the stringent requirement of equaling pre- and post-development pollutant loading is highly achievable with the treatment train approach. The significant contribution of integrating the SHARP model into the treatment train is that real-time assessment of pollutant loading reduction by volume can be planned and controlled to achieve target performance standards.
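The calibration metrics cited above (root mean square error, mean absolute error, and index of agreement) can be computed from observed and predicted pond elevations as sketched below. Willmott's form of the index of agreement is used here as one common definition; the dissertation reports a relative variant, so this is an illustration of the metric family rather than the exact formula used.

```python
import math

# Goodness-of-fit metrics for comparing predicted vs. observed pond
# water elevations: RMSE, MAE, and Willmott's index of agreement d.

def fit_metrics(obs, pred):
    n = len(obs)
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(obs, pred)) / n)
    mae = sum(abs(p - o) for o, p in zip(obs, pred)) / n
    obar = sum(obs) / n
    # d = 1 - SSE / potential error; d = 1 means perfect agreement
    denom = sum((abs(p - obar) + abs(o - obar)) ** 2 for o, p in zip(obs, pred))
    sse = sum((p - o) ** 2 for o, p in zip(obs, pred))
    d = 1.0 - sse / denom if denom else 1.0
    return rmse, mae, d
```

RMSE penalizes large elevation errors more heavily than MAE, while d rescales the error against observed variability, which is why all three are reported together when judging a hydrologic model.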
- Date Issued
- 2012
- Identifier
- CFE0004539, ucf:49244
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004539
- Title
- MODELING AUTONOMOUS AGENTS IN MILITARY SIMULATIONS.
- Creator
-
Kaptan, Varol, Gelenbe, Erol, University of Central Florida
- Abstract / Description
-
Simulation is an important tool for prediction and assessment of the behavior of complex systems and situations. The importance of simulation has increased tremendously during the last few decades, mainly because the rapid pace of development in the field of electronics has turned the computer from a costly and obscure piece of equipment into a cheap, ubiquitous tool that is now an integral part of our daily lives. While such technological improvements make it easier to analyze well-understood deterministic systems, increases in speed and storage capacity alone are not enough when simulating situations where human beings and their behavior are an integral part of the system being studied. The problem with simulating intelligent entities is that intelligence is still not well understood, and it seems that the field of Artificial Intelligence (AI) has a long way to go before computers think like humans. Behavior-based agent modeling was proposed in the mid-1980s as one of the alternatives to the classical AI approach. While it has been used mainly for the control of specialized robotic vehicles with very specific sensory capabilities and limited intelligence, we believe that a behavior-based approach to modeling generic autonomous agents in complex environments can provide promising results. To this end, we investigate a behavior-based model for controlling groups of collaborating and competing agents in a geographic terrain. In this thesis, we focus on scenarios of a military nature, where agents can move within the environment and adversaries can eliminate each other through the use of weapons. Different aspects of agent behavior, like navigation to a goal or staying in group formation, are implemented by distinct behavior modules, and the final observed behavior for each agent is an emergent property of the combination of simple behaviors and their interaction with the environment.
Our experiments show that while such an approach is quite efficient in terms of computational power, it has some major drawbacks. One problem is that reactive behavior-based navigation algorithms are not well suited for environments with complex mobility constraints, where they tend to perform much worse than proper path planning. This is an important research question, especially considering that most modern military conflicts and operations occur in urban environments. One contribution of this thesis is a novel approach to reactive navigation in which goals and terrain information are fused, based on the idea of transforming a terrain with obstacles into a virtual obstacle-free terrain. Experimental results show that our approach can successfully combine the low run-time computational complexity of reactive methods with the high success rates of classical path planning. Another interesting research problem is how to deal with the unpredictable nature of emergent behavior. It is not uncommon for an outcome to diverge significantly from the intended behavior of the agents due to highly complex nonlinear interactions with other agents or the environment itself. The chances of devising a formal way to predict and avoid such abnormalities are slim at best, mostly because such complex systems tend to be chaotic in nature. Instead, we focus on detecting deviations by tracking group behavior, which is a key component of the total situation awareness capability required by modern technology-oriented and network-centric warfare. We have designed a simple and efficient clustering algorithm for tracking groups of agents, suitable for both the spatial and behavioral domains. We also show how to detect certain events of interest based on a temporal analysis of the evolution of the discovered clusters.
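The emergent-behavior idea described in this abstract can be illustrated with a minimal sketch that blends two behavior modules (goal seeking and obstacle avoidance) into a single steering vector, in the style of potential-field reactive navigation. All function names, weights, and radii below are assumptions for illustration, not the thesis's actual implementation:

```python
import math

def seek_goal(pos, goal):
    """Behavior module 1: unit vector from the agent toward the goal."""
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    d = math.hypot(dx, dy) or 1.0
    return (dx / d, dy / d)

def avoid_obstacles(pos, obstacles, radius=5.0):
    """Behavior module 2: sum of repulsive vectors from nearby obstacles,
    weighted more heavily the closer the obstacle is."""
    rx = ry = 0.0
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < radius:
            w = (radius - d) / radius
            rx += w * dx / d
            ry += w * dy / d
    return (rx, ry)

def step(pos, goal, obstacles, speed=1.0, k_avoid=2.0):
    """One time step: the observed motion emerges from a weighted
    combination of the simple behaviors, not from explicit planning."""
    gx, gy = seek_goal(pos, goal)
    ax, ay = avoid_obstacles(pos, obstacles)
    vx, vy = gx + k_avoid * ax, gy + k_avoid * ay
    n = math.hypot(vx, vy) or 1.0
    return (pos[0] + speed * vx / n, pos[1] + speed * vy / n)
```

The abstract's observation that purely reactive schemes struggle under complex mobility constraints shows up even in this toy: a head-on obstacle can cancel or reverse the goal term instead of steering around it.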
- Date Issued
- 2006
- Identifier
- CFE0001494, ucf:47099
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001494
- Title
- MAC LAYER AND ROUTING PROTOCOLS FOR WIRELESS AD HOC NETWORKS WITH ASYMMETRIC LINKS AND PERFORMANCE EVALUATION STUDIES.
- Creator
-
Wang, Guoqiang, Marinescu, Dan, University of Central Florida
- Abstract / Description
-
In a heterogeneous mobile ad hoc network (MANET), assorted devices with different computation and communication capabilities co-exist. In this thesis, we consider the case when the nodes of a MANET have various degrees of mobility and range, and the communication links are asymmetric. Many routing protocols for ad hoc networks routinely assume that all communication links are symmetric, i.e., that if node A can hear node B, then node B can also hear node A. Most current MAC layer protocols are unable to exploit the asymmetric links present in a network, leading to inefficient overall bandwidth utilization or, in the worst case, to lack of connectivity. To exploit asymmetric links, the protocols must deal with the asymmetry of the path from a source node to a destination node, which affects the delivery of the original packets, the paths taken by acknowledgments, or both. Furthermore, the problem of hidden nodes requires a more careful analysis in the case of asymmetric links. MAC layer and routing protocols for ad hoc networks with asymmetric links require a rigorous performance analysis. Analytical models are usually unable to provide even approximate answers to questions about end-to-end delay, packet loss ratio, throughput, and so on. Traditional simulation techniques for large-scale wireless networks require vast amounts of storage and computing cycles rarely available on single computing systems. In our search for an effective solution to study the performance of wireless networks, we investigate time-parallel simulation. Time-parallel simulation has received significant attention in the past. Its advantages, as well as its theoretical and practical limitations, have been extensively researched for many applications in which the complexity of the models involved severely limits the applicability of analytical studies and makes traditional simulation techniques unfeasible.
Our goal is to study the behavior of large systems consisting of possibly thousands of nodes over extended periods of time and to obtain results efficiently; time-parallel simulation enables us to achieve this objective. We conclude that MAC layer and routing protocols capable of using asymmetric links are more complex than traditional ones but can improve connectivity and provide better performance. We are confident that the approximate results for various performance metrics of wireless networks obtained using time-parallel simulation are sufficiently accurate and provide the necessary insight into the inner workings of the protocols.
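The link asymmetry discussed in this abstract can be made concrete with a small sketch: model each node with a position and a transmission range, derive the directed link set, and list the pairs that are reachable in only one direction. The node names and parameters below are hypothetical, and this is a geometric illustration of the assumption being challenged, not code from the thesis:

```python
import math

def directed_links(nodes):
    """nodes: {name: ((x, y), tx_range)}. A directed link u -> v exists
    when v lies within u's transmission range."""
    links = set()
    for u, ((ux, uy), r) in nodes.items():
        for v, ((vx, vy), _) in nodes.items():
            if u != v and math.hypot(vx - ux, vy - uy) <= r:
                links.add((u, v))
    return links

def asymmetric_pairs(nodes):
    """Pairs connected in one direction only -- exactly the links that
    symmetry-assuming MAC and routing protocols fail to exploit."""
    links = directed_links(nodes)
    return {(u, v) for (u, v) in links if (v, u) not in links}
```

For example, a long-range node A and a short-range node B six units apart yield the one-way link A -> B: B hears A, but A never hears B's acknowledgments.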
- Date Issued
- 2007
- Identifier
- CFE0001736, ucf:47302
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001736
- Title
- SIGNAL PROCESSING OF AN ECG SIGNALIN THE PRESENCE OF A STRONG STATIC MAGNETIC FIELD.
- Creator
-
Gupta, Aditya, Weeks, Arthur, University of Central Florida
- Abstract / Description
-
This dissertation addresses the problem of elevation of the T wave of an electrocardiogram (ECG) signal during magnetic resonance imaging (MRI). In the MRI, the interaction of the blood flow with the strong static magnetic field induces a voltage in the body. This voltage appears as a superimposition at the locus of the T wave of the ECG signal, obscuring important information that doctors need to interpret the ST segment of the ECG and detect diseases such as myocardial infarction. This dissertation aims at finding a solution to the problem of T wave elevation of an ECG signal in the MRI. The first step is to simulate the entire situation and obtain the magnetic-field-dependent T wave elevation. This is achieved by building a model of the aorta and simulating the blood flow in it. This model is then subjected to a static magnetic field, and the surface potential on the thorax is measured to observe the T wave elevation. The various parameters on which the T wave elevation depends are then analyzed. Different approaches are used to reduce the T wave elevation. The direct approach aims at computing the magnitude of the T wave elevation using magneto-hydro-dynamic equations. The indirect approach uses digital signal processing tools, such as the least mean square adaptive filter, to remove the T wave elevation and obtain an artifact-free ECG signal in the MRI. Excellent results are obtained from the simulation model, which reproduces the ECG signal in the MRI at all 12 leads of the ECG. These results are compared with ECG signals measured in the MRI. A simulation package with a graphical user interface is developed in MATLAB based on the simulation model, allowing the user to change the strength of the magnetic field, the radius of the aorta, and the orientation of the aorta with respect to the heart, and to observe the ECG signals with the elevation at the 12 leads of the ECG.
The artifacts introduced by the magnetic field can also be removed by the least mean square adaptive filter. The filter adapts the ECG signal in the MRI to the ECG signal of the patient outside the MRI. Before the adaptation, the heart rate of the ECG outside the MRI is matched to that of the ECG in the MRI by interpolation or decimation. The adaptive filter removes the T wave artifacts very effectively. When the cardiac output of the patient changes, the simulation model is used along with the adaptive filter to obtain the artifact-free ECG signal.
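A least mean square (LMS) adaptive filter of the kind described here can be sketched as follows. This is the generic textbook LMS update, not the dissertation's MATLAB implementation; following the abstract's arrangement, the rate-matched ECG recorded outside the MRI serves as the reference input, and the tap count and step size are assumptions:

```python
import numpy as np

def lms_filter(d, x, n_taps=8, mu=0.01):
    """LMS adaptive filter.
    d : desired signal (ECG recorded inside the MRI, with artifact)
    x : reference signal (rate-matched ECG recorded outside the MRI)
    Returns (y, e): filter output and error d - y at each sample."""
    w = np.zeros(n_taps)                 # adaptive tap weights
    y = np.zeros(len(d))
    e = np.zeros(len(d))
    for n in range(n_taps, len(d)):
        xn = x[n - n_taps:n][::-1]       # most recent n_taps reference samples
        y[n] = w @ xn                    # filter output
        e[n] = d[n] - y[n]               # estimation error
        w = w + 2 * mu * e[n] * xn       # steepest-descent weight update
    return y, e
```

In this arrangement the output y tracks the component of the in-MRI signal correlated with the clean reference ECG, so y serves as the artifact-free estimate and e isolates the magnetically induced residual.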
- Date Issued
- 2007
- Identifier
- CFE0001857, ucf:47389
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001857
- Title
- Thermodynamic Modeling and Transient Simulation of a Low-Pressure Heat Recovery Steam Generator Using Siemens T3000.
- Creator
-
Caesar, Andres, Das, Tuhin, Bhattacharya, Samik, Putnam, Shawn, University of Central Florida
- Abstract / Description
-
With world energy consumption rising and nonrenewable energy resources quickly depleting, it is essential to design more efficient power plants and thereby economically utilize fossil fuels. To that end, this work focuses on the thermodynamic modeling of steam power systems to enhance our understanding of their dynamic and transient behavior. This thesis discusses the physical phenomena behind a heat recovery steam generator (HRSG) and develops a mathematical description of its system dynamics. The model is developed from fundamentals of fluid dynamics, phase change, heat transfer, conservation laws, and unsteady-flow energy equations. The resulting model captures coupled physical phenomena with acceptable accuracy while achieving fast, and potentially real-time, simulations. The computational HRSG model is constructed in the Siemens T3000 platform. This work establishes the dynamic modeling capability of T3000, which has traditionally been used for programming control algorithms. The validation objective of this project is to accurately simulate the transient response of an operational steam power system. Validation of the T3000 model is carried out by comparing simulation results to start-up data from the low-pressure system of a Siemens power plant while maintaining the same inlet conditions. Simulation results correlate well with plant data in both transient behavior and equilibrium conditions. The comprehensive HRSG model will enable further research and aid in the advancement of steam power system technology. Future research areas include the extension to intermediate- and high-pressure system simulations, combined simulation of all three pressure stages, and continued improvement of the boiler model. In addition to enabling model-based prediction and providing further insight, this effort will also lead to controller design for improved performance.
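The unsteady-flow energy balance at the heart of such a model can be illustrated with a toy lumped-parameter element: a single control volume heated by exhaust gas while extracting steam enthalpy, integrated with explicit Euler. This is a didactic sketch under assumed parameter values, not the T3000 HRSG model:

```python
def simulate_element(T0, T_gas, UA, m_cp, q_steam, dt, steps):
    """Explicit-Euler integration of a lumped unsteady energy balance:
        m*cp * dT/dt = UA * (T_gas - T) - q_steam
    T0      : initial element temperature [K]
    T_gas   : exhaust-gas temperature [K]
    UA      : overall heat-transfer conductance [W/K]
    m_cp    : lumped thermal capacitance [J/K]
    q_steam : enthalpy flow removed by steam extraction [W]
    """
    T, history = T0, [T0]
    for _ in range(steps):
        dT_dt = (UA * (T_gas - T) - q_steam) / m_cp  # net heat in / capacitance
        T += dt * dT_dt
        history.append(T)
    return history
```

The transient settles at the steady state T = T_gas - q_steam/UA with time constant m_cp/UA, the same first-order approach to equilibrium that a start-up trace exhibits before the comparison against plant data becomes meaningful.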
- Date Issued
- 2018
- Identifier
- CFE0007562, ucf:52599
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007562
- Title
- An Agent Based Model to assess crew temporal variability during U.S. Navy shipboard operations.
- Creator
-
Muhs, Kevin, Karwowski, Waldemar, Elshennawy, Ahmad, Hancock, Peter, Sjoden, Glenn, University of Central Florida
- Abstract / Description
-
Understanding the factors that affect human performance variability, as well as their temporal impacts, is an essential element in fully integrating and designing complex, adaptive environments. This understanding is particularly necessary for high-stakes, time-critical routines such as those performed during nuclear reactor, air traffic control, and military operations. Over the last three decades, significant efforts have emerged to demonstrate and apply a host of techniques, including Discrete Event Simulation, Bayesian Belief Networks, Neural Networks, and a multitude of existing software applications, to provide relevant assessments of human task performance and temporal variability. The objective of this research was to design and develop a novel Agent Based Modeling and Simulation (ABMS) methodology to generate a timeline of work and assess the impacts of crew temporal variability during U.S. Navy Small Boat Defense operations in littoral waters. The developed ABMS methodology included human performance models for six crew members (agents) as well as a threat craft, and incorporated varying levels of crew capability and task support. AnyLogic ABMS software was used to simultaneously provide detailed measures of individual sailor performance and of system-level emergent behavior. The methodology and models were adapted and built to assure extensibility across a broad range of U.S. Navy shipboard operations. Application of the developed ABMS methodology effectively demonstrated a way to visualize and quantify the impacts and uncertainties of human temporal variability on both workload and crew effectiveness during U.S. Navy shipboard operations.
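The kind of time-stepped agent loop underlying such an ABMS model can be sketched minimally: crew agents draw tasks from a shared queue, per-task durations are scaled by agent capability and perturbed by random temporal variability, and the total completion time is an emergent crew-level output. All parameters and the ±20% variability band below are assumptions for illustration, not values from the study or its AnyLogic models:

```python
import random

def simulate_watch(n_tasks, n_crew, base_time, capability, seed=0):
    """Toy agent-based time-step loop. Each crew agent works one task at
    a time; duration = base_time / capability, with random variability.
    Returns the number of ticks until all tasks are complete."""
    rng = random.Random(seed)
    busy = [0] * n_crew          # remaining ticks on each agent's current task
    pending = n_tasks
    clock = 0
    while pending > 0 or any(busy):
        for i in range(n_crew):
            if busy[i] == 0 and pending > 0:
                pending -= 1
                # temporal variability: +/-20% noise around the scaled time
                busy[i] = max(1, round(base_time / capability[i]
                                       * rng.uniform(0.8, 1.2)))
        busy = [max(0, b - 1) for b in busy]
        clock += 1
    return clock
```

Repeating such runs across seeds and capability levels yields the distribution of completion times, which is one simple way to visualize the workload impact of crew temporal variability.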
- Date Issued
- 2018
- Identifier
- CFE0007592, ucf:52549
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007592