Current Search: Computer simulation
- Title
- Personal Computer Simulation Program for Step Motor Drive Systems.
- Creator
-
Koos, William M., Harden, Richard C., Engineering
- Abstract / Description
-
University of Central Florida College of Engineering Thesis; A system of equations modeling a class of step motors known as the permanent magnet rotor step motor is presented. The model is implemented on an APPLE personal computer in a version of BASIC. Measurements are then made on an existing motor and input to the program for validation. A special test fixture is utilized to take performance data on the motor to facilitate comparisons with the predictions of the program. The comparisons show the model is indeed valid for the design of step motor drive systems and emphasize the practical nature of using personal computers and simulations for design.
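For context, textbook dynamic models of a two-phase permanent-magnet step motor couple the phase electrical equations to the rotor mechanics. The generic sketch below, in standard notation, illustrates the kind of equation system the abstract describes; it is not the thesis's exact model:

```latex
% Generic two-phase PM step motor model (illustrative textbook form)
L \frac{di_a}{dt} = v_a - R\,i_a + K_m\,\omega \sin(N_r\theta)
L \frac{di_b}{dt} = v_b - R\,i_b - K_m\,\omega \cos(N_r\theta)
J \frac{d\omega}{dt} = -K_m i_a \sin(N_r\theta) + K_m i_b \cos(N_r\theta) - B\,\omega - T_L
\frac{d\theta}{dt} = \omega
```

Here R and L are the phase resistance and inductance, K_m the torque constant, N_r the number of rotor teeth, J and B the rotor inertia and viscous damping, and T_L the load torque; a system of this size integrates readily on a personal computer with a simple fixed-step method.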
- Date Issued
- 1982
- Identifier
- CFR0008163, ucf:53067
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0008163
- Title
- A COMMON COMPONENT-BASED SOFTWARE ARCHITECTURE FOR MILITARY AND COMMERCIAL PC-BASED VIRTUAL SIMULATION.
- Creator
-
Lewis, Joshua, Proctor, Michael, University of Central Florida
- Abstract / Description
-
Commercially available military-themed virtual simulations have been developed and sold for entertainment since the beginning of the personal computing era. There exists an intense interest by various branches of the military to leverage the technological advances of the personal computing and video game industries to provide low cost military training. By nature of the content of the commercial military-themed virtual simulations, a large overlap has grown between the interests, resources, standards, and technology of the computer entertainment industry and military training branches. This research attempts to identify these commonalities with the purpose of systematically designing and evaluating a common component-based software architecture that could be used to implement a framework for developing content for both commercial and military virtual simulation software applications.
- Date Issued
- 2006
- Identifier
- CFE0001177, ucf:46868
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001177
- Title
- USING COMPUTER SIMULATION MODELING TO EVALUATE THE BIOTERRORISM RESPONSE PLAN AT A LOCAL HOSPITAL FACILITY.
- Creator
-
Bebber, Robert, Liberman, Aaron, University of Central Florida
- Abstract / Description
-
The terrorist attacks of September 11th, 2001 and the subsequent anthrax mail attack have forced health care administrators and policy makers to place a new emphasis on disaster planning at hospital facilities--specifically bioterrorism planning. Yet how does one truly "prepare" for the unpredictable? In spite of accreditation requirements, which demand hospitals put into place preparations to deal with bioterrorism events, a recent study from the General Accounting Office (GAO) concluded that most hospitals are still not capable of dealing with such threats (Gonzalez, 2004). This dissertation uses computer simulation modeling to test the effectiveness of bioterrorism planning at a local hospital facility in Central Florida, Winter Park Memorial Hospital. It is limited to the response plan developed by the hospital's Emergency Department. It evaluates the plan's effectiveness in dealing with an inhalational anthrax attack. Using Arena computer simulation software, and grounded within the theoretical framework of Complexity Science, we were able to test the effectiveness of the response plan in relation to Emergency Department bed capacity. Our results indicated that the response plan's flexibility was able to accommodate an increased patient load due to an attack, including an influx of the "worried well." Topics of future work and study are proposed.
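The bed-capacity question at the heart of such a study can be illustrated in a few lines of code. A minimal sketch, assuming Poisson arrivals and exponential lengths of stay with purely hypothetical rates (Arena expresses the same logic graphically; nothing below is Winter Park data):

```python
import random

def ed_diversions(beds, arrivals_per_hour, mean_stay_hours, horizon_hours, seed=1):
    """Count patients who find no free ED bed during a surge.

    Each bed is tracked by the hour at which it next frees up; an arrival
    takes the earliest-free bed if one is available, else is diverted.
    All rates below are illustrative assumptions, not study data.
    """
    rng = random.Random(seed)
    free_at = [0.0] * beds          # hour at which each bed becomes free
    t, diverted = 0.0, 0
    while t < horizon_hours:
        t += rng.expovariate(arrivals_per_hour)        # next arrival
        stay = rng.expovariate(1.0 / mean_stay_hours)  # length of stay
        bed = min(range(beds), key=free_at.__getitem__)
        if free_at[bed] <= t:
            free_at[bed] = t + stay
        else:
            diverted += 1
    return diverted

# Hypothetical surge: attack casualties plus "worried well" over three days.
print(ed_diversions(beds=20, arrivals_per_hour=3.0,
                    mean_stay_hours=6.0, horizon_hours=72))
```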
- Date Issued
- 2007
- Identifier
- CFE0001712, ucf:47293
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001712
- Title
- VOICE TRACK COMPUTER BASED SIMULATION FOR MEDICAL TRAINING.
- Creator
-
Makwana, Alpesh, Kincaid, J. Peter, University of Central Florida
- Abstract / Description
-
This study tested the hypothesis that varying the delivery rate of audio-based text within web-based training increases the effectiveness of the learning process and improves retention when compared with a fixed audio-based text delivery rate. To answer this question, two groups of 20 participants and one group of 10 participants were tested using the Web-based Anatomy & Physiology course modules developed by Medsn, Inc. The control group received the static speed of 128 words per minute, while the first experimental group received the initial speed of 128 words per minute with the option to change the speed of the audio-based text. An additional experimental group received the initial speed of 148 words per minute, also with the option to vary the speed of the audio-based text. A single-variable Analysis of Variance (ANOVA) across the three groups was utilized to examine differences in speed of voice presentation. The results were significant, F(2, 47) = 4.67, p = 0.014, η² = 0.166. The mean for the control group was (M = 7.2, SD = 1.69), with the 128-wpm experimental group at (M = 8.4, SD = 1.31) and the 148-wpm experimental group at (M = 8.6, SD = 1.26).
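A short sketch of how an F statistic and eta-squared effect size of this kind are computed for three groups. The scores below are randomly generated placeholders that only mimic the reported group sizes, means, and standard deviations; they are NOT the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical placeholder scores shaped to the reported group sizes/means/SDs.
control = rng.normal(7.2, 1.69, size=20)   # fixed 128 wpm
adj_128 = rng.normal(8.4, 1.31, size=20)   # adjustable, starting at 128 wpm
adj_148 = rng.normal(8.6, 1.26, size=10)   # adjustable, starting at 148 wpm

f_stat, p_value = stats.f_oneway(control, adj_128, adj_148)

# Effect size: eta-squared = SS_between / SS_total for a one-way design.
groups = [control, adj_128, adj_148]
grand = np.concatenate(groups)
ss_between = sum(len(g) * (g.mean() - grand.mean()) ** 2 for g in groups)
ss_total = ((grand - grand.mean()) ** 2).sum()
print(f"F = {f_stat:.2f}, p = {p_value:.3f}, eta^2 = {ss_between / ss_total:.3f}")
```

With N = 50 participants in k = 3 groups, the degrees of freedom are (k - 1, N - k) = (2, 47), matching the reported F(2, 47).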
- Date Issued
- 2005
- Identifier
- CFE0000639, ucf:46533
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000639
- Title
- Conversion and Validation of SIGART Program, a Progressive Traffic Signal Lights System Computer Model.
- Creator
-
Troyan, Dennis F., McEwan, Stuart, Engineering
- Abstract / Description
-
Florida Technological University College of Engineering Thesis
- Date Issued
- 1972
- Identifier
- CFR0011995, ucf:53082
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0011995
- Title
- AR Physics: Transforming physics diagrammatic representations on paper into interactive simulations.
- Creator
-
Zhou, Yao, Underberg-Goode, Natalie, Lindgren, Robb, Moshell, Jack, Peters, Philip, University of Central Florida
- Abstract / Description
-
A problem representation is a cognitive structure created by the solver in correspondence to the problem. Sketching representative diagrams in the domain of physics encourages a problem-solving strategy that starts from 'envisionment', by which one internally simulates the physical events and predicts outcomes. Research studies also show that sketching representative diagrams improves learners' performance in solving physics problems. The pedagogic benefits of sketching representations on paper make this traditional learning strategy remain pivotal and worth preserving and integrating into the current digital learning landscape. In this paper, I describe AR Physics, an Augmented Reality based application that intends to facilitate one's learning of physics concepts about objects' linear motion. It affords the verified physics learning strategy of sketching representative diagrams on paper, and explores the capability of Augmented Reality in enhancing visual conceptions. The application converts the diagrams drawn on paper into virtual representations displayed on a tablet screen. As such, learners can create physics simulations based on the diagrams and test their "envisionment" for the diagrams. Users' interaction with AR Physics consists of three steps: 1) sketching a diagram on paper; 2) capturing the sketch with a tablet camera to generate a virtual duplication of the diagram on the tablet screen; and 3) placing a physics object and configuring relevant parameters through the application interface to construct a physics simulation. A user study about the efficiency and usability of AR Physics was performed with 12 college students. The students interacted with the application and completed three tasks relevant to the learning material. They were given eight questions afterwards to examine their post-learning outcome. The same questions were also given prior to the use of the application in order to compare with the post results. The System Usability Scale (SUS) was adopted to assess the application's usability, and interviews were conducted to collect subjects' opinions about Augmented Reality in general. The results of the study demonstrate that the application can effectively facilitate subjects' understanding of the target physics concepts. The overall satisfaction with the application's usability was disclosed by the SUS score. Finally, subjects expressed that they gained a clearer idea about Augmented Reality through the use of the application.
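For reference, SUS scores are computed with a fixed formula over ten 1-5 Likert items. A minimal sketch of the standard scoring (the sample responses are hypothetical, not the study's data):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Standard SUS scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum is scaled by
    2.5 to give a score from 0 to 100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based: even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical example: one participant's responses (not study data).
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```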
- Date Issued
- 2014
- Identifier
- CFE0005566, ucf:50292
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005566
- Title
- PHYSICALLY-BASED VISUALIZATION OF RESIDENTIAL BUILDING DAMAGE PROCESS IN HURRICANE.
- Creator
-
Liao, Dezhi, Kincaid, J. Peter, University of Central Florida
- Abstract / Description
-
This research provides realistic techniques to visualize the process of damage to residential buildings caused by hurricane-force winds. Three methods are implemented to make the visualization useful for educating the public about mitigation measures for their homes. First, the underlying physics uses Quick Collision Response Calculation. This is an iterative method, which can tune the accuracy and the performance to calculate collision response between building components. Secondly, the damage process is designed as a Time-scalable Process. By attaching a damage time tag to each building component, the visualization process is treated as a geometry animation, allowing users to navigate in the visualization. The detached building components move in response to the wind force, which is calculated using qualitative rather than quantitative techniques. The results are acceptable for instructional systems but not for engineering analysis. Quick Damage Prediction is achieved by using a database query instead of a Monte-Carlo simulation. The database is based on HAZUS® engineering analysis data, which gives it validity. A reasoning mechanism based on the definition of the overall building damage in HAZUS® is used to determine the damage state of selected building components, including roof cover, roof sheathing, walls, openings, and roof-wall connections. Exposure settings of environmental aspects of the simulated environment, such as ocean, trees, cloud, and rain, are integrated into a scene-graph based graphics engine. Based on the graphics engine and the physics engine, a procedural modeling method is used to efficiently render residential buildings. The resulting program, Hurricane!, is an instructional program for public education useful in schools and museum exhibits.
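The "damage time tag" idea can be made concrete: because each component stores the time at which it fails, scrubbing or rescaling playback is a comparison rather than a re-simulation. A small sketch under that reading (component names and times are illustrative, not from the thesis):

```python
from dataclasses import dataclass

@dataclass
class BuildingComponent:
    name: str
    damage_time: float  # simulation time (s) at which this component detaches

def attached_components(components, playback_time, time_scale=1.0):
    """Return components still attached at a user-scaled playback time.

    Playback at any speed, or scrubbing backwards, needs only a comparison
    against each component's damage-time tag.
    """
    t = playback_time * time_scale
    return [c for c in components if c.damage_time > t]

# Hypothetical usage: three tagged components, queried at t = 10 s.
house = [BuildingComponent("roof cover", 12.5),
         BuildingComponent("roof sheathing", 18.0),
         BuildingComponent("window", 9.2)]
print([c.name for c in attached_components(house, playback_time=10.0)])
```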
- Date Issued
- 2007
- Identifier
- CFE0001609, ucf:47190
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001609
- Title
- AN IMPROVED THERMOREGULATORY MODEL FOR COOLING GARMENT APPLICATIONS WITH TRANSIENT METABOLIC RATES.
- Creator
-
Westin, Johan, Kapat, Jayanta, University of Central Florida
- Abstract / Description
-
Current state-of-the-art thermoregulatory models do not predict body temperatures with the accuracies that are required for the development of automatic cooling control in liquid cooling garment (LCG) systems. Automatic cooling control would be beneficial in a variety of space, aviation, military, and industrial environments for optimizing cooling efficiency, for making LCGs as portable and practical as possible, for alleviating the individual from manual cooling control, and for improving thermal comfort and cognitive performance. In this study, we adopt the Fiala thermoregulatory model, which has previously demonstrated state-of-the-art predictive abilities in air environments, for use in LCG environments. We validate the numerical formulation with analytical solutions to the bioheat equation, and find our model to be accurate and stable with a variety of different grid configurations. We then compare the thermoregulatory model's tissue temperature predictions with experimental data where individuals, equipped with an LCG, exercise according to a 700 W rectangular type activity schedule. The root mean square (RMS) deviation between the model response and the mean experimental group response is 0.16°C for the rectal temperature and 0.70°C for the mean skin temperature, which is within state-of-the-art variations. However, with a mean absolute body heat storage error (e_BHS_mean) of 9.7 W·h, the model fails to satisfy the ±6.5 W·h accuracy that is required for the automatic LCG cooling control development. In order to improve model predictions, we modify the blood flow dynamics of the thermoregulatory model. Instead of using step responses to changing requirements, we introduce exponential responses to the muscle blood flow and the vasoconstriction command. We find that such modifications have an insignificant effect on temperature predictions. However, a new vasoconstriction dependency, i.e., the rate of change of hypothalamus temperature weighted by the hypothalamus error signal (ΔThy·dThy/dt), proves to be an important signal that governs the thermoregulatory response during conditions of simultaneously increasing core and decreasing skin temperatures, which is a common scenario in LCG environments. With the new ΔThy·dThy/dt dependency in the vasoconstriction command, the e_BHS_mean for the exercise period is reduced by 59% (from 12.9 W·h to 5.2 W·h). Even though the new e_BHS_mean of 5.8 W·h for the total activity schedule is within the target accuracy of ±6.5 W·h, e_BHS fails to stay within the target accuracy during the entire activity schedule. With additional improvements to the central blood pool formulation, the LCG boundary condition, and the agreement between model set-points and actual experimental initial conditions, it seems possible to achieve the strict accuracy that is needed for automatic cooling control development.
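The step-to-exponential modification can be expressed as a one-line first-order filter. A minimal sketch with illustrative values, not the model's actual time constants:

```python
import math

def exponential_response(current, target, dt, tau):
    """First-order exponential approach toward a (possibly changing) target.

    Instead of a step jump to `target`, the state relaxes toward it with
    time constant `tau`. Values used below are illustrative assumptions.
    """
    return target + (current - target) * math.exp(-dt / tau)

# Hypothetical usage: muscle blood flow ramping toward a new requirement.
flow, target = 0.5, 2.0          # arbitrary units
for step in range(5):
    flow = exponential_response(flow, target, dt=30.0, tau=60.0)
    print(f"t={30 * (step + 1):4.0f} s  flow={flow:.3f}")
```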
- Date Issued
- 2008
- Identifier
- CFE0002460, ucf:47707
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002460
- Title
- MATHEMATICAL MODELING OF SMALLPOX WITH OPTIMAL INTERVENTION POLICY.
- Creator
-
LAWOT, NIWAS, ROLLINS, DAVID, University of Central Florida
- Abstract / Description
-
In this work, two differential equation models for smallpox are numerically solved to find the optimal intervention policy. In each model we look for the range of values of the parameters that give rise to the worst-case scenarios. Since the scale of an epidemic is determined by the number of people infected, and eventually dead, as a result of infection, we attempt to quantify the scale of the epidemic and recommend the optimum intervention policy. In the first case study, we mimic a densely populated city with a comparatively large tourist population and a heavily used mass transportation system. A mathematical model for the transmission of smallpox is formulated and numerically solved. In the second case study, we incorporate five different stages of infection: (1) susceptible; (2) infected but asymptomatic, noninfectious, and vaccine-sensitive; (3) infected but asymptomatic, noninfectious, and vaccine-insensitive; (4) infected but asymptomatic, and infectious; and (5) symptomatic and isolated. An exponential probability distribution is used for modeling this case. We compare the outcomes of mass vaccination and trace vaccination on the final size of the epidemic.
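As an illustration of how such staged models are solved numerically, here is a five-compartment chain matching the stages listed above. This is a loudly hypothetical sketch: the linear transition rates, the mass-action infection term, and the omission of vaccination terms are assumptions for illustration, not the thesis's model:

```python
import numpy as np
from scipy.integrate import odeint

def smallpox_chain(y, t, beta, k1, k2, gamma):
    """Five-stage chain: S -> E1 (vaccine-sensitive) -> E2 (vaccine-
    insensitive) -> I (asymptomatic, infectious) -> Q (symptomatic,
    isolated). All rates are illustrative assumptions."""
    S, E1, E2, I, Q = y
    new_inf = beta * S * I / y.sum()       # mass-action infection (assumed)
    return [-new_inf,
            new_inf - k1 * E1,
            k1 * E1 - k2 * E2,
            k2 * E2 - gamma * I,
            gamma * I]

y0 = [1e6, 10, 0, 0, 0]                    # hypothetical initial population
t = np.linspace(0, 120, 241)               # days
sol = odeint(smallpox_chain, y0, t, args=(0.3, 1 / 3, 1 / 4, 1 / 5))
print(sol[-1].round(1))                    # compartment sizes at day 120
```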
- Date Issued
- 2006
- Identifier
- CFE0001193, ucf:46848
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001193
- Title
- TOWARD INCREASING PERFORMANCE AND EFFICIENCY IN GAS TURBINES FOR POWER GENERATION AND AERO-PROPULSION: UNSTEADY SIMULATION OF ANGLED DISCRETE-INJECTION COOLANT IN A HOT GAS PATH CROSSFLOW.
- Creator
-
Johnson, Perry, Kapat, Jayanta, University of Central Florida
- Abstract / Description
-
This thesis describes the numerical predictions of turbine film cooling interactions using Large Eddy Simulations. In most engineering industrial applications, the Reynolds-Averaged Navier-Stokes equations, usually paired with two-equation models such as k-ε or k-ω, are utilized as an inexpensive method for modeling complex turbulent flows. By resolving the larger, more influential scales of turbulent eddies, the Large Eddy Simulation has been shown to yield a significant increase in accuracy over traditional two-equation RANS models for many engineering flows. In addition, Large Eddy Simulations provide insight into the unsteady characteristics and coherent vortex structures of turbulent flows. Discrete-hole film cooling is a jet-in-crossflow phenomenon, which is known to produce complex turbulent interactions and vortex structures. For this reason, the present study investigates the influence of these jet-crossflow interactions in a time-resolved unsteady simulation. Given the broad spectrum of length scales present in moderate and high Reynolds number flows, such as the present case, Direct Numerical Simulation was ruled out because of its high computational cost.
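For context, a widely used LES subgrid-scale closure is the Smagorinsky model, which represents the effect of the unresolved eddies through an eddy viscosity built from the resolved strain rate (shown as background; the abstract does not state which closure the thesis uses):

```latex
% Smagorinsky subgrid-scale model (standard form, shown for context)
\nu_t = (C_s \Delta)^2 \,\lvert \bar{S} \rvert, \qquad
\lvert \bar{S} \rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}, \qquad
\bar{S}_{ij} = \frac{1}{2}\left(\frac{\partial \bar{u}_i}{\partial x_j}
             + \frac{\partial \bar{u}_j}{\partial x_i}\right)
```

Here the overbar denotes the filtered (resolved) field, Δ the filter width, and C_s the Smagorinsky constant; everything smaller than Δ is modeled rather than resolved, which is what separates LES cost from DNS cost.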
- Date Issued
- 2011
- Identifier
- CFH0004086, ucf:44798
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004086
- Title
- Modeling Transport and Protein Adsorption in Microfluidic Systems.
- Creator
-
Finch, Craig, Hickman, James, Kincaid, John, Lin, Kuo-Chi, Behal, Aman, Cho, Hyoung, University of Central Florida
- Abstract / Description
-
This work describes theoretical advances in the modeling and simulation of microfluidic systems and demonstrates the practical application of those techniques. A new multi-scale model of the adsorption of hard spheres was formulated to bridge the gap between simulations of discrete particles and continuum fluid dynamics. A whispering gallery mode (WGM) biosensor was constructed and used to measure the kinetics of adsorption for two types of proteins on four different surfaces. Computational fluid dynamics was used to analyze the transport of proteins in the flow cell of the biosensor. Kinetic models of protein adsorption that take transport limitations into account were fitted to the experimental data and used to draw conclusions about the mechanisms of adsorption. Transport simulations were then applied to the practical problem of optimizing the design of a microfluidic bioreactor to enable "plugs" of fluid to flow from one chamber to the next with minimal dispersion. Experiments were used to validate the transport simulations. The combination of quantitative modeling and simulation and experiments led to results that could not have been achieved using either approach by itself. Simulation tools that accurately predict transport and protein adsorption will enable the rational design of microfluidic devices for biomedical applications.
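A common baseline form for such transport-aware kinetic models is Langmuir-type adsorption, in which transport enters through the near-surface concentration (shown for context; the models actually fitted in this work may differ):

```latex
% Langmuir-type adsorption kinetics (generic baseline form)
\frac{d\theta}{dt} = k_a\,C_s\,(1 - \theta) - k_d\,\theta
```

Here θ is the fractional surface coverage, C_s the protein concentration at the surface (set by convection and diffusion in the flow cell), and k_a, k_d the adsorption and desorption rate constants; transport limitation appears when C_s falls below the bulk concentration.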
- Date Issued
- 2011
- Identifier
- CFE0004474, ucf:49313
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004474
- Title
- Modeling of flow generated sound in a constricted duct at low Mach number.
- Creator
-
Thibbotuwawa Gamage, Peshala, Mansy, Hansen, Kassab, Alain, Bhattacharya, Samik, University of Central Florida
- Abstract / Description
-
Modelling flow and acoustics in a constricted duct at low Mach numbers is important for investigating many physiological phenomena such as phonation, generation of arterial murmurs, and pulmonary conditions involving airway obstruction. The objective of this study is to validate computational fluid dynamics (CFD) and computational aero-acoustics (CAA) simulations in a constricted tube at low Mach numbers. Different turbulence models were employed to simulate the flow field. Models included Reynolds-Averaged Navier-Stokes (RANS), Detached Eddy Simulation (DES), and Large Eddy Simulation (LES). The models were validated by comparing study results with laser Doppler anemometry (LDA) velocity measurements. The comparison showed that experimental data agreed best with the LES model results. Although the RANS Reynolds stress transport (RST) model showed good agreement with mean velocity measurements, it was unable to capture velocity fluctuations. The RANS shear stress transport (SST) k-ω model and DES models were unable to predict the location of the high-fluctuation flow region accurately. CAA simulation was performed in parallel with LES using an Acoustic Perturbation Equation (APE) based hybrid CAA method. CAA simulation results agreed well with measured wall sound pressure spectra. The APE acoustic sources were found in the jet core breakdown region downstream of the constriction, which was also characterized by high flow fluctuations. Proper Orthogonal Decomposition (POD) was used to study the coherent flow structures at the different frequencies corresponding to the peaks of the measured sound pressure spectra. The study results will help enhance our understanding of sound generation mechanisms in constricted tubes, including biomedical applications.
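POD itself reduces to a singular value decomposition of the mean-subtracted snapshot matrix. A generic sketch (the thesis's exact pre-processing and weighting may differ):

```python
import numpy as np

def pod_modes(snapshots, n_modes):
    """Proper Orthogonal Decomposition of a snapshot matrix via SVD.

    `snapshots` has one column per time snapshot of the (flattened) field.
    Returns the leading spatial modes and each mode's energy fraction.
    """
    mean = snapshots.mean(axis=1, keepdims=True)
    fluct = snapshots - mean                       # fluctuating field
    U, s, _ = np.linalg.svd(fluct, full_matrices=False)
    energy = s**2 / np.sum(s**2)                   # energy per mode
    return U[:, :n_modes], energy[:n_modes]

# Hypothetical usage: random data standing in for velocity snapshots.
X = np.random.default_rng(1).standard_normal((500, 64))  # 500 pts, 64 snaps
modes, energy = pod_modes(X, n_modes=4)
print(energy.round(3))
```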
- Date Issued
- 2017
- Identifier
- CFE0006920, ucf:51696
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006920
- Title
- TRAFFIC CONFLICT ANALYSIS UNDER FOG CONDITIONS USING COMPUTER SIMULATION.
- Creator
-
Zhang, Binya, Radwan, Essam, Abdel-Aty, Mohamed, Abou-Senna, Hatem, University of Central Florida
- Abstract / Description
-
The weather condition is a crucial influence factor on road safety. Fog is one of the most noticeable weather conditions and has a significant impact on traffic safety. Such a condition reduces the road's visibility and consequently can affect drivers' vision, perception, and judgment. Statistical data show that many crashes are directly or indirectly caused by low-visibility weather conditions. Hence, it is necessary for road traffic engineers to study the relationship between road traffic accidents and their influencing factors. Among these factors, the traffic volume and the speed limits in poor-visibility areas are the primary factors that affect the types and locations of road accidents. In this thesis, microscopic traffic simulation, through the use of VISSIM software, was used to study the road safety issue and its influencing factors due to limited visibility. A basic simulation model was built based on previously collected field data to simulate Interstate 4 (I-4)'s environment, geometry characteristics, and basic traffic volume composition. On the foundation of the basic simulation model, an experimental model was built to study the conflict types and locations under several different scenarios. The entire 4-mile study area on I-4 was divided into 3 segments: section 1 with clear visibility, a fog area of low visibility, and section 2 with clear visibility. Speed limits in the fog area lower than those in the no-fog areas were set to investigate the influence of different speed limits on the two main types of traffic conflicts: lane-change conflicts and rear-end conflicts. The experimental model generated several groups of traffic trajectory data files. The vehicle conflict data were stored in these trajectory data files, which contain the conflict locations' coordinates, conflict time, time-to-conflict, and post-encroachment time, among other measures. The Surrogate Safety Assessment Model (SSAM), developed by the Federal Highway Administration, was applied to analyze these conflict data. From the analysis results, it was found that the traffic volume is an important factor with a large effect on the number of conflicts. The number of lane-change and rear-end conflicts increases along with traffic volume growth. Another finding is that the difference between the speed limits in the fog area and in the no-fog areas is another significant factor that impacts conflict frequency. A larger difference between the speed limits of two neighboring road sections leads to more accidents, due to inadequate time for drivers to brake. Compared with the scenarios that used reduced speed limits in the low-visibility zone, the condition without a reduced speed limit produced a higher number of conflicts, which indicates that it is necessary to post a lower speed limit in the fog zone where visibility is reduced. The results of this research provide a useful reference for studying the relationship between road traffic conflicts and the impacts of different speed limits under fog conditions. Overall, the findings suggest follow-up studies to further investigate possible relationships between conflicts as observed in simulation models and reported crashes in fog areas.
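The surrogate measures named above are simple kinematic quantities. For instance, a rear-end time-to-collision can be computed as gap over closing speed; the 1.5 s screening threshold below is a common illustrative value, not necessarily the one used in this thesis:

```python
def time_to_collision(gap_m, v_follow_ms, v_lead_ms):
    """Rear-end time-to-collision (TTC) surrogate for conflict screening.

    Returns seconds until impact if the follower is closing on the leader,
    else None (no rear-end conflict possible).
    """
    closing = v_follow_ms - v_lead_ms
    if closing <= 0:
        return None                 # not closing: no rear-end conflict
    return gap_m / closing

ttc = time_to_collision(gap_m=12.0, v_follow_ms=25.0, v_lead_ms=15.0)
print(ttc, "conflict" if ttc is not None and ttc < 1.5 else "no conflict")
```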
- Date Issued
- 2015
- Identifier
- CFE0005747, ucf:50104
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005747
- Title
- A unified approach to dynamic modeling of high switching frequency PWM converters.
- Creator
-
Iannello, Christopher J., Batarseh, Issa, Engineering
- Abstract / Description
-
University of Central Florida College of Engineering Thesis; This dissertation will present the development of a unified approach for dynamic modeling of PWM and soft-switching power converters. Dynamic modeling of non-linear power converters is very important for the design and stability of their closed-loop control. While the use of equivalent circuits is often preferred due to simulation efficiency issues, no unified and widely applicable method for the formulation of these equivalents exists.
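For background, the classical starting point for dynamic models of PWM converters is state-space averaging, which blends the converter's two switched linear networks weighted by the duty cycle d (a standard textbook method, shown for context rather than as the dissertation's new approach):

```latex
% State-space averaged model of a two-state PWM converter (textbook form)
\dot{x} = \left[d\,A_1 + (1-d)\,A_2\right]x + \left[d\,B_1 + (1-d)\,B_2\right]u
```

Here (A_1, B_1) and (A_2, B_2) describe the circuit with the switch on and off, respectively; small-signal transfer functions for control design follow by perturbing d about its operating point.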
- Date Issued
- 2001
- Identifier
- CFR0000833, ucf:52929
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0000833
- Title
- Information Propagation Algorithms for Consensus Formation in Decentralized Multi-Agent Systems.
- Creator
-
Hollander, Christopher, Wu, Annie, Shumaker, Randall, Wiegand, Rudolf, Turgut, Damla, Song, Zixia, University of Central Florida
- Abstract / Description
-
Consensus occurs within a multi-agent system when every agent is in agreement about the value of some particular state. For example, the color of an LED, the position or magnitude of a vector, a rendezvous location, the most recent state of data within a database, or the identity of a leader are all states that agents might need to agree on in order to execute their tasking. The task of the decentralized consensus problem for multi-agent systems is to design an algorithm that enables agents to communicate and exchange information such that, in finite time, agents are able to form a consensus without the use of a centralized control mechanism. The primary goal of this research is to introduce and provide supporting evidence for Stochastic Local Observation/Gossip (SLOG) algorithms as a new class of solutions to the decentralized consensus problem for multi-agent systems that lack a centralized controller, with the additional constraints that agents act asynchronously, information is discrete, and all consensus options are equally preferable to all agents. Examples of where these constraints might apply include the spread of social norms and conventions in artificial populations, rendezvous among a set of specific locations, and task assignment. This goal is achieved through a combination of theory and experimentation. An information propagation process and an information propagation algorithm are derived by unifying the general structure of multiple existing solutions to the decentralized consensus problem. They are then used to define two classes of algorithms that spread information across a network and solve the decentralized consensus problem: buffered gossip algorithms and local observation algorithms. Buffered gossip algorithms generalize the behavior of many push-based solutions to the decentralized consensus problem. Local observation algorithms generalize the behavior of many pull-based solutions to the decentralized consensus problem. In the language of object-oriented design, buffered gossip algorithms and local observation algorithms are abstract classes; information propagation processes are interfaces. SLOG algorithms combine the transmission mechanisms of buffered gossip algorithms and local observation algorithms into a single "hybrid" algorithm that is able to push and pull information within the local neighborhood. A common mathematical framework is constructed and used to determine the conditions under which each of these algorithms is guaranteed to produce a consensus, and thus solve the decentralized consensus problem. Finally, a series of simulation experiments is conducted to study the performance of SLOG algorithms. These experiments compare the average speed of consensus formation between buffered gossip algorithms, local observation algorithms, and SLOG algorithms over four distinct network topologies. Beyond the introduction of the SLOG algorithm, this research also contributes to the existing literature on the decentralized consensus problem by: specifying a theoretical framework that can be used to explore the consensus behavior of push-based and pull-based information propagation algorithms; using this framework to define buffered gossip algorithms and local observation algorithms as generalizations of existing solutions to the decentralized consensus problem; highlighting the similarities between consensus algorithms within control theory and opinion dynamics within computational sociology, and showing how these research areas can be successfully combined to create new and powerful algorithms; and providing an empirical comparison between multiple information propagation algorithms.
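The push-pull idea can be sketched in a few lines. This is a hedged illustration of the mechanism described above, not the dissertation's exact algorithm: each asynchronous step either pushes the agent's value into a neighbor's buffer (gossip) or pulls by observing a neighbor (local observation), after which the agent adopts the most common value it has seen:

```python
import random
from collections import Counter

class Agent:
    """Illustrative SLOG-style agent (not the dissertation's algorithm)."""

    def __init__(self, value):
        self.value = value
        self.buffer = []                       # values pushed by neighbors

    def step(self, neighbors):
        peer = random.choice(neighbors)
        if random.random() < 0.5:
            peer.buffer.append(self.value)     # push: gossip own value
        else:
            self.buffer.append(peer.value)     # pull: observe a neighbor
        # Adopt the most common value among own value and buffered values.
        self.value = Counter(self.buffer + [self.value]).most_common(1)[0][0]

# Hypothetical usage: a ring of 10 agents choosing between two options.
random.seed(7)
agents = [Agent(random.choice("AB")) for _ in range(10)]
for _ in range(300):
    i = random.randrange(10)
    agents[i].step([agents[(i - 1) % 10], agents[(i + 1) % 10]])
print({a.value for a in agents})   # typically collapses to one value
```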
- Date Issued
- 2015
- Identifier
- CFE0005629, ucf:50229
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005629
- Title
- Simulation, Analysis, and Optimization of Heterogeneous CPU-GPU Systems.
- Creator
-
Giles, Christopher, Heinrich, Mark, Ewetz, Rickard, Lin, Mingjie, Pattanaik, Sumanta, Flitsiyan, Elena, University of Central Florida
- Abstract / Description
-
With the computing industry's recent adoption of the Heterogeneous System Architecture (HSA) standard, we have seen a rapid change in heterogeneous CPU-GPU processor designs. State-of-the-art heterogeneous CPU-GPU processors tightly integrate multicore CPUs and multi-compute unit GPUs together on a single die. This brings the MIMD processing capabilities of the CPU and the SIMD processing capabilities of the GPU together into a single cohesive package with new HSA features comprising better programmability, coherency between the CPU and GPU, shared Last Level Cache (LLC), and shared virtual memory address spaces. These advancements can potentially bring marked gains in heterogeneous processor performance and have piqued the interest of researchers who wish to unlock these potential performance gains. Therefore, in this dissertation I explore the heterogeneous CPU-GPU processor and application design space with the goal of answering interesting research questions, such as: (1) what are the architectural design trade-offs in heterogeneous CPU-GPU processors and (2) how do we best maximize heterogeneous CPU-GPU application performance on a given system. To enable my exploration of the heterogeneous CPU-GPU design space, I introduce a novel discrete event-driven simulation library called KnightSim and a novel computer architectural simulator called M2S-CGM. M2S-CGM includes all of the simulation elements necessary to simulate coherent execution between a CPU and GPU with shared LLC and shared virtual memory address spaces. I then utilize M2S-CGM for the conduct of three architectural studies. First, I study the architectural effects of shared LLC and CPU-GPU coherence on the overall performance of non-collaborative GPU-only applications. Second, I profile and analyze a set of collaborative CPU-GPU applications to determine how to best optimize them for maximum collaborative performance. Third, I study the impact of varying four key architectural parameters on collaborative CPU-GPU performance by varying GPU compute unit coalesce size, GPU to memory controller bandwidth, GPU frequency, and system-wide switching fabric latency.
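For context, the core of any discrete event-driven simulation library is a time-ordered event queue. A minimal sketch of that pattern follows; KnightSim's actual API is not documented in this abstract, so all names here are illustrative assumptions:

```python
import heapq

class EventSimulator:
    """Minimal discrete event-driven simulation core (generic pattern)."""

    def __init__(self):
        self.clock = 0.0
        self._queue = []     # heap of (time, seq, callback)
        self._seq = 0        # tie-breaker so callbacks are never compared

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.clock + delay, self._seq, callback))
        self._seq += 1

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.clock, _, callback = heapq.heappop(self._queue)
            callback(self)

# Hypothetical usage: a component "executes" and reschedules itself.
sim = EventSimulator()
def tick(s, n=[0]):          # mutable default used as a simple call counter
    n[0] += 1
    print(f"t={s.clock:.1f} event {n[0]}")
    if n[0] < 3:
        s.schedule(2.5, tick)
sim.schedule(1.0, tick)
sim.run(until=10.0)
```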
- Date Issued
- 2019
- Identifier
- CFE0007807, ucf:52346
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007807
- Title
- EFFECT OF OPERATOR CONTROL CONFIGURATION ON UNMANNED AERIAL SYSTEM TRAINABILITY.
- Creator
-
Neumann, John, Kincaid, Peter, University of Central Florida
- Abstract / Description
-
Unmanned aerial systems (UAS) carry no pilot on board, yet they still require live operators to handle critical functions such as mission planning and execution. Humans also interpret the sensor information provided by these platforms. This applies to all classes of unmanned aerial vehicles (UAVs), including the smaller portable systems used for gathering real-time reconnaissance during military operations in urban terrain. The need to quickly and reliably train soldiers to control small UAS operations demands that the human-system interface be intuitive and easy to master. In this study, participants completed a series of tests of spatial ability and were then trained (in simulation) to teleoperate a micro-unmanned aerial vehicle equipped with forward and downward fixed cameras. Three aspects of the human-system interface were manipulated to assess the effects on manual control mastery and target detection. One factor was the input device. Participants used either a mouse or a specially programmed game controller (similar to that used with the Sony PlayStation 2 video game console). A second factor was the nature of the flight control displays, as either continuous or discrete (analog v. digital). The third factor involved the presentation of sensor imagery. The display could either provide streaming video from one camera at a time, or present the imagery from both cameras simultaneously in separate windows. The primary dependent variables included: 1) time to complete assigned missions, 2) number of collisions, 3) number of targets detected, and 4) operator workload. In general, operator performance was better with the game controller than with the mouse, but significant improvement in time to complete occurred over repeated trials regardless of the device used. Time to complete missions was significantly faster with the game controller, and operators also detected more targets without any significant differences in workload compared to mouse users. Workload on repeated trials decreased with practice, and spatial ability was a significant covariate of workload. Lower spatial ability was associated with higher workload scores. In addition, demographic data including computer usage and video gaming experience were collected and analyzed, and correlated with performance. Higher video gaming experience was also associated with lower workload.
- Date Issued
- 2006
- Identifier
- CFE0001496, ucf:47080
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001496
- Title
- RESOURCE-CONSTRAINT AND SCALABLE DATA DISTRIBUTION MANAGEMENT FOR HIGH LEVEL ARCHITECTURE.
- Creator
-
Gupta, Pankaj, Guha, Ratan, University of Central Florida
- Abstract / Description
-
In this dissertation, we present an efficient algorithm, called the P-Pruning algorithm, for the data distribution management problem in High Level Architecture. High Level Architecture (HLA) presents a framework for modeling and simulation within the Department of Defense (DoD) and forms the basis of the IEEE 1516 standard. The goal of this architecture is to interoperate multiple simulations and facilitate the reuse of simulation components. Data Distribution Management (DDM) is one of the six components in HLA that is responsible for limiting and controlling the data exchanged in a simulation and reducing the processing requirements of federates. DDM is also an important problem in the parallel and distributed computing domain, especially in large-scale distributed modeling and simulation applications, where control of data exchange among the simulated entities is required. We present a performance-evaluation simulation study of the P-Pruning algorithm against three techniques: region-matching, fixed-grid, and dynamic-grid DDM algorithms. The P-Pruning algorithm is faster than the region-matching, fixed-grid, and dynamic-grid DDM algorithms, as it avoids the quadratic computation step involved in the other algorithms. The simulation results show that the P-Pruning DDM algorithm uses memory at run-time more efficiently and requires fewer multicast groups than the three other algorithms. To increase the scalability of the P-Pruning algorithm, we develop a resource-efficient enhancement for it. We also present a performance evaluation study of this resource-efficient algorithm in a memory-constrained environment. The Memory-Constraint P-Pruning algorithm deploys I/O-efficient data structures for optimized memory access at run-time. The simulation results show that the Memory-Constraint P-Pruning DDM algorithm is faster than the P-Pruning algorithm and utilizes memory at run-time more efficiently. It is suitable for high-performance distributed simulation applications, as it improves the scalability of the P-Pruning algorithm by several orders of magnitude in terms of the number of federates. We analyze the computational complexity of the P-Pruning algorithm using average-case analysis. We have also extended the P-Pruning algorithm to a three-dimensional routing space. In addition, we present the P-Pruning algorithm for dynamic conditions, where the distribution of federates changes at run-time. The dynamic P-Pruning algorithm investigates the changes among federates' regions and rebuilds all the affected multicast groups. We have also integrated the P-Pruning algorithm with FDK, an implementation of the HLA architecture. The integration involves the design and implementation of the communicator module for mapping federate interest regions. We provide a modular overview of the P-Pruning algorithm components and describe the functional flow for creating multicast groups during simulation. We investigate the deficiencies in the DDM implementation under FDK and suggest an approach to overcome them using the P-Pruning algorithm. We have enhanced FDK from its existing HLA 1.3 specification by using the IEEE 1516 standard for the DDM implementation. We provide the system setup instructions and communication routines for running the integrated system on a network of machines. We also describe implementation details involved in the integration of the P-Pruning algorithm with FDK and provide results of our experiences.
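To make the quadratic-step point concrete: matching update and subscription regions by comparing every pair costs O(n²), whereas sorting interval endpoints along one routing-space dimension and sweeping them avoids the all-pairs comparison. The sketch below illustrates that general endpoint-sorting idea only; it is not the P-Pruning algorithm itself:

```python
def overlaps_1d(subscriptions, updates):
    """Match update and subscription intervals on one dimension by sweeping
    sorted endpoints instead of comparing every pair (illustrative only).
    """
    OPEN, CLOSE = 0, 1
    events = []
    for i, (lo, hi) in enumerate(subscriptions):
        events.append((lo, OPEN, "S", i))
        events.append((hi, CLOSE, "S", i))
    for i, (lo, hi) in enumerate(updates):
        events.append((lo, OPEN, "U", i))
        events.append((hi, CLOSE, "U", i))
    events.sort()                       # O(n log n) instead of O(n^2) scans
    active_s, active_u, matches = set(), set(), set()
    for _pos, kind, side, i in events:
        if kind == OPEN:
            if side == "S":
                active_s.add(i)
                matches.update((u, i) for u in active_u)
            else:
                active_u.add(i)
                matches.update((i, s) for s in active_s)
        else:
            (active_s if side == "S" else active_u).discard(i)
    return matches                      # set of (update, subscription) pairs

print(overlaps_1d(subscriptions=[(0, 5), (4, 9)], updates=[(3, 6)]))
```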
- Date Issued
- 2007
- Identifier
- CFE0001949, ucf:47447
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001949
- Title
- SIMULATION FOR COMMERCIAL DRIVER LICENSE THIRD PARTY TESTER TESTING.
- Creator
-
Truong, Henry, Lin, Kurt, University of Central Florida
- Abstract / Description
-
The advance of technology is thought to help ease the myriad tasks that are usually involved in operating equipment. Training and testing in modern times have been replaced with simulation technologies that mimic actual live operations and testing. Many successful stories of flight simulation come from military fighter aircraft and commercial pilot programs. The possibilities of safety in saving lives, economic incentive in reducing operational cost, and reducing the carbon footprint make simulation worth looking into. These considerations quickly boosted the transfer from live training operations to virtual training and simulation, as was readily adopted in the history of flight training and testing. Although there has been a lack of application, the benefits of computer-based simulation as a modeling and simulation (M&S) tool can be applied to the commercial driver license (CDL) program for the trucking industry. Nevertheless, it is an uphill battle to convince CDL administrators to integrate modern technology into the CDL program instead of continuing the traditional daily business of manual testing. This is because the cost of trucking industry live operations is still relatively affordable; individuals and companies are reluctant to adopt the use of a modeling and simulation driving or testing system. Fortunately, cost is not the only variable for training and testing administrators and their management to consider. There is a need to expand the use of technology to support live operations. The safety of the student, trainer, and tester should be taken into account. The availability of training or testing scenarios is also an influencing factor. Ultimately, the most important factor is driving safety on American roads. The relationship of accidents with driver license fraud has led the Federal Department of Transportation to want to reduce fraud in third-party Commercial Driver License (CDL) administration. Although it is not a perfect solution that can fix everything, the utilization of simulation technologies for driving assessment could help reduce fraud if applied correctly. The Department of Transportation (DOT) granted the states independent authority to administer the local CDL, including the use of the Third-Party Tester (TPT). As a result, some criminal activities prompted a Federal investigation to recommend changes and to fund the states to take action to stay in compliance with Federal regulation. This is the opportunity for state CDL administrators to explore the use of M&S to support their mission. Recall that the arguments for the use of M&S are safety in saving lives, economic incentive in reducing operational cost, and reducing the carbon footprint; this makes simulation a viable resource. This paper reports on a research study of using computer-based modeling and simulation tools to replace or augment the current state examiner as a means of assessing CDL TPT proficiency in basic backing skills. The pilot study of this system has several aspects to address. The scenarios must be relevant to test the knowledge of the TPT by using scenarios closely comparable to the current manual testing method. The scenario-based simulation should incorporate randomness to provide a greater sense of reality. In addition, the reconfigurable built-in random-behavior scenarios give the administrator greater control of behaviors and allow the administrator to select among the random scenarios. Finally, the paper presents the data sampled from relevant CDL TPT participants and the methodology applied. The analysis of the data presented in this research study will be valuable for State and Federal CDL administrators weighing the pros and cons of adding computer-based simulation to their current testing methodology.
- Date Issued
- 2010
- Identifier
- CFE0003222, ucf:48577
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003222
- Title
- Network Partitioning in Distributed Agent-Based Models.
- Creator
-
Petkova, Antoniya, Deo, Narsingh, Hughes, Charles, Bassiouni, Mostafa, Shaykhian, Gholam, University of Central Florida
- Abstract / Description
-
Agent-Based Models (ABMs) are an emerging simulation paradigm for modeling complex systems, comprised of autonomous, possibly heterogeneous, interacting agents. The utility of ABMs lies in their ability to represent such complex systems as self-organizing networks of agents. Modeling and understanding the behavior of complex systems usually occurs at large and representative scales, and obtaining and visualizing simulation results in real time is often critical. The real-time requirement necessitates the use of in-memory computing, as it is difficult and challenging to handle the latency and unpredictability of disk accesses. Combining this observation with the scale requirement emphasizes the need to use parallel and distributed computing platforms, such as MPI-enabled CPU clusters. Consequently, the agent population must be "partitioned" across different CPUs in a cluster. Further, the typically high volume of interactions among agents can quickly become a significant bottleneck for real-time or large-scale simulations. The problem is exacerbated if the underlying ABM network is dynamic and the inter-process communication evolves over the course of the simulation. Therefore, it is critical to develop topology-aware partitioning mechanisms to support such large simulations. In this dissertation, we demonstrate that distributed agent-based model simulations benefit from the use of graph partitioning algorithms that involve a local, neighborhood-based perspective. Such methods do not rely on global accesses to the network and thus are more scalable. In addition, we propose two partitioning schemes that consider the bottom-up, individual-centric nature of agent-based modeling. The first technique utilizes label-propagation community detection to partition the dynamic agent network of an ABM. We propose a latency-hiding, seamless integration of community detection into the dynamics of a distributed ABM. To achieve this integration, we exploit the similarity in the process flow patterns of a label-propagation community-detection algorithm and self-organizing ABMs. In the second partitioning scheme, we apply a combination of the Guided Local Search (GLS) and Fast Local Search (FLS) metaheuristics in the context of graph partitioning. The main driving principle of GLS is the dynamic modification of the objective function to escape local optima. The algorithm augments the objective of a local search, thereby transforming the landscape structure and escaping a local optimum. FLS is a local search heuristic algorithm that is aimed at reducing the search space of the main search algorithm. It breaks down the space into sub-neighborhoods such that inactive sub-neighborhoods are removed from the search process. The combination of GLS and FLS allowed us to design a graph partitioning algorithm that is both scalable and sensitive to the inherent modularity of real-world networks.
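The first technique builds on a classic algorithm that is easy to state. A generic label-propagation sketch follows (the dissertation's latency-hiding, ABM-integrated variant is more involved):

```python
import random
from collections import Counter

def label_propagation(adjacency, max_sweeps=100):
    """Label-propagation community detection over an adjacency dict.

    Each node starts in its own community and repeatedly adopts the label
    most common among its neighbors, using only local information: the
    neighborhood-based perspective argued for above.
    """
    labels = {v: v for v in adjacency}
    nodes = list(adjacency)
    for _ in range(max_sweeps):
        random.shuffle(nodes)
        changed = False
        for v in nodes:
            if not adjacency[v]:
                continue                     # isolated node keeps its label
            counts = Counter(labels[u] for u in adjacency[v])
            best = max(counts.values())
            choice = random.choice([l for l, c in counts.items() if c == best])
            if choice != labels[v]:
                labels[v] = choice
                changed = True
        if not changed:
            break                            # converged: no label moved
    return labels

# Hypothetical usage: two triangles joined by one edge -> two communities.
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(label_propagation(g))
```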
- Date Issued
- 2017
- Identifier
- CFE0006903, ucf:51706
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006903