Current Search: simulation
- Title
- ANALYSIS AND SIMULATION TOOLS FOR SOLAR ARRAY POWER SYSTEMS.
- Creator
- Pongratananukul, Nattorn, Kasparis, Takis, University of Central Florida
- Abstract / Description
- This dissertation presents simulation tools developed specifically for the design of solar array power systems. Contributions are made in several aspects of the system design phases, including solar source modeling, system simulation, and controller verification. A tool to automate the study of solar array configurations using general-purpose circuit simulators has been developed based on the modeling of individual solar cells. The hierarchical structure of solar cell elements, including semiconductor properties, allows simulation of electrical properties as well as evaluation of the impact of environmental conditions. A second tool provides a co-simulation platform capable of verifying the performance of an actual digital controller implemented in programmable hardware, such as a DSP processor, while the entire solar array, including the DC-DC power converter, is modeled in software algorithms running on a computer. This "virtual plant" allows developing and debugging code for the digital controller and improving the control algorithm. One important task in solar arrays is to track the maximum power point on the array in order to maximize the power that can be delivered. Digital controllers implemented with programmable processors are particularly attractive for this task because sophisticated tracking algorithms can be implemented and revised when needed to optimize their performance. The proposed co-simulation tools are thus very valuable in developing and optimizing the control algorithm before the system is built. Examples that demonstrate the effectiveness of the proposed methodologies are presented. The proposed simulation tools are also valuable in the design of multi-channel arrays. In the specific system that we designed and tested, the control algorithm is implemented on a single digital signal processor, and the maximum power point is tracked individually in each channel. In the prototype we built, off-the-shelf commercial DC-DC converters were utilized. Finally, the overall performance of the entire system was evaluated using solar array simulators capable of simulating various I-V characteristics, and also by using an electronic load. Experimental results are presented.
- Date Issued
- 2005
- Identifier
- CFE0000331, ucf:46290
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000331
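The record above builds array simulation on models of individual solar cells. As a hedged illustration of the kind of cell model such a tool typically rests on, the Python sketch below evaluates the standard single-diode I-V relation and locates the maximum power point; all parameter values (photocurrent, saturation current, ideality factor, resistances) are invented placeholders, not values from the dissertation.

```python
import numpy as np
from scipy.optimize import brentq

# Standard single-diode solar cell model, a common basis for array
# simulation tools; every constant below is illustrative only.
T = 298.15                   # cell temperature [K]
VT = 8.617e-5 * T            # thermal voltage kT/q [V]

def cell_current(v, i_ph=5.0, i_0=1e-9, n=1.3, r_s=0.02, r_sh=50.0):
    """Solve I = Iph - I0*(exp((V+I*Rs)/(n*Vt)) - 1) - (V+I*Rs)/Rsh for I."""
    f = lambda i: (i_ph - i_0 * (np.exp((v + i * r_s) / (n * VT)) - 1.0)
                   - (v + i * r_s) / r_sh - i)
    return brentq(f, -i_ph, 2.0 * i_ph)   # bracketed root search

# Sweep the I-V curve and locate the maximum power point (MPP),
# the operating point a tracking controller tries to hold.
volts = np.linspace(0.0, 0.7, 200)
amps = np.array([cell_current(v) for v in volts])
power = volts * amps
k = power.argmax()
print(f"MPP: {volts[k]:.3f} V, {amps[k]:.3f} A, {power[k]:.3f} W")
```

A hierarchical array study of the kind described would compose many such cell models in series and parallel; the single-cell solve is the building block.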
- Title
- A FRAMEWORK TO MODEL COMPLEX SYSTEMS VIA DISTRIBUTED SIMULATION: A CASE STUDY OF THE VIRTUAL TEST BED SIMULATION SYSTEM USING THE HIGH LEVEL ARCHITECTURE.
- Creator
- Park, Jaebok, Sepulveda, Jose, University of Central Florida
- Abstract / Description
- As the size, complexity, and functionality of the systems we need to model and simulate continue to increase, benefits such as interoperability and reusability enabled by distributed discrete-event simulation are becoming extremely important in many disciplines, not only military but also engineering disciplines such as distributed manufacturing, supply chain management, and enterprise engineering. In this dissertation we propose a distributed simulation framework for the modeling and simulation of complex systems. The framework is based on the interoperability of a simulation system enabled by distributed simulation and on gateways that enable Commercial Off-the-Shelf (COTS) simulation packages to interconnect to the distributed simulation engine. In the case study of modeling the Virtual Test Bed (VTB), the framework has been designed as a distributed simulation to facilitate the integrated execution of different simulations (shuttle process model, Monte Carlo model, Delay and Scrub Model), each of which addresses different mission components, as well as other non-simulation applications (Weather Expert System and Virtual Range). Although these models were developed independently and at various times, their original purposes have been seamlessly integrated, and they interact with each other through the Run-time Infrastructure (RTI) to simulate shuttle-launch-related processes. This study found that with the framework the defining properties of complex systems, interaction and emergence, are realized, and that software life cycle models (including the spiral model and prototyping) can be used as metaphors to manage the complexity of modeling and simulation of the system. The system of systems (a complex system is intrinsically a "system of systems") continuously evolves to accomplish its goals; during the evolution, subsystems coordinate with one another and adapt to environmental factors such as policies, requirements, and objectives. In the case study we first demonstrate how legacy models developed in COTS simulation languages/packages and non-simulation tools can be integrated to address a complicated system of systems. We then describe the techniques that can be used to display the state of remote federates in a local federate in a High Level Architecture (HLA) based distributed simulation using COTS simulation packages.
- Date Issued
- 2005
- Identifier
- CFE0000534, ucf:46416
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000534
- Title
- DATA TRANSMISSION SCHEDULING FOR DISTRIBUTED SIMULATION USING PACKET ALLOYING.
- Creator
- Vargas-Morales, Juan, DeMara, Ronald, University of Central Florida
- Abstract / Description
- Communication bandwidth and latency reduction techniques are developed for Distributed Interactive Simulation (DIS) protocols. Using logs from vignettes simulated by the OneSAF Testbed Baseline (OTB), a discrete event simulator based on the OMNeT++ modeling environment is developed to analyze the Protocol Data Unit (PDU) traffic over a wireless flying Local Area Network (LAN). Alternative PDU bundling and compression techniques are studied under various metrics, including slack time, travel time, queue length, and collision rate. Based on these results, Packet Alloying, a technique for the optimized bundling of packets, is proposed and evaluated. Packet Alloying becomes more active when it is needed most: during negative spikes of transmission slack time. It produces aggregations that preserve the internal PDU format, allowing the resulting packets to be subjected to further bundling and/or compression by conventional techniques. To optimize the selection of bundle delimitation, three online predictive strategies were developed: Neural-Network based, Always-Wait, and Always-Send. These were compared with three offline strategies defined as Type, Type-Length, and Type-Length-Size. Applying Always-Wait to the studied vignette with the wireless links set to 64 Kbps, a reduction in the magnitude of negative slack time from -75 to -9 seconds for the worst spike was achieved, which represents a reduction of 88%. Similarly, at 64 Kbps, Always-Wait reduced the average satellite queue length from 2,963 to 327 messages, an 89% reduction. From the analysis of negative slack-time spikes it was determined which PDU types are of highest priority. The router and satellite queues in the case study were modified accordingly using a priority-based transmission scheduler. The analysis of total travel times based on PDU types numerically shows the benefit obtained. The contributions of this dissertation include the formalization of a selective PDU bundling scheme, the proposal and study of different predictive algorithms for the next PDU, and priority-based optimization using Head-of-Line (HoL) service. These results demonstrate the validity of packet optimizations for distributed simulation environments and other possible applications such as TCP/IP transmissions.
- Date Issued
- 2004
- Identifier
- CFE0000302, ucf:46312
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000302
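The record above bundles PDUs most aggressively during negative spikes of transmission slack time. The sketch below is a simplified stand-in for that control flow, not the dissertation's Packet Alloying algorithm: it drains a send queue and, while the next PDU's slack is negative, folds it into the current bundle (an Always-Wait-like policy) up to an invented size cap.

```python
from collections import deque
from dataclasses import dataclass

# Illustrative stand-in for a slack-driven bundling policy; PDU sizes,
# deadlines, and the bundle cap are all invented.
@dataclass
class Pdu:
    size: int        # bytes
    deadline: float  # latest acceptable delivery time [s]

LINK_BPS = 64_000 / 8   # 64 Kbps link in bytes per second
MAX_BUNDLE = 1500       # hypothetical bundle size cap [bytes]

def slack(pdu, now, backlog_bytes):
    """Time margin left after transmitting the backlog plus this PDU."""
    return pdu.deadline - (now + (backlog_bytes + pdu.size) / LINK_BPS)

def drain(queue, now):
    """Group queued PDUs into bundles, alloying while slack is negative."""
    bundles = []
    backlog = sum(p.size for p in queue)
    while queue:
        first = queue.popleft()
        bundle, size = [first], first.size
        # Always-Wait-like behavior: while the next PDU is already late,
        # fold it into the current bundle instead of sending it separately.
        while queue and slack(queue[0], now, backlog) < 0 \
                and size + queue[0].size <= MAX_BUNDLE:
            nxt = queue.popleft()
            bundle.append(nxt)
            size += nxt.size
        backlog -= size
        bundles.append(bundle)
    return bundles

q = deque([Pdu(144, 0.05), Pdu(240, 0.06), Pdu(96, 0.30), Pdu(400, 0.50)])
print([len(b) for b in drain(q, now=0.0)])   # e.g. [2, 1, 1]
```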
- Title
- ON THE INCORPORATION OF THE PERSONALITY FACTORS INTO CROWD SIMULATION.
- Creator
- Jaganathan, Sivakumar, Kincaid, J. Peter, University of Central Florida
- Abstract / Description
- Recently, a considerable amount of research has been performed on simulating the collective behavior of pedestrians in the street or people finding their way inside a building or a room. Comprehensive reviews of the state of the art can be found in Schreckenberg and Deo (2002) and Batty, M., DeSyllas, J. and Duxbury, E. (2003). In all these simulation studies, one area that is lacking is accounting for the effects of human personalities on the outcome. As a result, there is a growing emphasis on researching the effects of human personalities and adding the results to the simulations to make them more realistic. This research investigated the possibility of incorporating personality factors into the crowd simulation model. The first part of this study explored the extraction of quantitative crowd motion from videos and developed a method to compare real video with the simulation output video. Several open source programs were examined and modified to obtain optical flow measurements from real videos captured at sporting events. Optical flow measurements provide information such as crowd density and the average velocity with which individuals move in the crowd, as well as other parameters. These quantifiable optical flow calculations provided a strong method for comparing simulation results with those obtained from video footage captured in real-life situations. The second part of the research focused on the incorporation of the personality factors into the crowd simulation. Existing crowd models such as Helbing-Molnár-Farkas-Vicsek (HMFV) do not take individual personality factors into account. The most common approach employed by psychologists for studying personality traits is the Big Five factors or dimensions of personality (NEO: Neuroticism, Extroversion, Openness, Agreeableness and Conscientiousness). In this research, forces related to the personality factors were incorporated into the crowd simulation models. The NEO-based forces were incorporated into an existing HMFV simulation implemented in the MASON simulation framework. The simulation results were validated using the quantification procedures developed in the first phase. This research reports on a major expansion of a simulation of pedestrian motion based on the model (HMFV) by Helbing, D., I. J. Farkas, P. Molnár, and T. Vicsek (2002). Examples of actual behavior, such as a crowd exiting a church after a service, were simulated using NEO-based forces and showed a striking resemblance to actual behavior as rated by behavioral scientists.
- Date Issued
- 2007
- Identifier
- CFE0001771, ucf:47276
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001771
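The HMFV model named in the record above drives each pedestrian toward a desired velocity while neighbors exert exponential repulsion. The sketch below shows that core update with a hypothetical personality-dependent gain slipped into the repulsion term; the mapping from a Big Five trait to a force gain is an invented placeholder, not the dissertation's formulation.

```python
import numpy as np

# Core social-force update (Helbing-style), with a hypothetical
# personality gain modulating the repulsion strength.
TAU = 0.5          # relaxation time [s]
A, B = 2.0, 0.3    # repulsion amplitude [m/s^2] and range [m]
RADIUS = 0.3       # body radius [m]

def social_force(i, pos, vel, goal_vel, extroversion):
    """Acceleration on pedestrian i; extroversion in [0, 1] is assumed,
    purely for illustration, to tolerate closer neighbors."""
    drive = (goal_vel[i] - vel[i]) / TAU          # steer toward desired velocity
    rep = np.zeros(2)
    gain = 1.5 - extroversion[i]                  # placeholder NEO-based scaling
    for j in range(len(pos)):
        if j == i:
            continue
        d = pos[i] - pos[j]
        dist = np.linalg.norm(d)
        rep += gain * A * np.exp((2 * RADIUS - dist) / B) * d / dist
    return drive + rep

pos = np.array([[0.0, 0.0], [0.6, 0.0]])
vel = np.zeros((2, 2))
goal = np.array([[1.2, 0.0], [-1.2, 0.0]])        # head-on desired velocities
extro = np.array([0.8, 0.2])
dt = 0.05
vel[0] += dt * social_force(0, pos, vel, goal, extro)
print(vel[0])   # the introverted neighbor pushes pedestrian 0 back harder
```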
- Title
- ASSESSMENT OF THE SAFETY BENEFITS OF VMS AND VSL USING THE UCF DRIVING SIMULATOR.
- Creator
- Dos Santos, Cristina, Abdel-Aty, Mohamed, University of Central Florida
- Abstract / Description
- Researchers at the University of Central Florida (UCF) have been working during the past few years on different strategies to improve freeway safety in real time. Ongoing research at UCF has investigated crash patterns that occurred on a stretch of Interstate-4 located in Orlando, FL, and created statistical models to predict in real time the likelihood of a crash in terms of time and space. The models were then tested using PARAMICS micro-simulation, and different strategies that would reduce the risk of crashes were suggested. One of the main recommended strategies was the use of Variable Speed Limits (VSL), which intervenes by reducing the speed upstream of a high-risk segment and increasing the speed downstream. The purpose of this study is to examine the recommendations reached by the micro-simulation using the UCF driving simulator. Drivers' speed behavior in response to changes in speed limits and different information messages is observed. Scenarios were created that represent the recommendations from the earlier micro-simulation study, along with three different messages displayed using Variable Message Signs (VMS) as an added measure to advise drivers about changes in the speed limit. In addition, abrupt and gradual changes in speed were tested against scenarios that maintained a constant speed limit or did not include a VSL or VMS in their design (base case). Dynamic congestion was also added to the scenarios' design to observe drivers' reactions and speed reductions once drivers approached congestion. A total of 85 subjects were recruited. Gender and age were the controlling variables for the subjects' recruitment. Each of the subjects drove 3 out of a total of 24 scenarios. In addition, a survey was conducted that involved hypothetical questions, including knowledge about VMS and VSL, and questions about their driving behavior. The survey data were useful in identifying the subjects' compliance with the speed limit and VSL/VMS acceptance. Two statistical analytical techniques were performed on the data collected from the simulator: ANOVA and PROC MIXED. The ANOVA test was used to investigate whether the differences in speed and reaction distances between subjects were statistically significant for each sign compared to the base case. The PROC MIXED analysis was used to investigate the differences among all scenarios (24x24) based on the spot speed data collected for each driver. It was found from the analyses that drivers follow a VMS message better when it informs them that the speed limit change is strictly enforced, whether or not it actually is, as opposed to providing the reason for the change or no information. Moreover, an abrupt change in speed produced immediate results; however, both abrupt and gradual changes in speed produced the same reduction in speed at the target zone. It was also noticed that most drivers usually drive 5 mph above the speed limit, even though in the survey analysis the majority of them stated that they drive in compliance with the speed limit or with the flow of traffic. This means that if a modest speed reduction of 5 mph is requested they will ignore it, but if a 10 mph reduction is recommended they will reduce their speed by at least 5 mph. Consequently, it was noticed that drivers arrived at the congestion zone with a slower speed than the base speed limit due to the combination of VMS and VSL signage. By having drivers approach congestion at a slower speed, potential rear-end crashes could be avoided.
Comparing the two genders indicated that females are more likely to follow the VMS's recommendations to reduce speed. Females in general drive 2 to 3 mph above the speed limit, while males drive 5 to 8 mph above it. From the analysis of the age factor, it was concluded that drivers in the 16-19 age group drive faster, and drivers aged 45 and above drive slower, than drivers in the other groups. In general, all drivers reduced and/or increased their speed accordingly when a VMS and/or VSL was present in the scenario advising of the change in the speed limit. The investigations conducted for this thesis proved that the recommendations suggested previously, based on the crash risk model and micro-simulation (Abdel-Aty et al., 2006), aid drivers in reducing their speed before they approach a high-risk segment and by doing so reduce the likelihood of a crash. Finally, the real-time safety benefits of VMS and VSL should be continuously evaluated in future studies.
- Date Issued
- 2007
- Identifier
- CFE0001628, ucf:47167
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001628
- Title
- PHYSICALLY-BASED VISUALIZATION OF RESIDENTIAL BUILDING DAMAGE PROCESS IN HURRICANE.
- Creator
- Liao, Dezhi, Kincaid, J. Peter, University of Central Florida
- Abstract / Description
- This research provides realistic techniques to visualize the process of damage to residential buildings caused by hurricane-force winds. Three methods are implemented to make the visualization useful for educating the public about mitigation measures for their homes. First, the underlying physics uses Quick Collision Response Calculation, an iterative method which can tune the accuracy and the performance of collision-response calculations between building components. Secondly, the damage process is designed as a Time-scalable Process. By attaching a damage time tag to each building component, the visualization process is treated as a geometry animation, allowing users to navigate within the visualization. The detached building components move in response to the wind force, which is calculated using qualitative rather than quantitative techniques. The results are acceptable for instructional systems but not for engineering analysis. Quick Damage Prediction is achieved by using a database query instead of a Monte Carlo simulation. The database is based on HAZUS® engineering analysis data, which gives it validity. A reasoning mechanism based on the definition of overall building damage in HAZUS® is used to determine the damage state of selected building components, including roof cover, roof sheathing, walls, openings, and roof-wall connections. Exposure settings of environmental aspects of the simulated environment, such as ocean, trees, cloud, and rain, are integrated into a scene-graph-based graphics engine. Based on the graphics engine and the physics engine, a procedural modeling method is used to efficiently render residential buildings. The resulting program, Hurricane!, is an instructional program for public education useful in schools and museum exhibits.
- Date Issued
- 2007
- Identifier
- CFE0001609, ucf:47190
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001609
- Title
- NETWORK PERFORMANCE MANAGEMENT USING APPLICATION-CENTRIC KEY PERFORMANCE INDICATORS.
- Creator
- McGill, Susan, Shumaker, Randall, University of Central Florida
- Abstract / Description
- The Internet and intranets are viewed as capable of supplying "Anything, Anywhere, Anytime," and e-commerce, e-government, e-community, and military C4I are now deploying many and varied applications to serve their needs. Network management is currently centralized in operations centers. To assure customer satisfaction with network performance, operators typically plan, configure, and monitor the network devices to ensure an excess of bandwidth, that is, they overprovision. If this proves uneconomical, or if complex and poorly understood interactions of equipment, protocols, and application traffic degrade performance, creating customer dissatisfaction, another, more application-centric way of managing the network will be needed. This research investigates a new qualitative class of network performance measures derived from the current quantitative metrics known as quality of service (QOS) parameters. The proposed class of qualitative indicators focuses on utilizing current network performance measures (QOS values) to derive abstract quality of experience (QOE) indicators by application class. These measures may provide a more user- or application-centric means of assessing network performance even when some individual QOS parameters approach or exceed specified levels. The mathematics of functional analysis suggests treating QOS performance values as a vector and, by mapping the degradation of the application performance to a characteristic l_p-norm curve, calculating a qualitative QOE value (good/poor) for each application class. A similar procedure could calculate a QOE node value (satisfactory/unsatisfactory) to represent the service level of a switch or router for the current mix of application traffic. To demonstrate the utility of this approach, a discrete event simulation (DES) test-bed modeling the topology and traffic of three semi-autonomous networks connected by a backbone was created in the OPNET telecommunications simulation environment. Scenarios designed to degrade performance by under-provisioning links or nodes are run to evaluate QOE for an access network. The application classes and traffic load are held constant. Future research would include refinement of the mathematics and many additional simulations and scenarios varying other independent variables. Finally, collaboration with researchers in areas as diverse as human-computer interaction (HCI), software engineering, teletraffic engineering, and network management will enhance the concepts modeled.
- Date Issued
- 2007
- Identifier
- CFE0001818, ucf:47371
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001818
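The record above collapses a QOS vector into a good/poor QOE verdict through an l_p norm. A minimal sketch of that mapping, with invented per-class targets, weights, and threshold (the abstract gives no numbers):

```python
import numpy as np

# Hypothetical per-class QOS targets: delay [ms], jitter [ms], loss [%].
TARGETS = {"voip": np.array([150.0, 30.0, 1.0])}
WEIGHTS = {"voip": np.array([1.0, 1.0, 2.0])}

def qoe(app, qos, p=4.0, thresh=1.0):
    """Map a QOS vector to good/poor via a weighted l_p norm.

    Each metric is normalized by its target, so 1.0 means "at the limit";
    a large p makes the worst metric dominate, mimicking an application
    that degrades once any one parameter nears its bound.
    """
    x = WEIGHTS[app] * (qos / TARGETS[app])
    score = np.linalg.norm(x, ord=p) / np.linalg.norm(WEIGHTS[app], ord=p)
    return "good" if score <= thresh else "poor"

print(qoe("voip", np.array([90.0, 10.0, 0.2])))   # within targets -> good
print(qoe("voip", np.array([90.0, 10.0, 1.5])))   # loss exceeds   -> poor
```

A QOE node value could be computed the same way over the mix of application classes traversing a switch or router.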
- Title
- REAL-TIME TREE SIMULATION USING VERLET INTEGRATION.
- Creator
- Manavi, Bobak, Kincaid, J. Peter, University of Central Florida
- Abstract / Description
- One of the most important challenges in real-time simulation of large trees and vegetation is the vast number of calculations required to simulate the interactions between all the branches in a tree when external forces are applied to it. This paper proposes the use of algorithms employed by applications such as cloth and soft-body simulation, where objects are represented by a finite system of particles connected via spring-like constraints, for the structural representation and manipulation of trees in real time. We then derive and demonstrate the use of Verlet integration and the constraint configuration used for simulating trees, while constructing the necessary data structures that encapsulate the procedural creation of these objects. Furthermore, we utilize this system to simulate branch breakage due to accumulated external and internal pressure.
- Date Issued
- 2007
- Identifier
- CFE0001802, ucf:47381
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001802
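The record above simulates trees with the particle-and-constraint machinery familiar from cloth simulation. Below is a minimal Verlet sketch of one branch treated as a pinned particle chain with distance constraints; the constants, the wind force, and the chain topology are invented for illustration.

```python
import numpy as np

# Verlet integration of a particle chain with distance constraints,
# the cloth-style scheme the abstract builds on. Constants are invented.
N, REST, DT = 8, 0.25, 1.0 / 60.0
GRAVITY = np.array([0.0, -9.81])
WIND = np.array([3.0, 0.0])

pos = np.array([[i * REST, 0.0] for i in range(N)])   # horizontal branch
prev = pos.copy()                                     # implicit zero velocity

def step(pos, prev, iters=10):
    # Verlet: x' = 2x - x_prev + a*dt^2 (velocity is implicit).
    acc = GRAVITY + WIND
    nxt = 2.0 * pos - prev + acc * DT * DT
    nxt[0] = pos[0]                       # root particle pinned to the trunk
    for _ in range(iters):                # relax the distance constraints
        for i in range(N - 1):
            d = nxt[i + 1] - nxt[i]
            dist = np.linalg.norm(d)
            corr = 0.5 * (dist - REST) / dist * d
            if i == 0:
                nxt[i + 1] -= 2.0 * corr  # pinned end takes no correction
            else:
                nxt[i] += corr
                nxt[i + 1] -= corr
        nxt[0] = pos[0]
    return nxt, pos

for _ in range(120):                      # two simulated seconds
    pos, prev = step(pos, prev)
print(pos[-1])                            # tip position after wind and gravity
```

A full tree would be a hierarchy of such chains, and branch breakage could be triggered when a constraint's accumulated stretch exceeds a threshold.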
- Title
- AN IMPROVED THERMOREGULATORY MODEL FOR COOLING GARMENT APPLICATIONS WITH TRANSIENT METABOLIC RATES.
- Creator
- Westin, Johan, Kapat, Jayanta, University of Central Florida
- Abstract / Description
- Current state-of-the-art thermoregulatory models do not predict body temperatures with the accuracy required for the development of automatic cooling control in liquid cooling garment (LCG) systems. Automatic cooling control would be beneficial in a variety of space, aviation, military, and industrial environments for optimizing cooling efficiency, making LCGs as portable and practical as possible, relieving the individual of manual cooling control, and improving thermal comfort and cognitive performance. In this study, we adopt the Fiala thermoregulatory model, which has previously demonstrated state-of-the-art predictive abilities in air environments, for use in LCG environments. We validate the numerical formulation against analytical solutions to the bioheat equation and find our model to be accurate and stable with a variety of different grid configurations. We then compare the thermoregulatory model's tissue temperature predictions with experimental data in which individuals, equipped with an LCG, exercise according to a 700 W rectangular activity schedule. The root mean square (RMS) deviation between the model response and the mean experimental group response is 0.16°C for the rectal temperature and 0.70°C for the mean skin temperature, which is within state-of-the-art variations. However, with a mean absolute body heat storage error (e_BHS_mean) of 9.7 W·h, the model fails to satisfy the ±6.5 W·h accuracy required for automatic LCG cooling control development. In order to improve model predictions, we modify the blood flow dynamics of the thermoregulatory model. Instead of using step responses to changing requirements, we introduce exponential responses in the muscle blood flow and the vasoconstriction command. We find that such modifications have an insignificant effect on temperature predictions. However, a new vasoconstriction dependency, i.e., the rate of change of the hypothalamus temperature weighted by the hypothalamus error signal (ΔThy·dThy/dt), proves to be an important signal governing the thermoregulatory response during conditions of simultaneously increasing core and decreasing skin temperatures, a common scenario in LCG environments. With the new ΔThy·dThy/dt dependency in the vasoconstriction command, e_BHS_mean for the exercise period is reduced by 59% (from 12.9 W·h to 5.2 W·h). Even though the new e_BHS_mean of 5.8 W·h for the total activity schedule is within the target accuracy of ±6.5 W·h, e_BHS fails to stay within the target accuracy during the entire activity schedule. With additional improvements to the central blood pool formulation, the LCG boundary condition, and the agreement between model set-points and actual experimental initial conditions, it seems possible to achieve the strict accuracy needed for automatic cooling control development.
- Date Issued
- 2008
- Identifier
- CFE0002460, ucf:47707
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002460
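A key change in the record above replaces step responses in the blood flow dynamics with exponential responses, i.e., a first-order lag dq/dt = (q_target - q)/tau. The sketch below integrates that lag against a rectangular activity schedule; the time constant and flow numbers are invented.

```python
import numpy as np

# First-order (exponential) response replacing a step response:
# dq/dt = (q_target - q) / tau. Numbers are illustrative only.
TAU = 60.0                      # response time constant [s]
DT = 1.0                        # integration step [s]

def exercise_target(t):
    """Rectangular activity schedule: muscle blood flow demand."""
    return 40.0 if 120.0 <= t < 480.0 else 5.0   # [ml/(100 ml*min)]

t = np.arange(0.0, 600.0, DT)
q = np.empty_like(t)
q[0] = exercise_target(0.0)
for k in range(1, len(t)):
    target = exercise_target(t[k])
    q[k] = q[k - 1] + DT * (target - q[k - 1]) / TAU   # explicit Euler

# At exercise onset the flow now ramps with time constant TAU instead
# of jumping; one TAU after onset it covers ~63% of the step.
onset = int(120 + TAU)
print(f"flow one tau after onset: {q[onset]:.1f} (target 40.0)")
```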
- Title
- EVALUATING TACTICAL COMBAT CASUALTY CARE TRAINING TREATMENTS EFFECTS ON COMBAT MEDIC TRAINEES IN LIGHT OF SELECT HUMAN DESCRIPTIVE CHARACTERISTICS.
- Creator
- Sotomayor, Teresita, Proctor, Michael, University of Central Florida
- Abstract / Description
- The use of military forces in urban operations has increased considerably in recent years. As illustrated by the current conflict in Iraq, the Army finds itself fighting its toughest battles in urban areas against unconventional forces. Soldiers face many threats in hostile-fire environments, whether conducting large-scale mechanized warfare, low-intensity conflicts, or operations other than war. Through 1970, there had been no demonstrable reduction in the battlefield mortality rate as a percentage of all casualties since records began before the Civil War. For that period, nearly all the reduction in overall mortality rate occurred through reduced mortality in the hospital chain. As of 1970, about 90 percent of all combat deaths occurred before a casualty reached a definitive care facility. Tactical Combat Casualty Care (TCCC), also known as TC3, is the pre-hospital care rendered to a casualty in a combat environment. The application of TCCC principles in a tactical combat environment has proven highly effective and is a major reason why combat deaths in the latest conflicts (Operation Iraqi Freedom and Operation Enduring Freedom) are lower than in any other conflict in the history of the United States. The Army continues to emphasize reducing the battlefield mortality rate. Current tools and methods used for initial skills and sustainment training of combat medics throughout the Army are insufficient. New technologies are needed to provide medics with greater opportunities to develop and test their decision-making and technical medical skills in multiple, COE-relevant training scenarios. In order to address some of these requirements, the U.S. Army Research, Development and Engineering Command, Simulation and Training Technology Center (RDECOM-STTC) is developing the 68W Tactical Combat Casualty Care Simulation (TC3 Sim) for the US Army Medical Department (AMEDD) Center & School at Fort Sam Houston. The Army is considering the use of the TC3 Sim game as a tool to improve the training of individual Soldiers as well as the readiness of combat medics. It is the intent of this research to evaluate the effectiveness of instructional games in general, and the use of the TC3 game in particular, for teaching the concepts of tactical combat casualty care. Experiments were conducted to evaluate the training effectiveness of this tool in supporting the 68W10 Healthcare Specialist Course program of instruction (POI). The goal of this research is to address important questions such as: Is this game an effective tool to train Soldiers in the aspects of TC3? Can knowledge gained through the use of the simulation be transferred to task-related situations? How can this tool be incorporated into the current POI in order to increase training effectiveness?
- Date Issued
- 2008
- Identifier
- CFE0002396, ucf:47755
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002396
- Title
- NETWORK INTRUSION DETECTION: MONITORING, SIMULATION AND VISUALIZATION.
- Creator
- Zhou, Mian, Lang, Sheau-Dong, University of Central Florida
- Abstract / Description
- This dissertation presents our work on network intrusion detection and intrusion simulation. The work in intrusion detection consists of two different network anomaly-based approaches. The work in intrusion simulation introduces a model using explicit traffic generation for packet-level traffic simulation. The process of anomaly detection is to first build profiles for normal network activity and then mark any events or activities that deviate from the normal profiles as suspicious. Based on the different schemes of creating the normal activity profiles, we introduce two approaches for intrusion detection. The first is a frequency-based approach which creates a normal frequency profile based on the periodic patterns present in the time series formed by the traffic. It aims at those attacks that are conducted by running pre-written scripts, which automate the process of attempting connections to various ports or sending packets with fabricated payloads. The second approach builds the normal profile based on variations in the connection-based behavior of each single computer. The deviations from each individual computer are quantified by a weight assignment scheme and further used to build a weighted link graph representing the overall traffic abnormalities. The system functions as a distributed personal IDS that also provides centralized traffic analysis through graphical visualization. It provides finer control over the internal network by focusing on the connection-based behavior of each single computer. For network intrusion simulation, we explore an alternative method of network traffic simulation using explicit traffic generation. In particular, we build a model to replay the standard DARPA traffic data or traffic data captured from a real environment. The replayed traffic data are mixed with attacks, such as DoS and probe attacks, which can create apparent abnormal traffic flow patterns. With explicit traffic generation, every packet ever sent by the victim and attacker is formed in the simulation model and travels around strictly following the time and path criteria extracted from the real scenario. Thus, the model provides a promising aid in the study of intrusion detection techniques.
- Date Issued
- 2005
- Identifier
- CFE0000679, ucf:46484
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000679
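The first approach in the record above builds a normal frequency profile from periodic patterns in traffic time series and flags deviations from it. A toy version of that idea using a mean FFT magnitude spectrum as the profile and a normalized distance as the score; the window length, the synthetic traffic, and the scripted-attack shape are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)
WIN = 256          # samples per analysis window (invented)

def profile(windows):
    """Normal profile = mean FFT magnitude spectrum over training windows."""
    return np.abs(np.fft.rfft(windows, axis=1)).mean(axis=0)

def anomaly_score(window, prof):
    spec = np.abs(np.fft.rfft(window))
    return np.linalg.norm(spec - prof) / np.linalg.norm(prof)

# Normal traffic: a periodic component plus noise.
t = np.arange(WIN)
normal = np.array([np.sin(2 * np.pi * t / 64) + 0.3 * rng.normal(size=WIN)
                   for _ in range(50)])
prof = profile(normal)

# A scripted scan: regular bursts of connection attempts at a fixed rate.
attack = 0.3 * rng.normal(size=WIN)
attack[::8] += 2.0           # periodic spikes from an automated script

print(f"normal score: {anomaly_score(normal[0], prof):.2f}")
print(f"attack score: {anomaly_score(attack, prof):.2f}")   # much larger
```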
- Title
- SIMULATION STUDIES OF SELF-ASSEMBLY AND PHASE DIAGRAM OF AMPHIPHILIC MOLECULES.
- Creator
- Bourov, Geuorgui, Bhattacharya, Aniket, University of Central Florida
- Abstract / Description
- The aim of this dissertation is to investigate the self-assembled structures and the phase diagram of amphiphilic molecules of diverse geometric shapes using a number of different computer simulation methods. A semi-realistic coarse-grained model, used extensively for simulation of polymers and surfactant molecules, is adopted in an off-lattice approach to study how the geometric structure of amphiphiles affects their aggregation properties. The simulation results show that the model system's behavior is consistent with theoretical predictions, experiments, and lattice simulation models. We demonstrate that by modifying the geometry of the molecules, self-assembled aggregates are altered in a way close to theoretical predictions. In several two- and three-dimensional off-lattice Brownian Dynamics simulations, the influence of the shape of the amphiphilic molecules on the size and form of the aggregates is studied systematically. Model phospholipid molecules, with two hydrophobic chains connected to one hydrophilic head group, are simulated, and the formation of stable bilayers is observed. In addition, practically important mixtures of amphiphiles with diverse structures are studied under different mixing ratios and molecular structures. We find that in several systems with Poisson-distributed chain lengths, the effect on the aggregation distribution is negligible compared to that of the pure amphiphilic system with the mean length of the Poisson distribution. The phase diagrams of different amphiphilic molecular structures are investigated in separate simulations by employing the Gibbs Ensemble Monte Carlo method with an implemented configurational-bias technique. The computer simulations of the above-mentioned amphiphilic systems lie in an area where physics, biology, and chemistry are closely connected, and advances in applications require the use of new theoretical, experimental, and simulation methods for a better understanding of their self-assembling properties. The obtained simulation results demonstrate the connection between the structure of amphiphilic molecules and the properties of their thermodynamically stable aggregates, and thus build a foundation for many applications of the remarkable phenomenon of amphiphilic self-assembly in the area of nanotechnology.
- Date Issued
- 2005
- Identifier
- CFE0000695, ucf:46491
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000695
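The off-lattice Brownian Dynamics runs in the record above evolve coarse-grained beads under deterministic forces plus thermal noise. The sketch below shows one overdamped BD update for a two-bead bonded pair; the force field is reduced to a single harmonic bond, and all constants are invented stand-ins for the dissertation's coarse-grained potentials (which would add excluded-volume and hydrophobic interactions).

```python
import numpy as np

# Overdamped Brownian Dynamics:
# dx = (F/gamma) dt + sqrt(2 kT dt / gamma) * N(0, 1)
rng = np.random.default_rng(1)
KT, GAMMA, DT = 1.0, 1.0, 1e-4     # reduced units, illustrative
K_BOND, R0 = 100.0, 1.0            # harmonic bond stiffness / rest length

def bond_forces(x):
    """Harmonic head-tail bond; real amphiphile models add more terms."""
    d = x[1] - x[0]
    r = np.linalg.norm(d)
    f = K_BOND * (r - R0) * d / r   # pulls beads toward the rest length
    return np.array([f, -f])

x = np.array([[0.0, 0.0], [1.4, 0.0]])   # start with a stretched bond
noise_amp = np.sqrt(2.0 * KT * DT / GAMMA)
for _ in range(2000):
    x += bond_forces(x) / GAMMA * DT + noise_amp * rng.normal(size=x.shape)

print(np.linalg.norm(x[1] - x[0]))   # fluctuates near the rest length 1.0
```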
- Title
- A THEORETICAL APPROACH TO ASSESSING ANNUAL ENERGY BALANCE IN GRAY WHALES (ESCHRICHTIUS ROBUSTUS).
- Creator
- Greenwald, Nathalie, Worthy, Graham, University of Central Florida
- Abstract / Description
- While direct measurements of energetic demands are nearly impossible to collect for large cetaceans, comprehensive bioenergetic models can give insight into such parameters by combining physiological and ecological knowledge. This model was developed to estimate the necessary food intake of gray whales, Eschrichtius robustus, of the Eastern North Pacific stock. Field Metabolic Rates (FMR) for gray whales were first estimated, based on various assumptions (e.g., volumetric representation of gray whales, extent of their feeding season, and blubber depth distribution), using morphometric data, energetic costs, and food assimilation according to age- and gender-specific requirements. Food intake rates for gray whales of varying maturity and gender were then estimated based on FMR and the caloric value of prey, and compared to food intake rates from previous studies. Monte Carlo simulations and sensitivity analysis were performed to assess the model's predictions against observed field data from previous studies. Predicted average food intakes for adult male, pregnant/lactating female, and immature whales were 475 ± 300, 525 ± 300, and 600 ± 300 kg per day, respectively. Estimated blubber depths resulting from these food intakes were comparable to field data obtained from whaling records. Sensitivity analysis indicated that, of all parameters, food intake has the highest impact on the percent change in ending mass in a simulation. These food intake estimates are similar to those found in a previous study and fall within the range of food intake per body mass observed in other species of cetaceans. Though thermoregulation can be a factor in some cetaceans, it appears not to be an additional cost for gray whales, as the model's predicted lower critical temperatures (T_LC) for the whales were below ambient temperatures. With temperatures increasing in the Bering Sea, the main prey of gray whales, ampeliscid amphipods, could be adversely affected, possibly resulting in increased food shortages leading to a surge in gray whale strandings.
- Date Issued
- 2005
- Identifier
- CFE0000560, ucf:46442
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000560
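The record above checks its energy-balance model with Monte Carlo simulation over uncertain inputs. A toy version of that procedure: sample daily intake and metabolic cost, integrate a feeding season's surplus into ending mass, and report the spread; every distribution and constant here is an invented placeholder, not the thesis's parameter set.

```python
import numpy as np

rng = np.random.default_rng(42)
N_RUNS, SEASON_DAYS = 10_000, 150      # feeding-season length: invented
E_PREY = 4.5e3                         # prey energy density [kJ/kg], invented
ASSIM = 0.8                            # assimilation efficiency, invented
E_TISSUE = 2.5e4                       # energy per kg of body mass [kJ/kg]

start_mass = 16_000.0                                      # adult whale [kg]
intake = rng.normal(475.0, 150.0, (N_RUNS, SEASON_DAYS))   # [kg/day]
fmr = rng.normal(5.0e5, 5.0e4, (N_RUNS, SEASON_DAYS))      # [kJ/day]

# Daily energy surplus converts to body mass; integrate over the season.
daily_gain = (ASSIM * intake * E_PREY - fmr) / E_TISSUE    # [kg/day]
end_mass = start_mass + daily_gain.sum(axis=1)

print(f"ending mass: {end_mass.mean():.0f} +/- {end_mass.std():.0f} kg")
```

A sensitivity analysis like the thesis's would then perturb each input in turn and rank them by their effect on the ending-mass distribution.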
- Title
- EVALUATION OF THE IMPACTS OF ITS INFORMATION STRATEGIES ON I-4 CORRIDOR.
- Creator
- Zuo, Yueliang, Al-Deek, Haitham M., University of Central Florida
- Abstract / Description
- This study evaluated the impacts of ITS information strategies under incident conditions on the Interstate 4 (I-4) corridor in Orlando. The analysis was performed using the DYNASMART-P software package. The ITS information strategies include pre-trip information, en-route information, and variable message signs. The simulation covered one hour during the morning peak period. The impacts of ITS information strategies on mobility were evaluated by simulating the performance of the various ITS information components (pre-trip information, en-route information, and variable message signs) under incident conditions for the I-4 corridor and comparing the results with the corresponding scenarios in the absence of these components. The traffic flow relations were calibrated against flow measurements along the freeway to determine model parameters. An effort was made to validate estimated traffic volumes against measured link counts. The archived I-4 data at the Center for Advanced Transportation Systems Simulation (CATSS) at the University of Central Florida was used for both calibration and validation. The analysis indicated that DYNASMART-P was able to adequately reproduce the observed morning peak hourly flows over suitably selected locations. Ten scenarios were designed to evaluate the benefits of ITS information strategies under incident conditions. The results indicated that these ITS traveler information technologies can result in great travel time savings. It was found that commuters who use traveler information, via either pre-trip or en-route information, to switch their routes benefit significantly in terms of delay reduction when incidents occur. It was found that there exists an optimal value of the fraction of users with information at which the network performs best. This optimal fraction may differ for different sources of information, and may also vary with different incidents. This study demonstrates how one can realistically simulate the network under various scenarios without actually conducting high-cost operational tests. DYNASMART-P can produce useful variables such as speeds, travel times, queue lengths, and stop times to better assess the impacts of ITS components, and it can be applied to ITS-equipped networks.
- Date Issued
- 2004
- Identifier
- CFE0000107, ucf:46199
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000107
- Title
- ANALYSIS AND INTEGRATION OF A DEBRIS MODEL IN THE VIRTUAL RANGE PROJECT.
- Creator
- Robledo, Luis, Sepulveda, Jose, University of Central Florida
- Abstract / Description
- After the accident of the STS-107 Columbia Space Shuttle, great concern has focused on the risk to the population on the ground. Before this accident, re-entry routes as well as risk calculations were not of public concern. Two issues raised by this lamentable accident relate to spacecraft security and to public safety. The integration of a debris model has been part of the original conceptual architecture of the Virtual Range Project. Its integration has been treated as a specific research effort due to the complexity of the models and the difficulty of obtaining them, since commercial off-the-shelf software is not readily accessible. This research provides solid information concerning what debris fragmentation models are, their fundamentals, and their weaknesses and strengths. It provides information on the main debris models currently used by NASA that have a direct relationship with the space programs conducted. This study also addresses the integration of a debris model into the Virtual Range Project. We created a provisional model based on the distribution of the Columbia debris fragments over Texas and part of Louisiana in order to create an analytical methodology as well. The analysis shows a way of integrating this debris model with a Geographic Information System, as well as integrating several raster and vector data sets that provide the source data for the calculations. This research uses population data sets that allow the determination of the number of people at risk on the ground. The graphical and numerical analyses can lead to the determination of new and more secure re-entry trajectories, and address further population-related security issues concerning this type of flight.
- Date Issued
- 2004
- Identifier
- CFE0000193, ucf:46175
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000193
- Title
- CREATING MODELS OF INTERNET BACKGROUND TRAFFIC SUITABLE FOR USE IN EVALUATING NETWORK INTRUSION DETECTION SYSTEMS.
- Creator
- LUO, SONG, Marin, Gerald, University of Central Florida
- Abstract / Description
- This dissertation addresses Internet background traffic generation and network intrusion detection. It is organized in two parts. Part one introduces a method to model realistic Internet background traffic and demonstrates how the models are used both in a simulation environment and in a lab environment. Part two introduces two different NID (Network Intrusion Detection) techniques and evaluates them using the modeled background traffic. To demonstrate the approach we modeled five major application-layer protocols: HTTP, FTP, SSH, SMTP and POP3. The model of each protocol includes an empirical probability distribution plus estimates of application-specific parameters. Due to the complexity of the traffic, hybrid distributions (called mixture distributions) were sometimes required. The traffic models are demonstrated in two environments: NS-2 (a simulator) and HONEST (a lab environment). The simulation results are compared against the original captured data sets. Users of HONEST have the option of adding network attacks to the background. The dissertation also introduces two new template-based techniques for network intrusion detection. One is based on a template of autocorrelations of the investigated traffic, while the other uses a template of correlation integrals. Detection experiments have been performed on real traffic and attacks; the results show that the two techniques can achieve high detection probability and low false alarm rates in certain instances.
- Date Issued
- 2005
- Identifier
- CFE0000852, ucf:46667
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000852
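Part two of the record above matches traffic against a template of autocorrelations. A compact sketch of that matching step, building the template from synthetic "normal" traffic and scoring new traffic by its distance to the template; the lag count and both traffic models are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
LAGS = 32   # number of autocorrelation lags in the template (invented)

def autocorr(x, lags):
    """Normalized autocorrelation at lags 1..lags."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, lags + 1)])

def template(training):
    return np.mean([autocorr(x, LAGS) for x in training], axis=0)

# Normal traffic: serially correlated byte counts (AR(1)-like).
normal = [rng.normal(size=512) for _ in range(20)]
for x in normal:
    for i in range(1, len(x)):
        x[i] += 0.7 * x[i - 1]
tmpl = template(normal)

# Attack traffic: bursty scripted probes at a fixed interval.
attack = rng.normal(size=512)
attack[::16] += 5.0

print(np.linalg.norm(autocorr(normal[0], LAGS) - tmpl))  # small distance
print(np.linalg.norm(autocorr(attack, LAGS) - tmpl))     # large distance
```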
- Title
- FINITE ELEMENT SIMULATION OF REPAIR OF DELAMINATED COMPOSITE STRUCTURES USING PIEZOELECTRIC LAYERS.
- Creator
- Navale, Kunal, Wang, Quan, University of Central Florida
- Abstract / Description
- Damage in structures fabricated from composite materials in aerospace, aeronautical, mechanical, civil, and offshore applications often results from factors such as fatigue, corrosion, and accidents. Such damage, when left unattended, can grow at an alarming rate due to the singularity of the stress and strain in the vicinity of the damage. It can lead to increased vibration levels, reduction in load-carrying capacity, deterioration in the normal performance of the component, and even catastrophic failure. In most conditions, the service life of damaged components is extended through repair instead of immediate replacement. Effective repair of structural damage is therefore an important and practical topic. Repair can extend the service life and can be a cost-efficient alternative to immediate replacement of the damaged component. Most conventional repair methods involve welding, riveting, or mounting additional patches on the parent structure without removing the damaged portion. These methods tend to be passive and inflexible, limited in their ability to adjust the repair to changes in external loads. Besides, in certain cases these methods may cause additional damage to the structure. For example, the in-situ drilling required in some cases can damage items such as hidden or exposed hydraulic lines and electrical cables. Welding or bonding patches can cause significant stress alterations and serious stress corrosion problems, apart from burdening weight-sensitive structures. Above all, effective repair using conventional analytical methods hinges on calculating the singularity of stress and strain in the vicinity of the damage, which is difficult, as only approximate solutions are available. Thus, there is a need to update repair methods with advancements in the fields of materials, sensing, and actuation. This can make repair more effective and efficient than conventional repair methodology. The current research proposes the use of piezoelectric materials in the repair of delaminated composite structures. A detailed mechanics analysis of delaminated beams, subjected to concentrated static loads and axial compressive loads, is presented. The discontinuity of shear stresses induced at delamination tips due to bending of the beams, under the action of a concentrated static load and an axial compressive load, is studied. This discontinuity of the shear stresses normally leads to the sliding mode of fracture of the beam structures. In order to ensure proper functioning of these beam structures, the electromechanical characteristics of piezoelectric materials are employed for their repair. Numerical simulations are conducted to calculate the repair voltage to be applied to the piezoelectric patches to erase the discontinuity of horizontal shear stress at the delamination tips and thus render the beam repaired. The variation of repair voltage with the location and size of the delamination is considered. FE simulations are performed to validate the numerically calculated voltage values. The research presented serves to provide information on the design of piezoelectric materials for the repair of delaminated composite structures.
- Date Issued
- 2005
- Identifier
- CFE0000873, ucf:46662
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000873
- Title
- MATHEMATICAL MODELING OF SMALLPOX WITH OPTIMAL INTERVENTION POLICY.
- Creator
- LAWOT, NIWAS, ROLLINS, DAVID, University of Central Florida
- Abstract / Description
- In this work, two differential equation models for smallpox are numerically solved to find the optimal intervention policy. In each model we look for the range of parameter values that give rise to the worst-case scenarios. Since the scale of an epidemic is determined by the number of people infected, and eventually dead, as a result of infection, we attempt to quantify the scale of the epidemic and recommend the optimal intervention policy. In the first case study, we mimic a densely populated city with a comparatively large tourist population and a heavily used mass transportation system. A mathematical model for the transmission of smallpox is formulated and numerically solved. In the second case study, we incorporate five different stages of infection: (1) susceptible; (2) infected but asymptomatic, non-infectious, and vaccine-sensitive; (3) infected but asymptomatic, non-infectious, and vaccine-insensitive; (4) infected but asymptomatic, and infectious; and (5) symptomatic and isolated. An exponential probability distribution is used for modeling this case. We compare the outcomes of mass vaccination and trace vaccination on the final size of the epidemic.
- Date Issued
- 2006
- Identifier
- CFE0001193, ucf:46848
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001193
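The record above integrates staged differential equation models numerically. Below is a hedged sketch of one such system using the five stages listed for the second case study, stepped with explicit Euler; the rates, vaccination term, and initial conditions are invented placeholders rather than the thesis's calibrated values.

```python
import numpy as np

# Five stages from the abstract: S; E1 (vaccine-sensitive incubation);
# E2 (vaccine-insensitive incubation); I (asymptomatic, infectious);
# Q (symptomatic, isolated). All rates below are invented.
BETA = 0.3      # transmission rate [1/day]
K1 = 1 / 4.0    # E1 -> E2 progression [1/day]
K2 = 1 / 8.0    # E2 -> I progression  [1/day]
K3 = 1 / 3.0    # I  -> Q isolation    [1/day]
NU = 0.1        # vaccination rate applied to S and E1 [1/day]

def rhs(y):
    s, e1, e2, i, q = y
    n = y.sum()
    inf = BETA * s * i / n
    return np.array([
        -inf - NU * s,            # susceptibles infected or vaccinated
        inf - (K1 + NU) * e1,     # vaccine-sensitive incubation
        K1 * e1 - K2 * e2,        # vaccine-insensitive incubation
        K2 * e2 - K3 * i,         # infectious before isolation
        K3 * i,                   # isolated (removed)
    ])
    # Vaccinated individuals are simply dropped from the tracked
    # compartments (treated as immune), so the tracked total shrinks.

y = np.array([1e6, 0.0, 0.0, 10.0, 0.0])   # initial conditions (invented)
dt = 0.1
for _ in range(int(120 / dt)):              # 120 days, explicit Euler
    y = y + dt * rhs(y)
print(f"isolated cases after 120 days: {y[4]:.0f}")
```

Comparing mass vaccination with trace vaccination would amount to applying NU to everyone versus only to traced contacts, and comparing the final epidemic sizes.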
- Title
- DIGITAL CONTROLLER IMPLEMENTATION FOR DISTURBANCE REJECTION IN THE OPTICAL COUPLING OF A MOBILE EXPERIMENTAL LASER TRACKING SYSTEM.
- Creator
- Rhodes, Matthew, Richie, Samuel, University of Central Florida
- Abstract / Description
- Laser tracking systems are an important aspect of the NASA space program, in particular for conducting research related to satellites and spaceport launch vehicles. Often, launches are conducted at remote sites, which requires all of the test equipment, including the laser tracking systems, to be portable. Portable systems are more susceptible to environmental disturbances that affect the overall tracking resolution and, consequently, the resolution of any other experimental data being collected at a given time. This research characterizes the optical coupling between two systems in a Mobile Experimental Laser Tracking system and evaluates several control solutions to minimize disturbances within this coupling. A simulation of the optical path was developed in an extensible manner so that different control systems could be easily implemented. For an initial test, several PID controllers were utilized in parallel to control mirrors in the optical coupling. Despite many limiting factors of the hardware, a simple proportional control performed to expectations. Although a system implementation was never field-tested, the simulation results provide the necessary insight to develop the system further. Recommendations were made for future system modifications that would allow an even higher tracking resolution.
- Date Issued
- 2006
- Identifier
- CFE0001168, ucf:46873
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001168
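The record above settles on parallel PID controllers for the coupling mirrors, with simple proportional action already performing to expectations. Below is a textbook discrete PID loop closed around a toy first-order mirror model; the gains, plant, and disturbance are invented, not the thesis's hardware.

```python
# Discrete PID loop on a toy first-order mirror-angle plant.
# Gains, plant time constant, and disturbance are invented.
DT = 0.001                     # control period [s]
KP, KI, KD = 2.0, 5.0, 0.01    # illustrative, well-damped gains

def pid(setpoint, measure, state):
    err = setpoint - measure
    state["i"] += err * DT                 # integral accumulator
    d = (err - state["e"]) / DT            # backward-difference derivative
    state["e"] = err
    return KP * err + KI * state["i"] + KD * d

angle, tau = 0.0, 0.05         # mirror angle [mrad], plant lag [s]
state = {"i": 0.0, "e": 0.0}
for k in range(2000):          # 2 s of closed loop
    disturbance = 0.2 if k > 1000 else 0.0   # step disturbance [mrad]
    u = pid(0.0, angle + disturbance, state)
    angle += DT * (u - angle) / tau          # first-order plant response

print(f"residual pointing error: {angle + 0.2:.4f} mrad")
```

With KI set to zero this reduces to the proportional-only controller the thesis found adequate, at the cost of a small steady-state offset against a constant disturbance.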
- Title
- STUDY OF LOW SPEED TRANSITIONAL REGIME GAS FLOWS IN MICROCHANNELS USING INFORMATION PRESERVATION (IP) METHOD.
- Creator
- KURSUN, Umit, Kapat, Jayanta, University of Central Florida
- Abstract / Description
- Proper design of thermal management solutions for future nano-scale electronics or photonics will require knowledge of flow and transport through micron-scale ducts. As with their conventional macro-scale counterparts, such micron-scale flow systems require robust simulation tools for early-stage design iterations. It can be envisioned that an ideal nanoscale thermal management (NSTM) solution will involve two-phase flow, liquid flow, and gas flow. This study focuses on the numerical simulation of gas flow in microchannels as a fundamental thermal management technique for any future NSTM solution. A well-known particle-based method, Direct Simulation Monte Carlo (DSMC), is selected as the simulation tool. Unlike continuum-based equations, which fail at large Knudsen numbers, the DSMC method is valid in all Knudsen regimes. Due to its conceptual simplicity and flexibility, DSMC has a lot of potential and has already given satisfactory answers to a broad range of macroscopic problems. It also has great potential for handling complex MEMS flow problems with ease. However, the high-level statistical noise in DSMC must be eliminated, and pressure boundary conditions must be effectively implemented, in order to utilize DSMC under subsonic flow conditions. The statistical noise of classical DSMC can be eliminated through the use of the IP method, which saves computational time by several orders of magnitude compared to a similar DSMC simulation. As in regular DSMC procedures, the molecular velocity is used to determine the molecular positions and compute collisions. Separating the macroscopic velocity from the molecular velocity through the use of the IP method, however, eliminates the high level of statistical noise typical of DSMC calculations of low-speed flows. The conventional boundary conditions of the classical DSMC method, such as constant-velocity free-stream and vacuum conditions, are incorrect under subsonic flow conditions. There should be a substantial amount of backpressure, allowing new molecules to enter from the outlet as well as the inlet boundaries. Additionally, the application of pressure boundaries facilitates more direct comparison of numerical and experimental results. Therefore, the main aim of this study is to build the unidirectional, non-isothermal IP algorithm with periodic boundary conditions on top of the two-dimensional classical DSMC algorithm. The IP algorithm is further modified to implement pressure boundary conditions using the method of characteristics. The applicability of the final algorithm to solving real flow situations is verified on parallel-plate Poiseuille and backward-facing step flows in microchannels, which are established benchmark problems in computational fluid dynamics studies. The backward-facing step geometry is also of practical importance in a variety of engineering applications, including Integrated Circuit (IC) design. Such an investigation in microchannels with sufficient accuracy may provide insight into the more complex flow and transport processes in any future NSTM solution. The flow and heat transfer mechanisms at different Knudsen numbers are investigated.
- Date Issued
- 2006
- Identifier
- CFE0001281, ucf:46910
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001281
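The IP method in the record above has each DSMC particle carry a separate "information" velocity holding only the macroscopic part of its motion, so sampled cell averages shed the thermal noise. The toy comparison below illustrates just that sampling argument, not a DSMC implementation; the stream velocity, thermal speed, and particle count are invented.

```python
import numpy as np

# Conceptual illustration of the IP idea: sampling macroscopic velocity
# from molecular velocities is noisy; sampling it from per-particle
# "information" velocities is not. Numbers are invented.
rng = np.random.default_rng(3)
U_STREAM = 0.5        # macroscopic flow speed [m/s] (low subsonic)
C_THERMAL = 340.0     # thermal speed scale [m/s]
N_PART = 500          # particles in one sampling cell

# Molecular velocity = macroscopic part + Maxwellian thermal part.
c = U_STREAM + C_THERMAL * rng.normal(size=N_PART)
# IP velocity carries only the macroscopic part for each particle.
u_ip = np.full(N_PART, U_STREAM)

print(f"true stream velocity:     {U_STREAM:.3f} m/s")
print(f"DSMC estimate (noisy):    {c.mean():.3f} m/s")
print(f"IP estimate (noise-free): {u_ip.mean():.3f} m/s")
# A ~340 m/s thermal spread swamps a 0.5 m/s signal unless millions of
# samples are averaged; the IP field recovers it from a single snapshot.
```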