Current Search: Genetic Programming
- Title
- A NEAT APPROACH TO GENETIC PROGRAMMING.
- Creator
- Rodriguez, Adelein; Wu, Annie; University of Central Florida
- Abstract / Description
- The evolution of explicitly represented topologies such as graphs involves devising methods for mutating, comparing, and combining structures in meaningful ways, and for identifying and maintaining the necessary topological diversity. Research has been conducted on the evolution of trees in genetic programming and of neural networks, and some of these problems have been addressed independently by the different research communities. In the domain of neural networks, NEAT (Neuroevolution of Augmenting Topologies) has been shown to be a successful method for evolving increasingly complex networks. This system's success is based on three interrelated elements: speciation, marking of historical information in topologies, and initializing search in a space of small structures. This provides the dynamics necessary for exploring diverse solution spaces at once and a way to discriminate between different structures. Although different representations have emerged in the area of genetic programming, the tree representation has remained of interest in great part because of its mapping to programming languages and because of the observed phenomenon of unnecessary code growth, or bloat, which hinders performance. The structural similarity between trees and neural networks poses an interesting question: is it possible to apply the techniques from NEAT to the evolution of trees, and if so, how does this affect performance and the dynamics of code growth? In this work we address these questions and present techniques for genetic programming analogous to those in NEAT.
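The three NEAT elements summarized in this abstract (historical markings, speciation via structural compatibility, and starting search from minimal structures) can be sketched for trees. The toy sketch below is purely illustrative and is not the dissertation's actual algorithm; all class and function names are invented, and the distance measure is a simple Jaccard-style stand-in for NEAT's compatibility metric.

```python
import itertools

# Global innovation counter: each newly created tree node receives a unique
# historical marking, analogous to NEAT's innovation numbers for genes.
_innovation = itertools.count()

class Node:
    def __init__(self, op, children=()):
        self.op = op
        self.children = list(children)
        self.mark = next(_innovation)  # historical marking

def markings(tree):
    """Collect all historical markings present in a tree."""
    out = {tree.mark}
    for child in tree.children:
        out |= markings(child)
    return out

def compatibility(a, b):
    """Structural distance: fraction of markings the two trees do not share.
    NEAT uses such a distance to group genomes into species."""
    ma, mb = markings(a), markings(b)
    return len(ma ^ mb) / len(ma | mb)

# Initialize search with minimal structures, as NEAT starts from small networks.
x = Node("x")
t1 = Node("+", [x, Node("1")])
t2 = Node("+", [x, Node("2")])  # shares the subtree `x` (and its marking) with t1
print(compatibility(t1, t1), compatibility(t1, t2))
```

Trees that share a common history (here, the reused subtree `x`) are measurably closer, which is what allows speciation to protect structural innovation.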
- Date Issued
- 2007
- Identifier
- CFE0001971, ucf:47451
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001971
- Title
- MULTISENSOR FUSION REMOTE SENSING TECHNOLOGY FOR ASSESSING MULTITEMPORAL RESPONSES IN ECOHYDROLOGICAL SYSTEMS.
- Creator
- Makkeasorn, Ammarin; Chang, Ni-Bin; University of Central Florida
- Abstract / Description
- Earth ecosystems and the environment have been changing rapidly because of advancing human technologies and development, and the impacts of human activities are difficult to evaluate because of these rapid changes. Remote sensing (RS) technology, a new and promising means of measuring and monitoring the earth environment and its changes, has therefore been implemented for environmental management. RS allows large-scale measurements over a large region within a very short period of time, and continuous, repeatable measurements are among its indispensable features. Soil moisture is a critical element in the hydrological cycle, especially in a semiarid or arid region. Capturing the contiguous soil moisture distribution across a vast watershed with point measurements is difficult because soil moisture patterns may vary greatly both temporally and spatially. Space-borne radar imaging satellites have become popular because they are capable of all-weather observations, yet methods for estimating soil moisture from active or passive satellite imageries remain uncertain. This study aims at presenting a systematic soil moisture estimation method for the Choke Canyon Reservoir Watershed (CCRW), a semiarid watershed with an area of over 14,200 km2 in south Texas. With the aid of five corner reflectors, the RADARSAT-1 Synthetic Aperture Radar (SAR) imageries of the study area acquired in April and September 2004 were first processed by both radiometric and geometric calibrations. New soil moisture estimation models derived by the genetic programming (GP) technique were then developed and applied to support the soil moisture distribution analysis. The GP-based nonlinear function derived in the evolutionary process uniquely links a series of crucial topographic and geographic features.
Included in this process are slope, aspect, vegetation cover, and soil permeability to complement the well-calibrated SAR data. Research indicates that this novel application of GP proved useful for generating a highly nonlinear regression structure that exhibits statistically strong correlations between the model estimates and the ground truth measurements (volumetric water content) on unseen data sets. Producing soil moisture distributions over the seasons eventually leads to characterizing local- to regional-scale soil moisture variability and to possible estimation of the water storages of the terrestrial hydrosphere. A new evolutionary computational, supervised classification scheme (Riparian Classification Algorithm, RICAL) was developed and used to identify the change of riparian zones in a semi-arid watershed temporally and spatially. The case study uniquely demonstrates an effort to incorporate both vegetation index and soil moisture estimates based on Landsat 5 TM and RADARSAT-1 imageries while trying to improve riparian classification in the Choke Canyon Reservoir Watershed (CCRW), South Texas. The CCRW was selected as the study area contributing to the reservoir; it is mostly agricultural and range land in a semi-arid coastal environment. This makes the change detection of riparian buffers significant, owing to their capability to intercept non-point source impacts within the riparian buffer zones and to maintain ecosystem integrity region-wide. The estimation of soil moisture based on RADARSAT-1 Synthetic Aperture Radar (SAR) satellite imagery, as previously developed, was used. Eight commonly used vegetation indices were calculated from the reflectance obtained from Landsat 5 TM satellite images, and each index was used individually to classify vegetation cover in association with a genetic programming algorithm.
The soil moisture and vegetation indices were integrated into the Landsat TM images based on a per-pixel channel approach for riparian classification. Two different classification algorithms were used: genetic programming, and a combination of ISODATA and maximum likelihood supervised classification. The white-box feature of genetic programming revealed the comparative advantage of all input parameters. The GP algorithm yielded more than 90% accuracy, based on unseen ground data, using a vegetation index and Landsat reflectance bands 1, 2, 3, and 4. Detecting changes in the buffer zone thus proved technically feasible with high accuracy. Overall, the development of the RICAL algorithm may lead to the formulation of more effective management strategies for non-point source pollution control, bird habitat monitoring, and grazing and livestock management in the future. Soil properties, landscapes, channels, fault lines, erosion/deposition patches, and bedload transport history reflect the geologic and geomorphologic features of a variety of watersheds. In response to these unique watershed characteristics, the hydrology of large-scale watersheds is often very complex. Precipitation, infiltration and percolation, stream flow, plant transpiration, soil moisture changes, and groundwater recharge are intimately related to one another, forming the water balance dynamics on the surface of these watersheds. This chapter depicts an optimal site-selection technology using a grey integer programming (GIP) model to assimilate remote sensing-based geo-environmental patterns in an uncertain environment with respect to technical and resource constraints. It enables us to retrieve the hydrological trends and pinpoint the most critical locations for deploying monitoring stations in a vast watershed.
Geo-environmental information amassed in this study includes soil permeability, surface temperature, soil moisture, precipitation, leaf area index (LAI), and normalized difference vegetation index (NDVI). With the aid of the remote sensing-based GIP analysis, only five locations out of more than 800 candidate sites were selected by the spatial analysis and then confirmed by a field investigation. The methodology developed in this remote sensing-based GIP analysis will significantly advance the state-of-the-art technology for the optimal arrangement and distribution of water sensor platforms for maximum sensing coverage and information-extraction capacity. Effective water resources management is a critically important priority across the globe. While water scarcity limits the uses of water in many ways, floods have also caused extensive damage and loss of life. To use the limited amount of water more efficiently, or to provide adequate lead time for flood warning, the results have led us to seek advanced techniques for improving streamflow forecasting. The objective of this section of the research is to incorporate sea surface temperature (SST), Next Generation Radar (NEXRAD), and meteorological characteristics with historical stream data to forecast actual streamflow using genetic programming. This case study concerns forecasting the stream discharge of a complex-terrain, semi-arid watershed. It elicits microclimatological factors and the resultant stream flow rate in the river system, given the influence of dynamic basin features such as soil moisture, soil temperature, ambient relative humidity, air temperature, sea surface temperature, and precipitation. The forecasting results are evaluated in terms of the percentage error (PE), the root-mean-square error (RMSE), and the square of the Pearson product moment correlation coefficient (r-squared value). The developed models can predict streamflow with very good accuracy, with an r-squared value of 0.84 and a PE of 1% for a 30-day prediction.
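The three evaluation metrics this abstract reports (PE, RMSE, and the r-squared value) can be sketched with standard formulas; the dissertation's exact definitions, for instance how PE is aggregated, may differ, so treat this as a generic illustration.

```python
import math

def percent_error(obs, pred):
    # Mean absolute percentage error over observed/predicted pairs.
    return 100.0 * sum(abs(o - p) / abs(o) for o, p in zip(obs, pred)) / len(obs)

def rmse(obs, pred):
    # Root-mean-square error.
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def r_squared(obs, pred):
    # Square of the Pearson product-moment correlation coefficient.
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    vo = sum((o - mo) ** 2 for o in obs)
    vp = sum((p - mp) ** 2 for p in pred)
    return cov * cov / (vo * vp)

# Synthetic streamflow values, purely for demonstration.
obs = [10.0, 12.0, 15.0, 11.0]
pred = [10.5, 11.5, 14.0, 11.5]
print(percent_error(obs, pred), rmse(obs, pred), r_squared(obs, pred))
```

An r-squared near 1 with a low PE, as reported for the 30-day forecasts, indicates predictions that track both the level and the variation of the observed discharge.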
- Date Issued
- 2007
- Identifier
- CFE0001767, ucf:47267
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001767
- Title
- FALCONET: FORCE-FEEDBACK APPROACH FOR LEARNING FROM COACHING AND OBSERVATION USING NATURAL AND EXPERIENTIAL TRAINING.
- Creator
- Stein, Gary; Gonzalez, Avelino; University of Central Florida
- Abstract / Description
- Building an intelligent agent model from scratch is a difficult task; it would therefore be preferable to have an automated process perform it. Many manual and automatic techniques exist; however, each has various issues with obtaining, organizing, or making use of the data. Additionally, it can be difficult to get perfect data or, once the data is obtained, impractical to get a human subject to explain why some action was performed. Because of these problems, machine learning from observation emerged to produce agent models based on observational data. Learning from observation uses unobtrusive and purely observable information to construct an agent that behaves similarly to the observed human. Typically, an observational system builds an agent based only on prerecorded observations. This type of system works well with respect to agent creation but lacks the ability to be trained and updated on-line. To overcome these deficiencies, the proposed system adds an augmented force-feedback stage of training that senses the agent's intentions haptically. Furthermore, because not all possible situations can be observed or directly trained, a third stage of learning from practice is added so the agent can gain additional knowledge for a particular mission. These stages of learning mimic the natural way a human might learn a task: first watching the task being performed, then being coached to improve, and finally practicing to self-improve. The hypothesis is that a system that is initially trained using human-recorded data (Observational), then tuned and adjusted using force-feedback (Instructional), and then allowed to perform the task in different situations (Experiential) will be better than any individual step or combination of steps.
- Date Issued
- 2009
- Identifier
- CFE0002746, ucf:48157
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002746
- Title
- EVOLVING MODELS FROM OBSERVED HUMAN PERFORMANCE.
- Creator
- Fernlund, Hans Karl Gustav; Gonzalez, Avelino J.; University of Central Florida
- Abstract / Description
- To create a realistic environment, many simulations require simulated agents with human behavior patterns. Manually creating such agents with realistic behavior is often a tedious and time-consuming task. This dissertation describes a new approach that automatically builds human behavior models for simulated agents by observing human performance. The research synergistically combines Context-Based Reasoning, a paradigm especially developed to model tactical human performance within simulated agents, with Genetic Programming, a machine learning algorithm used to construct the behavior knowledge in accordance with the paradigm. This synergistic combination of well-documented AI methodologies has resulted in a new algorithm that effectively and automatically builds simulated agents with human behavior. The algorithm was tested extensively with five different simulated agents created by observing the performance of five humans driving an automobile simulator. The agents show not only the capability to automatically learn and generalize the behavior of the human observed, but also capture some of the personal behavior patterns observed among the five humans. Furthermore, the agents exhibited a performance that was at least as good as that of agents developed manually by a knowledgeable engineer.
- Date Issued
- 2004
- Identifier
- CFE0000013, ucf:46068
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000013
- Title
- REMOTE SENSING WITH COMPUTATIONAL INTELLIGENCE MODELLING FOR MONITORING THE ECOSYSTEM STATE AND HYDRAULIC PATTERN IN A CONSTRUCTED WETLAND.
- Creator
- Mohiuddin, Golam; Chang, Ni-bin; Lee, Woo Hyoung; Wanielista, Martin; University of Central Florida
- Abstract / Description
- Monitoring a heterogeneous aquatic environment such as the Stormwater Treatment Areas (STAs) located to the northeast of the Everglades is extremely important in understanding the land processes of the constructed wetland and its capacity to remove nutrients. Direct monitoring and measurement of ecosystem evolution and changing velocities at every single part of the STA are not always feasible. An integrated remote sensing, monitoring, and modeling technique can be a state-of-the-art tool to estimate the spatial and temporal distributions of flow velocity regimes and ecological functioning in such dynamic aquatic environments. In this presentation, a comparison among four computational intelligence models, including Extreme Learning Machine (ELM), Genetic Programming (GP), and Artificial Neural Network (ANN) models, was organized to holistically assess the flow velocity and direction as well as ecosystem states within a vegetative wetland area. First, the local sensor network was established using Acoustic Doppler Velocimeters (ADV). Utilizing the local sensor data along with external driving-force parameters, trained ELM, GP, and ANN models were developed, calibrated, validated, and compared to select the best computational capacity for velocity prediction over time. In addition, seasonal images collected by the French satellite Pleiades have been analyzed to address the seasonality effect of plant species evolution and biomass changes in the constructed wetland. The key finding of this research is a characterization of the interactions between geophysical and geochemical processes in this wetland system, based on ground-based monitoring sensors and satellite images, to gain insight into hydraulic residence time, plant species variation, and water quality, and to improve the overall understanding of possible nutrient removal in this constructed wetland.
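Of the models this abstract compares, the Extreme Learning Machine is the simplest to sketch: a random, fixed hidden layer whose output weights are solved in closed form by least squares. The sketch below is a generic ELM on synthetic data, not the study's calibrated velocity model; all parameter choices are illustrative.

```python
import numpy as np

def elm_train(X, y, hidden=64, scale=5.0, seed=0):
    """Extreme Learning Machine sketch: the hidden layer is random and fixed;
    only the output weights `beta` are trained, via least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=scale, size=(X.shape[1], hidden))  # random input weights
    b = rng.normal(scale=scale, size=hidden)                # random biases
    H = np.tanh(X @ W + b)                                  # hidden activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)            # closed-form fit
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy 1-D regression standing in for velocity prediction (synthetic data).
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X[:, 0])
model = elm_train(X, y)
err = np.max(np.abs(elm_predict(model, X) - y))
print(err)
```

Because training reduces to one linear solve, ELMs are extremely fast to calibrate, which is one reason they are attractive for repeated comparison against GP and ANN models.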
- Date Issued
- 2014
- Identifier
- CFE0005533, ucf:52864
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005533
- Title
- ANALYSES OF CRASH OCCURRENCE AND INJURY SEVERITIES ON MULTI LANE HIGHWAYS USING MACHINE LEARNING ALGORITHMS.
- Creator
- Das, Abhishek; Abdel-Aty, Mohamed A.; University of Central Florida
- Abstract / Description
- Reduction of crash occurrence at various roadway locations (mid-block segments; signalized intersections; un-signalized intersections) and mitigation of injury severity in the event of a crash are the major concerns of transportation safety engineers. Multi-lane arterial roadways (excluding freeways and expressways) account for forty-three percent of fatal crashes in the state of Florida. Significant contributing causes fall under the broad categories of aggressive driver behavior; adverse weather and environmental conditions; and roadway geometric and traffic factors. The objective of this research was the implementation of innovative, state-of-the-art analytical methods to identify the contributing factors for crashes and injury severity. Advances in computational methods enable the use of modern statistical and machine learning algorithms, and even though most of the contributing factors are known a priori, advanced methods unearth changing trends. Heuristic evolutionary processes such as genetic programming, sophisticated data mining methods like conditional inference trees, and mathematical treatments in the form of sensitivity analyses outline the major contributions of this research. Application of traditional statistical methods like simultaneous ordered probit models, and the identification and resolution of crash data problems, are also key aspects of this study. In order to eliminate the use of an unrealistic uniform intersection influence radius of 250 ft, heuristic rules were developed for assigning crashes to roadway segments, signalized intersections, and access points using parameters such as 'site location', 'traffic control', and node information. Use of Conditional Inference Forest instead of Classification and Regression Tree to identify variables of significance for the injury severity analysis removed the bias towards selecting continuous variables or variables with a large number of categories.
For the injury severity analysis of crashes on highways, the corridors were clustered into four optimum groups; the optimum number of clusters was found using the Partitioning Around Medoids algorithm. Concepts from evolutionary biology, such as crossover and mutation, were implemented to develop models for classification and regression analyses based on the highest hit rate and minimum error rate, respectively. A low crossover rate and a higher mutation rate reduce the chances of genetic drift and bring novelty into the model development process. Annual daily traffic, pavement friction coefficient, on-street parking, curbed medians, surface and shoulder widths, and alcohol/drug usage are some of the significant factors that played a role in both crash occurrence and injury severities. Relative sensitivity analyses were used to identify the effect of continuous variables on the variation of crash counts. This study improved the understanding of the significant factors that could play an important role in designing better safety countermeasures on multi-lane highways, and hence enhance their safety by reducing the frequency of crashes and the severity of injuries. Educating young people about the dangers of alcohol and drug abuse, specifically at high schools and colleges, could potentially lead to lower driver aggression. Removal of on-street parking from high-speed arterials could result in a likely drop in the number of crashes. Widening shoulders could give drivers greater maneuvering space, and improving pavement conditions for a better friction coefficient would lead to improved crash recovery. Adding lanes to alleviate problems arising from increased ADT, and restricting trucks to the slower right lanes on highways, would not only reduce crash occurrences but also result in lower injury severity levels.
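The crossover and mutation operators this abstract borrows from evolutionary biology can be illustrated with a toy evolutionary loop. The sketch below maximizes a simple bit-counting fitness as a stand-in for the study's hit rate; the low crossover rate and higher mutation rate echo the abstract's stated preference, but every number here is illustrative, not taken from the dissertation.

```python
import random

random.seed(42)

def evolve(fitness, n_bits=20, pop_size=30, gens=60,
           p_crossover=0.3, p_mutation=0.05):
    """Toy evolutionary loop with single-point crossover and bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                     # elitism: keep the two best
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:10], 2)  # select parents among the best ten
            child = a[:]
            if random.random() < p_crossover:  # low crossover rate
                cut = random.randrange(1, n_bits)
                child = a[:cut] + b[cut:]
            # higher per-bit mutation rate injects novelty into the population
            child = [bit ^ (random.random() < p_mutation) for bit in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# "Hit rate" stand-in: fraction of correctly set bits (OneMax).
best = evolve(fitness=sum)
print(sum(best))
```

Keeping crossover rare while mutating often preserves population diversity, which is the mechanism the abstract credits for avoiding genetic drift during model development.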
- Date Issued
- 2009
- Identifier
- CFE0002928, ucf:48007
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002928
- Title
- A COMPARATIVE ANALYSIS BETWEEN CONTEXT-BASED REASONING (CXBR) AND CONTEXTUAL GRAPHS (CXGS).
- Creator
- Lorins, Peterson; Gonzalez, Avelino; University of Central Florida
- Abstract / Description
- Context-Based Reasoning (CxBR) and Contextual Graphs (CxGs) involve the modeling of human behavior in autonomous and decision-support situations in which optimal human decision-making is of utmost importance. Both formalisms use the notion of contexts to allow the implementation of intelligent agents equipped with a context-sensitive knowledge base. However, CxBR uses a set of discrete contexts, implying that models created using CxBR operate within one context during a given time interval, whereas CxGs use a continuous context-based representation of a given problem-solving scenario for decision-support processes. Both formalisms use contexts dynamically, continuously changing between contexts as appropriate. This thesis identifies a synergy between these two formalisms by looking into their similarities and differences. It became clear during the research that each paradigm was designed with a very specific family of problems in mind: CxBR best implements models of autonomous agents in an environment, while CxGs are best applied in a decision-support setting that requires the development of decision-making procedures. Cross-applications were implemented in each formalism, and the results are discussed.
- Date Issued
- 2005
- Identifier
- CFE0000577, ucf:46433
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000577
- Title
- CONTEXTUALIZING OBSERVATIONAL DATA FOR MODELING HUMAN PERFORMANCE.
- Creator
- Trinh, Viet; Gonzalez, Avelino; University of Central Florida
- Abstract / Description
- This research focuses on the ability to contextualize observed human behaviors in an effort to automate the process of tactical human performance modeling through learning from observation. This effort to contextualize human behavior is aimed at minimizing the role and involvement of the knowledge engineers required to build intelligent Context-Based Reasoning (CxBR) agents. More specifically, the goal is to automatically discover the context in which a human actor is situated when performing a mission, in order to facilitate the learning of such CxBR models. This research addresses the contextualization problem left open by Fernlund's work on using the Genetic Context Learner (GenCL) to model CxBR agents from observed human performance [Fernlund, 2004]. To accomplish context discovery, this research proposes two contextualization algorithms: Contextualized Fuzzy ART (CFA) and Context Partitioning and Clustering (COPAC). The former is a more naive approach utilizing the well-known Fuzzy ART strategy, while the latter is a robust algorithm developed on the principles of CxBR. Using Fernlund's original five drivers, the CFA and COPAC algorithms were tested and evaluated on their ability to effectively contextualize each driver's individualized set of behaviors into well-formed and meaningful context bases, as well as to generate high-fidelity agents through integration with Fernlund's GenCL algorithm. The resultant set of agents was able to capture and generalize each driver's individualized behaviors.
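The Fuzzy ART strategy underlying the CFA algorithm can be sketched in a few lines: complement-coded inputs, a choice function to rank categories, a vigilance test, and fast learning. This follows the standard Fuzzy ART formulation rather than the dissertation's actual CFA code; categories here loosely stand in for discovered "contexts", and the sample data is invented.

```python
import numpy as np

def fuzzy_art(inputs, rho=0.75, alpha=0.001, beta=1.0):
    """Minimal Fuzzy ART sketch (complement coding, fast learning).
    rho: vigilance, alpha: choice parameter, beta: learning rate."""
    cats = []    # one weight vector per category
    labels = []
    for x in inputs:
        I = np.concatenate([x, 1.0 - x])  # complement coding
        # Rank existing categories by the choice function T_j.
        order = sorted(range(len(cats)),
                       key=lambda j: -np.minimum(I, cats[j]).sum()
                                     / (alpha + cats[j].sum()))
        for j in order:
            match = np.minimum(I, cats[j]).sum() / I.sum()
            if match >= rho:  # vigilance test: the category resonates
                cats[j] = beta * np.minimum(I, cats[j]) + (1 - beta) * cats[j]
                labels.append(j)
                break
        else:                 # no resonance: commit a new category
            cats.append(I.copy())
            labels.append(len(cats) - 1)
    return labels, cats

# Two well-separated behavior clusters should map to two categories.
data = np.array([[0.10, 0.10], [0.12, 0.09], [0.90, 0.90], [0.88, 0.92]])
labels, cats = fuzzy_art(data)
print(labels)  # → [0, 0, 1, 1]
```

The vigilance parameter controls granularity: raising `rho` splits behavior into more, tighter categories, which is the kind of tuning a contextualization algorithm built on Fuzzy ART has to manage.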
- Date Issued
- 2009
- Identifier
- CFE0002563, ucf:48253
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002563
- Title
- Integrated Data Fusion and Mining (IDFM) Technique for Monitoring Water Quality in Large and Small Lakes.
- Creator
- Vannah, Benjamin; Chang, Ni-bin; Wanielista, Martin; Wang, Dingbao; University of Central Florida
- Abstract / Description
- Monitoring water quality on a near-real-time basis to address water resources management and public health concerns in coupled natural systems and the built environment is by no means an easy task. Furthermore, this emerging societal challenge will continue to grow due to the ever-increasing anthropogenic impacts upon surface waters. For example, urban growth and agricultural operations have led to an influx of nutrients into surface waters, stimulating harmful algal bloom formation, and stormwater runoff from urban areas contributes to the accumulation of total organic carbon (TOC) in surface waters. TOC in surface waters is a known precursor of disinfection byproducts in drinking water treatment, and microcystin is a potent hepatotoxin produced by the cyanobacterium Microcystis, which can form expansive algal blooms in eutrophied lakes. Because of the ecological impacts and human health hazards posed by TOC and microcystin, it is imperative that municipal decision makers and water treatment plant operators be equipped with a rapid and economical means to track and measure these substances. Remote sensing is an emergent solution for monitoring and measuring changes to the earth's environment; this technology allows large regions anywhere on the globe to be observed on a frequent basis. This study demonstrates the prototype of a near-real-time early warning system using Integrated Data Fusion and Mining (IDFM) techniques with the aid of both multispectral (Landsat and MODIS) and hyperspectral (MERIS) satellite sensors to determine spatiotemporal distributions of TOC and microcystin. Landsat satellite imageries have high spatial resolution, but their application suffers from a long overpass interval of 16 days. On the other hand, free coarse-resolution sensors with daily revisit times, such as MODIS, are incapable of providing detailed water quality information because of their low spatial resolution.
This issue can be resolved by using data or sensor fusion techniques, an instrumental part of IDFM, in which the high spatial resolution of Landsat and the high temporal resolution of MODIS imageries are fused and analyzed by a suite of regression models to optimally produce synthetic images with both high spatial and high temporal resolutions. The same techniques are applied to the hyperspectral sensor MERIS, with the aid of the MODIS ocean color bands, to generate fused images with enhanced spatial, temporal, and spectral properties. The performance of the data mining models derived using fused hyperspectral and fused multispectral data was quantified using four statistical indices. The second task compared traditional two-band models against more powerful data mining models for TOC and microcystin prediction. The use of IDFM is illustrated for monitoring microcystin concentrations in Lake Erie (a large lake) and applied for TOC monitoring in Harsha Lake (a small lake). Analysis confirmed that the data mining methods surpassed two-band models at accurately estimating TOC and microcystin concentrations in lakes, and that the more detailed spectral reflectance data offered by hyperspectral sensors produced a noticeable increase in accuracy in the retrieval of water quality parameters.
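The comparison this abstract describes, a traditional two-band model versus a richer data mining model, can be sketched on synthetic data. The reflectance bands, the nonlinear TOC relationship, and the feature choices below are entirely invented for illustration; the point is only the methodological contrast, since a model with a larger feature set can never fit worse than a nested two-band baseline on the training data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic reflectance bands and a made-up nonlinear TOC relationship.
n = 300
band1 = rng.uniform(0.02, 0.3, n)
band2 = rng.uniform(0.02, 0.3, n)
toc = 5.0 * (band2 / band1) ** 1.5 + rng.normal(0.0, 0.1, n)

# Traditional two-band model: linear fit on the band ratio.
ratio = band2 / band1
A = np.column_stack([ratio, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, toc, rcond=None)
pred_two_band = A @ coef

# "Data mining" stand-in: a richer nonlinear feature set, fit the same way.
feats = np.column_stack([ratio, ratio ** 1.5, band1, band2, np.ones(n)])
coef2, *_ = np.linalg.lstsq(feats, toc, rcond=None)
pred_mining = feats @ coef2

def rmse(y, p):
    return float(np.sqrt(np.mean((y - p) ** 2)))

print(rmse(toc, pred_two_band), rmse(toc, pred_mining))
```

On real imagery the gap would be established on unseen validation data, as the study does; here the richer model wins because its features capture the nonlinearity the two-band ratio model misses.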
- Date Issued
- 2013
- Identifier
- CFE0005066, ucf:49979
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005066