Title
-
DEVELOPMENT OF A SIMPLIFIED FINITE ELEMENT APPROACH FOR FRP BRIDGE DECKS.
-
Creator
-
Vyas, Jignesh, Zhao, Lei, University of Central Florida
-
Abstract / Description
-
Moveable bridges in Florida typically use open steel grid decks due to weight limitations. However, these decks present rideability, environmental, and maintenance problems, for they are typically less skid resistant than a solid riding surface, create loud noises, and allow debris to fall through the grids. Replacing the open steel grid decks commonly used in moveable bridges with a low-profile FRP deck can improve rider safety and reduce maintenance costs, while satisfying the strict weight requirement for such bridges. To evaluate the performance of the new deck system, fatigue and failure tests were performed on full-size panels in a two-span configuration. The deck successfully passed the preliminary strength and fatigue tests per AASHTO requirements. It also demonstrated that it can be quickly installed and that its top plate bonds well with the wear surface. The thesis also describes the analytical investigation of a simplified finite element approach to simulate the load-deformation behavior of the deck system for both configurations. The finite element model may be used as a future design tool for similar deck systems. Loadings consistent with the actual experimental loadings were applied to the decks, and the stresses, strains, and displacements were monitored and studied. The results from the finite element model showed good correlation with the deflection and strain values measured during the experiments. A significant portion of the deck deflection under the prescribed loads is induced by vertical shear. This thesis presents the results from the experiments, descriptions of the finite element model, and a comparison of the experimental results with the results from the analysis of the model.
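The role of vertical shear noted in the abstract can be illustrated with a first-order hand calculation. The sketch below is not the thesis's finite element model; it uses the classical bending and shear terms for a simply supported beam under a midspan point load, with entirely hypothetical section properties chosen to mimic a short, shear-flexible FRP deck.

```python
# Illustrative only: bending vs. shear contributions to midspan deflection
# for a simply supported beam under a midspan point load. All numbers are
# hypothetical and are NOT taken from the thesis.

def midspan_deflections(P, L, E, I, k, G, A):
    """Return (bending, shear) midspan deflections.

    bending: P*L^3 / (48*E*I)   (Euler-Bernoulli term)
    shear:   P*L / (4*k*G*A)    (first-order shear term)
    """
    d_bend = P * L**3 / (48.0 * E * I)
    d_shear = P * L / (4.0 * k * G * A)
    return d_bend, d_shear

# Hypothetical short FRP section: a low span-to-depth ratio and a low
# shear modulus make the shear term a large share of the total.
d_b, d_s = midspan_deflections(P=40e3, L=0.8, E=20e9, I=2e-5,
                               k=0.85, G=1e9, A=1.2e-2)
shear_share = d_s / (d_b + d_s)   # roughly 0.42 for these inputs
```

For stocky, low-modulus sections like this, ignoring the shear term would understate deflection substantially, which is consistent with the abstract's observation.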
-
Date Issued
-
2006
-
Identifier
-
CFE0001510, ucf:47151
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001510
-
-
Title
-
EVALUATING THE IMPACT OF OOCEA'S DYNAMIC MESSAGE SIGNS (DMS) ON TRAVELERS' EXPERIENCE USING THE PRE-DEPLOYMENT SURVEY.
-
Creator
-
Rogers, John, Al-Deek, Haitham, University of Central Florida
-
Abstract / Description
-
The purpose of this thesis was to evaluate the impact of dynamic message signs (DMS) on the Orlando-Orange County Expressway Authority (OOCEA) toll road network using the Pre-Deployment DMS Survey (henceforth referred to as the "pre-deployment survey"). DMS are electronic traffic signs used on roadways to give travelers information about travel times, traffic congestion, accidents, disabled vehicles, AMBER alerts, and special events. The particular DMS referred to in this study are large rectangular signs installed over the travel lanes, not portable trailer-mounted signs. The OOCEA is currently in the process of adding several fixed DMS to its toll road network; between January 2007 and February 2008, approximately 30 DMS are planned. It is important to note that there was one DMS on the OOCEA network before this study started. Since most travelers on OOCEA toll roads are from Orange, Osceola, and Seminole counties, this study is limited to these counties. This thesis documents the results of the pre-deployment analysis. The instrument used to analyze travelers' perception of DMS was a survey administered through computer-aided telephone interviews, conducted in early November 2006. Respondents were asked questions pertaining to their acknowledgement of DMS on the OOCEA toll roads, satisfaction with travel information provided on the network, formatting of the messages, satisfaction with different types of messages, diversion behavior (revealed and stated preferences), and classification/socioeconomic characteristics (such as age, education, most used toll road, and county of residence). The results of the pre-deployment analysis showed that 54.4% of OOCEA travelers recalled seeing DMS on the network. Respondents commonly agreed that the DMS are helpful for providing information about hazardous conditions and that the DMS are easy to read.
The majority of travelers preferred DMS formatted as a steady message for normal traffic conditions, with commonly recognized abbreviations such as I-Drive for International Drive. The results from the binary logit model for satisfaction with travel information provided on the OOCEA toll road network identify the significant variables that explain the likelihood of a traveler being satisfied. The coefficients show that infrequent travelers are more likely to be satisfied with traveler information on OOCEA toll roads. In addition, the provision of hazard warnings, special event information, and accuracy of information on DMS are associated with higher levels of satisfaction with traveler information. The binary logit model for revealed preference (RP) diversion behavior showed that Seminole County travelers were likely to stay on the toll road, and SR 408 travelers were likely to divert off the toll road. Travelers who acknowledged DMS on the OOCEA network were also likely to divert off the toll road, but those who learned of congestion through DMS were likely to stay on the toll road. Learning of congestion through DMS could encourage travelers to stay, since once they are on the toll road, diversion can be difficult with no access to exits or little knowledge of alternate routes. It is also possible that travelers stayed because they perceived the toll roads to be faster, especially when messages on DMS showed travel times that confirmed this belief. Travelers who were not satisfied with travel information on the network were more likely to divert off the toll road. The implications of these results for implementation are discussed in this thesis. DMS should be formatted as a steady message for normal traffic conditions, and commonly recognized abbreviations, such as I-Drive for International Drive, should be used for roadway identification when possible.
DMS messages should pertain to information on roadway hazards when necessary, because travelers find it important to be informed about events related to their personal safety. Accuracy of the information provided on DMS was important for traveler information satisfaction: if travelers observe inaccurate travel times on DMS, they may not trust the validity of future messages. DMS information that led travelers to cancel their intended stops led to a higher likelihood of dissatisfaction with traveler information. It is important to meet travelers' preferences and concerns for DMS.
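The binary logit form used in the abstract can be sketched in a few lines. The coefficients and predictor names below are hypothetical placeholders, not the estimates reported in the thesis; the point is only how a positive coefficient translates into higher satisfaction odds.

```python
import math

# Sketch of a binary logit model: P(satisfied) as a function of traveler
# attributes. Coefficients are invented for illustration.

def logit_prob(x, coefs, intercept):
    """P(satisfied) = 1 / (1 + exp(-(b0 + sum(bi * xi))))."""
    z = intercept + sum(b * v for b, v in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors: [infrequent_traveler, values_hazard_warnings]
coefs = [0.8, 0.5]          # positive signs raise the odds of satisfaction
intercept = -0.2

p_frequent = logit_prob([0, 1], coefs, intercept)
p_infrequent = logit_prob([1, 1], coefs, intercept)

# A coefficient of 0.8 multiplies the odds of satisfaction by exp(0.8).
odds_ratio = math.exp(coefs[0])
```

This is why logit coefficients are usually interpreted through odds ratios: the effect of a predictor is multiplicative on the odds, not additive on the probability.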
-
Date Issued
-
2007
-
Identifier
-
CFE0001852, ucf:47374
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001852
-
-
Title
-
AN ALGORITHM FOR DETERMINING SATELLITE ATTITUDE BY COMPARING PHYSICAL FEATURE MODELS TO EDGES DETECTED IN SATELLITE OR GROUND-BASED TELESCOPE IMAGERY.
-
Creator
-
Reinhart, Eric, Johnson, Roger, University of Central Florida
-
Abstract / Description
-
This thesis discusses the development and performance of an algorithm created to calculate satellite attitude based on the comparison of satellite "physical feature" models to information derived from edge detection performed on imagery of the satellite. The quality of this imagery could range from the very clear, close-up imagery that may come from an unmanned satellite servicing mission to the faint, unclear imagery that may come from a ground-based telescope investigating a satellite anomaly. Satellite "physical feature" models describe where an edge is likely to appear in an image. These are usually defined by physical edges on the structure of the satellite or areas where there are distinct changes in material property. The theory behind this concept is discussed as well as two different approaches to implement it. Various simple examples are used to demonstrate the feasibility of the concept. These examples are well-controlled image simulations of simple physical models with known attitude. The algorithm attempts to perform the edge detection and edge registration of the simulated image and calculate the most likely attitude. Though complete autonomy was not achieved during this effort, the concept and approach show applicability.
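The registration idea can be reduced to a toy example: project a feature model at candidate attitudes and score each candidate against the edge points detected in the image. The sketch below collapses the thesis's 3D attitude problem to a single in-plane rotation and uses an invented five-point model with a simple nearest-point (Chamfer-style) score; it is an illustration of the concept, not the thesis algorithm.

```python
import math

# Toy attitude estimation by edge registration: rotate a "physical
# feature" model (a set of edge points) and score each candidate angle
# against the detected points. Model and angles are invented.

MODEL = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0), (0.5, 0.5)]

def rotate(points, deg):
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def chamfer(a, b):
    """Sum over points in a of the distance to the nearest point in b."""
    return sum(min(math.dist(p, q) for q in b) for p in a)

# "Detected" edge points: the model seen at an unknown 30-degree attitude.
observed = rotate(MODEL, 30.0)

# Grid-search candidate attitudes; the lowest score identifies the attitude.
candidates = range(0, 91, 10)
best = min(candidates, key=lambda a: chamfer(rotate(MODEL, a), observed))
```

With noisy real imagery the score surface is not this clean, which is one reason the thesis reports that full autonomy was not achieved.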
-
Date Issued
-
2007
-
Identifier
-
CFE0001942, ucf:47450
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001942
-
-
Title
-
Performance Prediction Model for Adaptive Traffic Control Systems (ATCS) using field data.
-
Creator
-
Mirza, Masood, Radwan, Essam, Abou-Senna, Hatem, Abdel-Aty, Mohamed, Zheng, Qipeng, University of Central Florida
-
Abstract / Description
-
Reductions in capital expenditure revenues have created greater demand from users for quality service from existing facilities at lower cost, forcing agencies to evaluate the performance of projects in more comprehensive and "greener" ways. The use of Adaptive Traffic Control Systems (ATCS) is a step in the right direction, enabling practitioners and engineers to develop and implement traffic optimization strategies that achieve greater capacity from existing systems by optimizing traffic signals based on real-time traffic demands and flow patterns. However, the industry is lagging in developing modeling tools for ATCS that can predict the changes in MOEs due to changes in traffic flow (i.e., volume and/or travel direction), making it difficult for practitioners to measure the magnitude of the impacts and to develop an appropriate mitigation strategy. The impetus of this research was to explore the potential of utilizing available data from ATCS to develop prediction models for the critical MOEs and for the entire intersection. First, extensive data collection efforts were initiated to collect data from intersections in Marion County, Florida. The data collected included volume, geometry, signal operations, and performance for an extended period. Second, the field data were scrubbed using macros to develop a clean data set for model development. Third, prediction models for the MOEs (wait time and queue) for the critical movements were developed using generalized linear regression modeling techniques, based on a Poisson distribution with a log link function. Finally, the models were validated using data collected from intersections within Orange County, Florida. As part of this research, an Intersection Performance Index (IPI) model, a LOS prediction model for the entire intersection, was also developed.
This model was based on the MOEs (wait time and queue) for the critical movements. In addition, IPI thresholds and corresponding intersection capacity designations were developed to establish level of service at the intersection. The IPI values and thresholds were developed on the same principles as Intersection Capacity Utilization (ICU) procedures, then tested and validated against corresponding ICU values and ICU LOS.
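The log-linear (Poisson) prediction form mentioned above has a simple shape: the expected MOE is the exponential of a linear predictor. The coefficients below are hypothetical, not the fitted values from the study; the sketch shows only how the log link makes predictor effects multiplicative.

```python
import math

# Sketch of a Poisson/log-link prediction for an MOE such as wait time
# or queue length: E[MOE] = exp(b0 + b1 * volume). Coefficients invented.

def predict_moe(volume_vph, intercept=2.0, b_volume=0.0015):
    """Expected MOE under a log link: exp(b0 + b1 * volume)."""
    return math.exp(intercept + b_volume * volume_vph)

low = predict_moe(400)
high = predict_moe(800)

# Under a log link, adding 400 veh/h multiplies the expected MOE by
# exp(b1 * 400), regardless of the starting volume.
per_400_factor = math.exp(0.0015 * 400)
```

This multiplicative structure is what makes the log-link GLM a natural fit for nonnegative count-like MOEs.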
-
Date Issued
-
2018
-
Identifier
-
CFE0007055, ucf:51975
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007055
-
-
Title
-
THE DEVELOPMENT OF A HUMAN-CENTRIC FUZZY MATHEMATICAL MEASURE OF HUMAN ENGAGEMENT IN INTERACTIVE MULTIMEDIA SYSTEMS AND APPLICATIONS.
-
Creator
-
Butler, Chandre, McCauley-Bush, Pamela, University of Central Florida
-
Abstract / Description
-
The utilization of fuzzy mathematical modeling for the quantification of Human Engagement is an innovative approach within Interactive Multimedia applications (mainly video-based games designed to entertain or train participants on intended topics of interest) that can produce measurable and repeatable results. These results can then be used to generate a cogent definition of Human Engagement. This research is designed to apply proven quantification techniques and Industrial/Systems Engineering methodologies to nontraditional environments such as Interactive Multimedia. The outcomes of this research provide the foundation, initial steps, and preliminary validation for the development of a systematic fuzzy theoretical model for the quantification of Human Engagement. Why is there a need for Interactive Multimedia applications in commercial and educational environments, including K-20 educational systems and industry? In the latter case, the debate over education reform has drawn from referenced areas within the Industrial Engineering community, including quality, continuous improvement, benchmarking and metrics development, data analysis, and scientific/systemic justification requirements. In spite of these applications, the literature does not reflect a consistent and broad application of these techniques to the evaluation and quantification of Human Engagement in Interactive Multimedia. It is strongly believed that until an administratively based definition of Human Engagement is created and accepted, the benefits of Interactive Multimedia may not be fully realized. The influence of gaming on society is quite apparent: the increased governmental appropriations for Simulations & Modeling development, as well as the estimated multi-billion dollar consumer PC/console game market, are evidence of the Interactive Multimedia opportunity.
This body of work identifies factors that address the actual and perceived levels of Human Engagement in Interactive Multimedia systems and Virtual Environments, and the degrees to which those factors must exist to quantify and measure Human Engagement. Finally, the research quantifies the inputs and produces a model that provides a numeric value defining the level of Human Engagement as evaluated within the interactive multimedia application area. This definition of Human Engagement can then be used as the basis of study within other application areas of interest.
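One common way such a fuzzy measure is assembled, sketched below under assumptions of our own: triangular membership functions map raw factor readings onto [0, 1], and a weighted aggregation yields a single engagement score. The factors, breakpoints, and weights here are invented illustrations, not the model developed in the dissertation.

```python
# Minimal fuzzy-measure sketch: memberships plus weighted aggregation.
# Factor names, breakpoints, and weights are hypothetical.

def tri(x, a, b, c):
    """Triangular membership: 0 at a and c, peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def engagement(memberships, weights):
    """Weighted mean of factor memberships, yielding a score in [0, 1]."""
    return sum(w * m for w, m in zip(weights, memberships)) / sum(weights)

# Hypothetical factors: time on task (minutes) and interaction rate.
m_time = tri(35.0, a=0.0, b=40.0, c=80.0)    # 35 min on task -> 0.875
m_rate = tri(12.0, a=0.0, b=10.0, c=30.0)    # 12 actions/min -> 0.9
score = engagement([m_time, m_rate], weights=[0.6, 0.4])
```

The appeal of the fuzzy formulation is exactly this: imprecise, human-centric notions like "highly engaged" become graded memberships that can be combined and compared repeatably.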
-
Date Issued
-
2010
-
Identifier
-
CFE0003380, ucf:48459
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003380
-
-
Title
-
AN AUTOMATED METHODOLOGY FOR A COMPREHENSIVE DEFINITION OF THE SUPPLY CHAIN USING GENERIC ONTOLOGICAL COMPONENTS.
-
Creator
-
Fayez, Mohamed, Mollaghasemi, Mansooreh, University of Central Florida
-
Abstract / Description
-
Today, worldwide business communities are in the era of Supply Chains. A Supply Chain is a collection of several independent enterprises that partner together to achieve specific goals. These enterprises may plan, source, produce, deliver, or transport materials to satisfy an immediate or projected market demand, and may provide after-sales support, warranty services, and returns. Each enterprise in the Supply Chain has roles and elements. The roles include supplier, customer, or carrier, and the elements include functional units, processes, information, information resources, materials, objects, decisions, practices, and performance measures. Each enterprise, individually, manages these elements in addition to their flows, their interdependencies, and their complex interactions. Since a Supply Chain brings several enterprises together to complement each other in achieving a unified goal, the elements in each enterprise have to complement each other and be managed together as one unit to achieve that goal efficiently. Moreover, since there is a large number of elements to be defined and managed in a single enterprise, the number of elements to be defined and managed when considering the whole Supply Chain is massive. The supply chain community uses the Supply Chain Operations Reference model (SCOR model) to define supply chains. However, the SCOR methodology is limited: it defines the supply chain only in terms of processes, performance metrics, and best practices. In fact, the supply chain community, SCOR users in particular, exerts massive effort to render an adequate supply chain definition that includes the other elements besides those covered in the SCOR model.
Also, the SCOR model is delivered to the user as a document, which puts a tremendous burden on the user and makes it difficult to share the definition within the enterprise or across the supply chain. This research is directed towards overcoming the limitations and shortcomings of the current supply chain definition methodology. It proposes a methodology and a tool that enable an automated and comprehensive definition of the Supply Chain at any level of detail. The proposed methodology captures all the constituent parts of the Supply Chain at four different levels: the supply chain level, the enterprise level, the elements level, and the interaction level. At the supply chain level, the various enterprises that constitute the supply chain are defined. At the enterprise level, the enterprise elements are identified. At the elements level, each element in the enterprise is explicitly defined. At the interaction level, the flows, interdependencies, and interactions that exist between and within the other three levels are identified and defined. The methodology utilizes several modeling techniques to generate generic explicit views and models that represent the four levels. The developed views and models were transformed into a series of questions and answers, where the questions correspond to what a view provides and the answers are the knowledge captured and generated from the view. The questions and answers were integrated to render a generic multi-view of the supply chain. The methodology and the multi-view were implemented in an ontology-based tool. The ontology includes sets of generic supply chain ontological components that represent the supply chain elements, and a set of automated procedures that can be utilized to define a specific supply chain. A specific supply chain can be defined by reusing the generic components and customizing them to the supply chain's specifics.
The ontology-based tool was developed to function in the supply chain's dynamic, information-intensive, geographically dispersed, and heterogeneous environment. To that end, the tool was developed to be generic, sharable, automated, customizable, extensible, and scalable.
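The reuse-and-customize idea behind generic ontological components can be sketched with plain data structures. The class and field names below are invented for illustration; the dissertation implements these components in an ontology, not as code.

```python
from dataclasses import dataclass, field

# Sketch of generic supply chain components that are reused and then
# customized to define one specific chain. Names are hypothetical.

@dataclass
class Element:
    kind: str          # e.g. "process", "information", "decision"
    name: str

@dataclass
class Enterprise:
    name: str
    role: str          # "supplier", "customer", or "carrier"
    elements: list = field(default_factory=list)

@dataclass
class SupplyChain:
    name: str
    enterprises: list = field(default_factory=list)
    interactions: list = field(default_factory=list)  # (src, dst, flow)

# Customize the generic components into a specific two-party chain.
chain = SupplyChain("retail-example")
chain.enterprises.append(Enterprise("AcmeParts", "supplier",
                                    [Element("process", "Source")]))
chain.enterprises.append(Enterprise("MegaStore", "customer",
                                    [Element("process", "Deliver")]))
chain.interactions.append(("AcmeParts", "MegaStore", "material"))
```

The four definition levels map naturally onto this structure: the chain, its enterprises, their elements, and the interaction tuples between them.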
-
Date Issued
-
2005
-
Identifier
-
CFE0000399, ucf:46324
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000399
-
-
Title
-
Biomechanical Models of Human Upper and Tracheal Airway Functionality.
-
Creator
-
Kuruppumullage, Don Nadun, Ilegbusi, Olusegun, Kassab, Alain, Moslehy, Faissal, Santhanam, Anand, Mansy, Hansen, Hoffman Ruddy, Bari, University of Central Florida
-
Abstract / Description
-
The respiratory tract, in other words the airway, is the primary airflow path for several physiological activities such as coughing, breathing, and sneezing. Diseases can impact airway functionality through various means, including cancer of the head and neck, neurological disorders such as Parkinson's disease, and sleep disorders, all of which are considered in this study. In this dissertation, numerical modeling techniques were used to simulate three distinct airway diseases: a weak cough leading to aspiration, upper airway patency in obstructive sleep apnea, and tongue cancer in swallowing disorders. The work described in this dissertation is therefore divided into three biomechanical models, of which a fluid and particulate dynamics model of cough is the first. Cough is an airway protective mechanism resulting from a coordinated series of respiratory, laryngeal, and pharyngeal muscle activity. Patients with diminished upper airway protection often exhibit cough impairment resulting in aspiration pneumonia. A Computational Fluid Dynamics (CFD) technique was used to simulate airflow and penetrant behavior in airway geometry reconstructed from Computed Tomography (CT) images acquired from participants. The second study describes Obstructive Sleep Apnea (OSA) and the effects of dilator muscle activation on the human retro-lingual airway in OSA. Computations were performed for the inspiration stage of the breathing cycle, utilizing a fluid-structure interaction (FSI) method to couple structural deformation with airflow dynamics. The spatiotemporal deformation of the structures surrounding the airway wall was predicted and found to be in general agreement with observed changes in luminal opening and the distribution of airflow from upright to supine posture. The third study describes the effects of cancer of the tongue base on tongue motion during swallow. A three-dimensional biomechanical model was developed and used to calculate the spatiotemporal deformation of the tongue under a sequence of movements simulating the oral stage of swallow.
-
Date Issued
-
2018
-
Identifier
-
CFE0007034, ucf:51986
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007034
-
-
Title
-
Safety investigation of traffic crashes incorporating spatial correlation effects.
-
Creator
-
Alkahtani, Khalid, Abdel-Aty, Mohamed, Radwan, Essam, Eluru, Naveen, Lee, JaeYoung, Zheng, Qipeng, University of Central Florida
-
Abstract / Description
-
One main interest in crash frequency modeling is to predict crash counts over a spatial domain of interest (e.g., traffic analysis zones (TAZs)). Macro-level crash prediction models can give transportation planners a comprehensive perspective for considering safety in the long-range transportation planning process. Most previous studies that have examined traffic crashes at the macro level concern high-income countries, whereas there is a lack of similar studies among lower- and middle-income countries, where most road traffic deaths (90%) occur. This includes Middle Eastern countries, necessitating a thorough investigation and diagnosis of the issues and factors instigating traffic crashes in the region in order to reduce these serious crashes. Since pedestrians are more vulnerable to traffic crashes than other road users, especially in this region, a safety investigation of pedestrian crashes is crucial to improving traffic safety. Riyadh, Saudi Arabia, one of the largest Middle East metropolises, is used as an example to represent these countries' characteristics; Saudi Arabia is in the rather distinct situation of being considered a high-income country, yet having the highest rate of traffic fatalities among its high-income counterparts. Therefore, in this research, several statistical methods are used to investigate the association between traffic crash frequency and contributing factors in crash data, which are characterized by 1) geographical referencing (i.e., observations at specific locations, or spatial variation over geographic units when modeled); 2) correlation between different response variables (e.g., crash counts by severity or type); and 3) temporal correlation. A Bayesian multivariate spatial model is developed for predicting crash counts by severity and type.
Based on the findings of this study, policy makers would be able to suggest appropriate safety countermeasures for each type of crash in each zone.
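The spatial correlation that motivates such models can be checked with a simple diagnostic. The sketch below computes Moran's I over zone-level crash counts with a binary adjacency matrix; the four-zone layout and counts are invented, and this is a standard statistic rather than the thesis's Bayesian model.

```python
# Moran's I over zone crash counts: positive values indicate that
# similar counts cluster in neighboring zones. Data are invented.

def morans_i(x, w):
    """I = (n / sum(w)) * sum_ij(w_ij * d_i * d_j) / sum_i(d_i^2)."""
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    s0 = sum(sum(row) for row in w)
    num = sum(w[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# Four zones in a row (0-1-2-3); neighbors share a border.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
crashes = [10, 10, 2, 2]        # high counts cluster in zones 0 and 1
i_stat = morans_i(crashes, w)   # positive => spatially clustered
```

When a diagnostic like this shows clustering, ignoring the spatial term would bias the zone-level predictions, which is the case for the spatially structured models the abstract describes.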
-
Date Issued
-
2018
-
Identifier
-
CFE0007148, ucf:52324
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007148
-
-
Title
-
Data-Driven Modeling and Optimization of Building Energy Consumption.
-
Creator
-
Grover, Divas, Pourmohammadi Fallah, Yaser, Vosoughi, Azadeh, Zhou, Qun, University of Central Florida
-
Abstract / Description
-
Sustainability and reducing energy consumption are targets for building operations. The installation of smart sensors and Building Automation Systems (BAS) makes it possible to study facility operations under different circumstances. These technologies generate large amounts of data, which can be scraped and used for analysis. In this thesis, we focus on the process of data-driven modeling and decision making, from scraping the data to simulating the building and optimizing its operation. The City of Orlando has similar goals of sustainability and reduced energy consumption, so it provided us access to its BAS to collect data and study the operation of its facilities. The data scraped from the City's BAS servers can be used to develop statistical/machine learning methods for decision making. We selected a mid-size pilot building to apply these techniques. The process begins with the collection of data from the BAS. An Application Programming Interface (API) is developed to log in to the servers, scrape data for all data points, and store it on the local machine. The data are then cleaned for analysis and modeling. The dataset contains various data points, ranging from indoor and outdoor temperature to the speed of fans inside the Air Handling Unit (AHU), which are operated by Variable Frequency Drives (VFDs). The whole dataset is a time series and is handled accordingly. The cleaned dataset is analyzed to find patterns and investigate relations between different data points. The analysis helps in choosing parameters for the models developed in the next step. Different statistical models are developed to simulate building and equipment behavior. Finally, the models, along with the data, are used to optimize building operation subject to equipment constraints, making decisions that reduce energy consumption while maintaining temperature and pressure inside the building.
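The scrape-then-clean step can be sketched as parsing a trend-log payload and keeping only usable samples. The payload shape and field names below are invented; a real BAS API will differ, and authentication is omitted entirely.

```python
import json
from datetime import datetime

# Sketch: parse a JSON payload like one a BAS trend-log endpoint might
# return, and drop missing readings. Field names are hypothetical.

RAW = json.dumps({"point": "AHU1/FanSpeed", "samples": [
    {"ts": "2019-07-01T00:00:00", "value": 62.5},
    {"ts": "2019-07-01T00:15:00", "value": None},      # sensor dropout
    {"ts": "2019-07-01T00:30:00", "value": 58.0},
]})

def clean(payload):
    """Return (timestamp, value) pairs, skipping missing readings."""
    doc = json.loads(payload)
    out = []
    for s in doc["samples"]:
        if s["value"] is None:
            continue
        out.append((datetime.fromisoformat(s["ts"]), float(s["value"])))
    return out

series = clean(RAW)   # an ordered time series, ready for modeling
```

Keeping the cleaning step explicit and reproducible matters here because every downstream model and optimization run inherits the same data quality.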
-
Date Issued
-
2019
-
Identifier
-
CFE0007810, ucf:52335
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007810
-
-
Title
-
AUTOMATIC GENERATION OF SUPPLY CHAIN SIMULATION MODELS FROM SCOR BASED ONTOLOGIES.
-
Creator
-
Cope, Dayana, Sepulveda, Jose, University of Central Florida
-
Abstract / Description
-
In today's economy of global markets, supply chain networks, supplier/customer relationship management, and intense competition, decision makers are faced with the need to perform decision making using tools that do not accommodate the nature of the changing market. This research focuses on developing a methodology that addresses this need. The developed methodology provides supply chain decision makers with a tool to perform efficient decision making in stochastic, dynamic, and distributed supply chain environments. The integrated methodology allows for informed decision making in a fast, sharable, and easy-to-use format. The methodology was implemented by developing a stand-alone tool that allows users to define a supply chain simulation model using SCOR-based ontologies. The ontology includes the supply chain knowledge and the knowledge required to build a simulation model of the supply chain system. A simulation model is generated automatically from the ontology, providing the flexibility to model at various levels of detail and to change the model structure on the fly. The methodology implementation is demonstrated and evaluated through a retail-oriented case study. When comparing the implementation using the developed methodology vs. a "traditional" simulation methodology approach, a significant reduction in definition and execution time was observed.
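As a hedged illustration of the idea in the abstract above, the sketch below walks a toy, SCOR-style process chain (Source, Make, Deliver) held in a dictionary "ontology" and auto-generates a simple lead-time simulation from it. The dictionary structure, stage names, and cycle times are all assumptions for illustration, not the ontology or tool described in the thesis.

```python
import random

# Hypothetical, minimal stand-in for a SCOR-style ontology: each process
# element carries the data needed to instantiate one simulation stage.
ontology = {
    "Source":  {"next": "Make",    "cycle_time": 2.0},
    "Make":    {"next": "Deliver", "cycle_time": 5.0},
    "Deliver": {"next": None,      "cycle_time": 1.5},
}

def build_model(ontology, start="Source"):
    """Walk the ontology and emit an ordered list of simulation stages."""
    stages, node = [], start
    while node is not None:
        stages.append((node, ontology[node]["cycle_time"]))
        node = ontology[node]["next"]
    return stages

def simulate_order(stages, rng):
    """Total lead time for one order: sum of exponential stage delays."""
    return sum(rng.expovariate(1.0 / ct) for _, ct in stages)

rng = random.Random(42)
stages = build_model(ontology)
lead_times = [simulate_order(stages, rng) for _ in range(1000)]
mean_lead = sum(lead_times) / len(lead_times)
```

Because the model structure is derived from the ontology at run time, changing the chain (adding a stage, altering a cycle time) changes the generated simulation without touching the simulation code, which is the flexibility the abstract describes.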
-
Date Issued
-
2008
-
Identifier
-
CFE0002009, ucf:47625
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0002009
-
-
Title
-
AUTOMATED VISUAL DATABASE CREATION FOR A GROUND VEHICLE SIMULATOR.
-
Creator
-
Claudio, Pedro, Bauer, Christian, University of Central Florida
-
Abstract / Description
-
This research focuses on extracting road models from stereo video sequences taken from a moving vehicle. The proposed method combines color-histogram-based segmentation, active contours (snakes), and morphological processing to extract road boundary coordinates for conversion into Matlab- or Multigen OpenFlight-compatible polygonal representations. Color segmentation uses an initial truth frame to develop a color probability density function (PDF) of the road versus the terrain. Subsequent frames are segmented using a maximum a posteriori (MAP) criterion, and the resulting templates are used to update the PDFs. Color segmentation worked well where there was minimal shadowing and occlusion by other cars. A snake algorithm was used to find the road edges, which were converted to 3D coordinates using stereo disparity and vehicle position information. The resulting 3D road models were accurate to within 1 meter.
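The MAP step described above can be sketched briefly: build empirical color PDFs for "road" and "terrain" from a labeled truth frame, then label each pixel of a new frame by whichever posterior is larger. The synthetic gray-level data, bin count, and road prior below are assumptions for illustration, not the thesis's actual frames or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def histogram_pdf(pixels, bins=16):
    """Empirical PDF over quantized gray levels (0..255 mapped to bins)."""
    idx = (pixels * bins // 256).astype(int)
    counts = np.bincount(idx.ravel(), minlength=bins).astype(float)
    return (counts + 1) / (counts.sum() + bins)   # Laplace smoothing

# Synthetic truth-frame samples: road is dark (~80), terrain bright (~180).
road_px    = rng.normal(80, 10, 5000).clip(0, 255)
terrain_px = rng.normal(180, 10, 5000).clip(0, 255)
p_road, p_terr = histogram_pdf(road_px), histogram_pdf(terrain_px)
prior_road = 0.4                        # assumed prior for road pixels

def map_segment(frame, bins=16):
    """MAP rule: pixel is road where P(color|road)P(road) dominates."""
    idx = (frame * bins // 256).astype(int)
    post_road = p_road[idx] * prior_road
    post_terr = p_terr[idx] * (1 - prior_road)
    return post_road > post_terr

frame = np.array([[75.0, 90.0, 170.0], [85.0, 185.0, 190.0]])
mask = map_segment(frame)               # True where road is more probable
```

In the full pipeline, the resulting mask would serve as the template that updates the road and terrain PDFs before the next frame is segmented.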
-
Date Issued
-
2006
-
Identifier
-
CFE0001326, ucf:46994
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001326
-
-
Title
-
A DYNAMIC MODEL OF THE HUMAN/COOLING SYSTEM/CLOTHING/ENVIRONMENT SYSTEM.
-
Creator
-
pu, zhengxiang, Kapat, Jayanta, University of Central Florida
-
Abstract / Description
-
The human body compensates well for moderate climatic heat stress, but artificial environments often block or overwhelm its physiological defense mechanisms. Personal protective equipment (PPE) is one such source of heat stress. It protects individuals from chemical, physical, or biological hazards, but the high thermal insulation and low vapor permeability of PPE may also lead to substantial heat stress. Personal cooling is widely used to alleviate heat stress, especially in situations where ambient environmental cooling is not economically viable or feasible. It is important to predict the physiological responses of a person wearing PPE with personal cooling to ensure that the individual is free of heat stress, as well as of any additional discomfort that may occur. Air temperature, radiant temperature, humidity, and air movement are the four basic environmental parameters that affect human response to thermal environments. Combined with the personal parameters of metabolic heat generated by human activity and the clothing worn by a person, they provide the six fundamental factors that define human thermal environments. If a personal cooling system is used, the fluid flow speed, cooling-tube distribution density, and fluid inlet temperature also have significant effects on human thermal comfort. It is impractical to evaluate the problem experimentally because of the many factors involved. A thermal model was therefore developed to improve the prediction of human body thermal comfort. The system studied includes the human body, personal cooling system, clothing, and environment. An existing model of thermoregulation is taken as a starting point, with changes and additions made to provide better predictions. A personal cooling model was developed that includes liquid cooling, air cooling, and ice cooling submodels.
Thermal resistance networks for the cooling system are built up; additionally, a combined model of heat and mass transfer from the cooling garment through the clothing to the environment is developed and incorporated into the personal cooling and thermoregulatory models. The control volume method is employed to carry out the numerical calculation. An example simulation is presented for extra-vehicular activities on Mars. The simulation results agree well with available experimental data, though a small discrepancy is observed during the beginning of the cooling process. Compared with a lumped water-cooling model, the thermal model provides a much better prediction. For water cooling, a parametric study shows that the cooling water inlet temperature and liner thermal resistance have large effects on the maximum exposure time, while PPE resistance and cooling water flow rate do not have much impact. For air cooling, the cooling air flow rate, inlet temperature, relative humidity, and liner resistance have large effects on the maximum exposure time.
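The control-volume idea in the abstract above can be sketched with a single lumped node: the rate of change of stored thermal energy equals metabolic generation minus heat removed by the cooling garment and lost through clothing to the environment. All parameter values below are illustrative assumptions, and a one-node model is a deliberate simplification of the thesis's multi-segment thermoregulatory model.

```python
# Single control volume energy balance, explicit time stepping:
#   m*c * dT/dt = Q_metabolic - Q_cooling - Q_to_environment
m_c = 70.0 * 3500.0          # body mass [kg] * specific heat [J/(kg*K)]
T = 37.0                     # starting core temperature [degC]
T_water, T_env = 20.0, 30.0  # cooling inlet and ambient temps [degC]
UA_cool, UA_env = 20.0, 10.0 # conductances [W/K]: cooling loop, clothing
Q_met = 300.0                # metabolic heat generation [W] (light work)

dt, t_end = 1.0, 3600.0      # 1 s steps for one hour
t = 0.0
while t < t_end:
    Q_cool = UA_cool * (T - T_water)   # heat removed by cooling garment
    Q_env  = UA_env * (T - T_env)      # heat lost through clothing
    T += dt * (Q_met - Q_cool - Q_env) / m_c
    t += dt
```

With these values the node relaxes from 37 degC toward its steady-state balance point; in the full model, many such control volumes (body segments, garment, clothing layers) are coupled through the thermal resistance network.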
-
Date Issued
-
2005
-
Identifier
-
CFE0000416, ucf:46407
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000416
-
-
Title
-
GAMMA HYDROXYBUTYRATE USE AMONG COLLEGE STUDENTS: APPLICATION OF A MEMORY MODEL TO EXPLORE THE INFLUENCE OF OUTCOME EXPECTANCIES.
-
Creator
-
Brown, Pamela, Dunn, Michael, University of Central Florida
-
Abstract / Description
-
Gamma Hydroxybutyrate (GHB) was banned from the consumer market by the Food and Drug Administration in 1991. Despite the ban, use of GHB has continued to contribute to thousands of emergency department visits and numerous fatalities in recent years. Efforts to reduce the use of this drug have had limited impact, which may be the result of using traditional prevention strategies that focus exclusively on educating people about the negative consequences of substance use rather than addressing the factors that motivate use. In an effort to identify motivational factors that could be targeted in future prevention efforts, the present study was designed to examine outcome expectancies for GHB that may promote use of this drug. Methodology that has led to successful strategies for reducing alcohol use was applied to identify GHB expectancies and model the cognitive processes likely to encourage or discourage GHB use. Individual differences scaling was used to empirically model a two-dimensional semantic network of GHB expectancies stored in memory, and preference mapping was used to model likely paths of expectancy activation for male and female GHB users and nonusers. Differences in expectancies between GHB users and nonusers followed patterns previously identified in relation to alcohol expectancies and alcohol use. Conclusions were limited by the relatively low number of GHB users in the sample, despite the very large number of participants overall. Despite this limitation, these findings lay the groundwork for the development and validation of GHB-expectancy-based prevention strategies.
-
Date Issued
-
2008
-
Identifier
-
CFE0002090, ucf:47538
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0002090
-
-
Title
-
USING COMPUTER SIMULATION MODELING TO EVALUATE THE BIOTERRORISM RESPONSE PLAN AT A LOCAL HOSPITAL FACILITY.
-
Creator
-
Bebber, Robert, Liberman, Aaron, University of Central Florida
-
Abstract / Description
-
The terrorist attacks of September 11th, 2001 and the subsequent anthrax mail attack have forced health care administrators and policy makers to place a new emphasis on disaster planning at hospital facilities--specifically bioterrorism planning. Yet how does one truly "prepare" for the unpredictable? In spite of accreditation requirements, which demand hospitals put in to place preparations to deal with bioterrorism events, a recent study from the General Accounting Office (GAO) concluded...
Show moreThe terrorist attacks of September 11th, 2001 and the subsequent anthrax mail attack have forced health care administrators and policy makers to place a new emphasis on disaster planning at hospital facilities--specifically bioterrorism planning. Yet how does one truly "prepare" for the unpredictable? In spite of accreditation requirements, which demand hospitals put in to place preparations to deal with bioterrorism events, a recent study from the General Accounting Office (GAO) concluded that most hospitals are still not capable of dealing with such threats (Gonzalez, 2004). This dissertation uses computer simulation modeling to test the effectiveness of bioterrorism planning at a local hospital facility in Central Florida, Winter Park Memorial Hospital. It is limited to the response plan developed by the hospital's Emergency Department. It evaluates the plan's effectiveness in dealing with an inhalational anthrax attack. Using Arena computer simulation software, and grounded within the theoretical framework of Complexity Science, we were able to test the effectiveness of the response plan in relation to Emergency Department bed capacity. Our results indicated that the response plan's flexibility was able to accommodate an increased patient load due to an attack, including an influx of the "worried well." Topics of future work and study are proposed.
Show less
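The bed-capacity question in the abstract above is, at its core, a queueing problem, so a minimal loss-system sketch can illustrate it: patients arrive at random, hold a bed for a random stay, and arrivals that find no free bed are diverted. The bed count, arrival rates, and mean stay below are illustrative assumptions, not Winter Park Memorial's figures or the Arena model from the dissertation.

```python
import heapq
import random

def ed_simulation(beds=20, arrival_rate=4.0, mean_stay=3.0,
                  horizon=24 * 30, seed=1):
    """Fraction of arrivals diverted over `horizon` hours (M/M/c/c loss)."""
    rng = random.Random(seed)
    departures = []                  # min-heap of bed release times
    t, arrivals, diverted = 0.0, 0, 0
    while t < horizon:
        t += rng.expovariate(arrival_rate)       # next patient arrival
        while departures and departures[0] <= t:
            heapq.heappop(departures)            # beds freed by time t
        arrivals += 1
        if len(departures) < beds:               # a bed is available
            heapq.heappush(departures, t + rng.expovariate(1 / mean_stay))
        else:
            diverted += 1                        # no bed: patient diverted
    return diverted / arrivals

baseline = ed_simulation()                       # normal operations
surge = ed_simulation(arrival_rate=8.0)          # attack-driven surge
```

Comparing the diversion fraction at baseline versus surge arrival rates is the simplest analogue of stress-testing a response plan against an influx that includes the "worried well."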
-
Date Issued
-
2007
-
Identifier
-
CFE0001712, ucf:47293
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001712
-
-
Title
-
MODELING AND PARTITIONING THE NUCLEOTIDE EVOLUTIONARY PROCESS FOR PHYLOGENETIC AND COMPARATIVE GENOMIC INFERENCE.
-
Creator
-
Castoe, Todd, Parkinson, Christopher, University of Central Florida
-
Abstract / Description
-
The transformation of genomic data into functionally relevant information about the composition of biological systems hinges critically on the field of computational genome biology, at the core of which lies comparative genomics. The aim of comparative genomics is to extract meaningful functional information from the differences and similarities observed across the genomes of different organisms. We develop and test a novel framework for applying complex models of nucleotide evolution to solve phylogenetic and comparative genomic problems, and demonstrate that these techniques are crucial for accurate comparative evolutionary inferences. Additionally, we conduct an exploratory study using vertebrate mitochondrial genomes as a model to identify the reciprocal influences that genome structure, nucleotide evolution, and multi-level molecular function may have on one another. Collectively, this work represents a significant and novel contribution to accurately modeling and characterizing patterns of nucleotide evolution, a contribution that enables the enhanced detection of patterns of genealogical relationships, selection, and function in comparative genomic datasets. Our work with entire mitochondrial genomes highlights a coordinated evolutionary shift that simultaneously altered genome architecture, replication, nucleotide evolution, and molecular function (of proteins, RNAs, and the genome itself). Current research in computational biology, including the advances in this dissertation, continues to close the gap that impedes the transformation of genomic data into powerful tools for the analysis and understanding of biological system function.
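The partitioned substitution models discussed above generalize much simpler ones. As a hedged illustration of what a nucleotide evolution model computes (this is the textbook Jukes-Cantor model, not the complex models of the dissertation), the JC69 distance corrects the observed fraction of differing sites for multiple substitutions hitting the same site:

```python
import math

def jc69_distance(seq_a, seq_b):
    """Expected substitutions per site between two aligned sequences
    under the Jukes-Cantor (JC69) model: d = -3/4 * ln(1 - 4p/3)."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(a != b for a, b in zip(seq_a, seq_b))
    p = diffs / len(seq_a)             # observed difference fraction
    if p >= 0.75:                      # saturation: distance undefined
        return math.inf
    return -0.75 * math.log(1 - 4.0 * p / 3.0)

d = jc69_distance("ACGTACGTAC", "ACGTACGTTC")   # one mismatch in ten
```

Note that the corrected distance (about 0.107 here) exceeds the raw difference fraction (0.1); richer models add unequal base frequencies, rate heterogeneity, and, as in this work, separate parameterizations per data partition.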
-
Date Issued
-
2007
-
Identifier
-
CFE0001548, ucf:47138
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001548
-
-
Title
-
PARAMETER ESTIMATION USING SENSOR FUSION AND MODEL UPDATING.
-
Creator
-
Francoforte, Kevin, Catbas, Necati, University of Central Florida
-
Abstract / Description
-
Engineers and infrastructure owners have to manage an aging civil infrastructure in the US. Engineers have the opportunity to analyze structures using finite element models (FEM) and often base their engineering decisions on the outcome of the results. Ultimately, the success of these decisions is directly related to the accuracy of the finite element model in representing the real-life structure. Improper assumptions in the model, such as member properties or connections, can lead to inaccurate results. A major source of modeling error in many finite element models of existing structures is improper representation of the boundary conditions. This study aims to integrate experimental and analytical concepts by means of parameter estimation, whereby the boundary condition parameters of a structure in question are determined. FEM updating is a commonly used method to determine the "as-is" condition of an existing structure. Experimental testing of the structure using static and/or dynamic measurements can be utilized to update the unknown parameters. Optimization programs update the unknown parameters by minimizing the error between the analytical and experimental measurements. Through parameter estimation, unknown parameters of the structure such as stiffness, mass, or support conditions can be estimated, or more appropriately, "updated," so that the updated model better represents the actual conditions of the system. In this study, a densely instrumented laboratory test beam was used to carry out both analytical and experimental analyses of multiple boundary condition setups. The test beam was instrumented with an array of displacement transducers, tiltmeters, and accelerometers. Linear vertical springs represented the unknown boundary stiffness parameters in the numerical model of the beam.
Nine different load cases were performed, and static measurements were used to update the spring stiffnesses, while dynamic measurements and additional load cases were used to verify the updated parameters. Two different optimization programs were used to update the unknown parameters, and the results were compared. One optimization tool, Spreadsheet Parameter Estimation (SPE), was developed by the author and utilizes the Solver function found in the widely available Microsoft Excel software. The other, the comprehensive MATLAB-based PARameter Identification System (PARIS) software, was developed at Tufts University. Optimization results from the two programs are presented and discussed for different boundary condition setups in this thesis. For this purpose, finite element models were updated using the static data and then checked against dynamic measurements for model validation. Model parameter updating provides excellent insight into the behavior of different boundary conditions and their effect on the overall structural behavior of the system. Updated FEMs using estimated parameters from both optimization programs generally show promising results when compared to the experimental data sets. Although the use of SPE is simple and generally straightforward, it shows apparent limitations when dealing with complex, non-linear support conditions. Due to the inherent error associated with experimental measurements and FEM modeling assumptions, PARIS serves as the better-suited tool for parameter estimation. Results from SPE can be used for quick analysis of structures and can serve as initial inputs for the more in-depth PARIS models. A number of different sensor types and spatial resolutions were also investigated to find the minimum instrumentation that yields an acceptable model representation in terms of model and experimental data correlation.
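The static updating step described above can be sketched with a deliberately tiny analogue: a rigid beam on two unknown vertical support springs. For each load case the support reactions follow from statics, so measured deflections d = R/k let each spring stiffness be estimated by least squares over all load cases. The beam geometry, loads, "true" stiffnesses, and noise level below are synthetic assumptions for illustration, not the thesis's test beam or its SPE/PARIS optimizers.

```python
import numpy as np

L = 4.0                                          # beam span [m]
loads = [(10e3, 1.0), (10e3, 2.0), (10e3, 3.0)]  # (P [N], position a [m])

def reactions(P, a):
    """Support reactions of a simply supported rigid beam under load P at a."""
    return P * (L - a) / L, P * a / L

k_true = np.array([2.0e6, 3.0e6])     # "real" support stiffnesses [N/m]
rng = np.random.default_rng(3)

R = np.array([reactions(P, a) for P, a in loads])           # (cases, 2)
d_meas = R / k_true * (1 + rng.normal(0, 0.01, R.shape))    # noisy "test" data

# Model: d = R * (1/k). Least-squares estimate of compliance 1/k per spring,
# minimizing the squared error between measured and predicted deflections.
compliance = (R * d_meas).sum(axis=0) / (R ** 2).sum(axis=0)
k_est = 1.0 / compliance
```

Even this one-line least-squares step captures the essential loop of FEM updating: predict the response from assumed parameters, compare with measurements, and adjust the parameters to close the gap.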
-
Date Issued
-
2007
-
Identifier
-
CFE0001676, ucf:47206
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001676
-
-
Title
-
FOUR TERMINAL JUNCTION FIELD-EFFECT TRANSISTOR MODEL FOR COMPUTER-AIDED DESIGN.
-
Creator
-
Ding, Hao, Liou, Juin J., University of Central Florida
-
Abstract / Description
-
A compact model for a four-terminal (independent top and bottom gates) junction field-effect transistor (JFET) is presented in this dissertation. The model describes the steady-state characteristics with a unified equation for all bias conditions that provides a high degree of accuracy and continuity of conductance, which are important for predictive analog circuit simulations. It also includes capacitance and leakage equations. A special capacitance drop-off phenomenon in the pinch-off region is studied and modeled. The operation of the JFET with an oxide top-gate and full oxide isolation is analyzed, and a semi-physical compact model is developed. The effects of the different modes associated with the oxide top-gate on the steady-state characteristics of the transistor are discussed, and a single expression applicable to the description of the JFET dc characteristics for all operation modes is derived. The model has been implemented in Verilog-A and simulated in the Cadence framework for comparison to experimental data measured at Texas Instruments.
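For context on what such a compact model replaces, the sketch below is the classical piecewise square-law JFET drain-current model (textbook form, with assumed IDSS and pinch-off voltage). It is continuous in current across the triode/saturation boundary but not in all derivatives, which is exactly the shortcoming a unified single expression like the dissertation's addresses.

```python
# Idealized n-channel JFET, piecewise square-law model (not the thesis model).
IDSS = 8e-3        # saturation current at Vgs = 0 [A] (assumed)
VP = -4.0          # pinch-off voltage [V] (assumed)

def jfet_id(vgs, vds):
    """Drain current [A] of an idealized n-channel JFET."""
    if vgs <= VP:
        return 0.0                         # cutoff: channel pinched off
    vov = vgs - VP                         # overdrive voltage, > 0
    if vds >= vov:                         # saturation region
        return IDSS * (1 - vgs / VP) ** 2
    x, y = vov / -VP, vds / -VP            # normalized voltages
    return IDSS * (2 * x * y - y * y)      # triode (ohmic) region

i_sat = jfet_id(-2.0, 5.0)   # saturation: IDSS * (1 - 0.5)^2
i_off = jfet_id(-5.0, 5.0)   # below pinch-off: no current
```

Circuit simulators penalize such piecewise formulations because conductance (dId/dVds) jumps at the region boundary; a unified equation keeps the conductance smooth, which is what the abstract highlights for predictive analog simulation.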
-
Date Issued
-
2007
-
Identifier
-
CFE0001553, ucf:47144
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001553
-
-
Title
-
AN INVESTIGATION OF SIZE EXCLUSION AND DIFFUSION CONTROLLED MEMBRANE FOULING.
-
Creator
-
Hobbs, Colin, Taylor, James, University of Central Florida
-
Abstract / Description
-
The reduction of membrane productivity (i.e., membrane fouling) during operation occurs in virtually all membrane applications. Membrane fouling originates from the method by which membranes operate: contaminants are rejected by the membrane and retained on the feed side while treated water passes through. The accumulation of these contaminants on the feed side results in increased operating pressures, backwashing frequencies, chemical cleaning frequencies, and membrane replacement frequencies. The most significant practical implication of membrane fouling is increased operating and maintenance costs. As such, membrane fouling must be properly managed to ensure successful and efficient operation of membrane systems. This document presents four independent studies regarding the fouling of size-exclusion and diffusion-controlled membranes. A brief description of each study is presented below. The first study systematically investigated the fouling characteristics of various thin-film composite polyamide reverse osmosis (RO) and nanofiltration (NF) membranes using a high-organic surficial groundwater obtained from the City of Plantation, Florida. Prior to bench-scale fouling experiments, surface properties of the selected RO and NF membranes were carefully analyzed in order to correlate the rate and extent of fouling to membrane surface characteristics such as roughness, charge, and hydrophobicity. More specifically, surface roughness was characterized by atomic force microscopy, while the surface charge and hydrophobicity of the membranes were evaluated through zeta potential and contact angle measurements, respectively. The results indicated that membrane fouling became more severe with increasing surface roughness, as measured by the surface area difference, which accounts for both the magnitude and frequency of surface peaks.
Surface roughness was correlated to flux decline; however, surface charge was not. The limited range of hydrophobicity in the flat-sheet studies prohibited conclusions regarding the correlation of flux decline and hydrophobicity. In the second study, mass loading and resistance models were developed to describe changes in solvent mass transfer (membrane productivity) over operating time. Changes in the observed solvent mass transfer coefficient of four low-pressure reverse osmosis membranes were correlated to feed water quality in a 2,000-hour pilot study. Independent variables utilized for model development included temperature, initial solvent mass transfer coefficient, water loading, ultraviolet absorbance, turbidity, and monochloramine concentration. Models were generated from data collected throughout the study and were subsequently used to predict the solvent mass transfer coefficient. The sensitivity of each model with respect to monochloramine concentration was also analyzed. In the third study, mass loading and resistance models were generated to predict changes in solvent mass transfer with operating time for three reverse osmosis and nanofiltration membranes. Variations in the observed solvent mass transfer coefficient of these membranes treating filtered secondary effluent were correlated to the initial solvent mass transfer coefficient, temperature, and water loading in a 2,000-hour pilot study. Independent variables evaluated during model development included temperature, initial solvent mass transfer coefficient, water loading, total dissolved solids, orthophosphorus, silica, total organic carbon, and turbidity. All models were generated from data collected throughout the study. Autopsies performed on membrane elements indicated that membranes receiving microfiltered water accumulated significantly more dissolved organic carbon and polysaccharides on their surfaces than membranes receiving ultrafiltered water.
In the fourth study, a series of filtration experiments was systematically performed to investigate the physical and chemical factors affecting the efficiency of backwashing during microfiltration of colloidal suspensions. Throughout this study, all experiments were conducted in dead-end filtration mode utilizing an outside-in, hollow-fiber module with a nominal pore size of 0.1 µm. Silica particles (mean diameter ~0.14 µm) were used as model colloids. Using a flux decline model based on Happel's cell model for the hydraulic resistance of the particle layer, the cake structure was determined from experimental fouling data and then correlated to backwash efficiency. Modeling of the experimental data revealed no noticeable changes in cake layer structure when feed particle concentration and operating pressure increased. Specifically, the packing density of the cake layer (1 − cake porosity) ranged from 0.66 to 0.67, which corresponds well to random packing density. However, the particle packing density increased drastically with ionic strength. The results of the backwashing experiments demonstrated that backwash efficiency decreased significantly with increasing solution ionic strength, while it did not vary when particle concentration and operating pressure increased. This finding suggests that backwash efficiency is closely related to the structure of the cake layer formed during particle filtration. More densely packed cake layers were formed under high ionic strength, and consequently less flux was recovered per given backwash volume during backwashing.
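The flux-decline mechanism described above can be sketched with the generic resistance-in-series picture: permeate flux J = dP / (mu * (R_membrane + R_cake)), with cake resistance growing in proportion to deposited particle mass. This is a simplified stand-in for the Happel-cell-based model in the study, and every parameter value below is an illustrative assumption, not fitted data.

```python
# Dead-end filtration with cake buildup, explicit time stepping.
dP = 1.0e5          # transmembrane pressure [Pa]
mu = 1.0e-3         # water viscosity [Pa*s]
Rm = 1.0e12         # clean-membrane resistance [1/m]
alpha = 1.0e14      # specific cake resistance [m/kg] (assumed)
c_feed = 0.05       # feed particle concentration [kg/m^3]

dt, t_end = 1.0, 3600.0
mass, t = 0.0, 0.0           # deposited mass per unit area [kg/m^2]
J0 = dP / (mu * Rm)          # initial (clean) flux [m/s]
while t < t_end:
    J = dP / (mu * (Rm + alpha * mass))   # resistance-in-series flux
    mass += J * c_feed * dt               # retained particles join the cake
    t += dt
```

A denser cake (higher packing density, as observed at high ionic strength) corresponds to a larger specific resistance alpha, which is one way to connect the cake-structure finding to both faster flux decline and poorer backwash recovery.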
-
Date Issued
-
2007
-
Identifier
-
CFE0001854, ucf:47366
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001854
-
-
Title
-
THE RESPONSE OF A GENERAL CIRCULATION CLIMATE MODEL TO HIGH LATITUDE FRESHWATER FORCING IN THE ATLANTIC BASIN WITH RESPECT TO TROPICAL CYCLONE-LIKE VORTICES.
-
Creator
-
Paulis, Victor, Clarke, Thomas, University of Central Florida
-
Abstract / Description
-
The current cycle of climate change, along with increases in hurricane activity, changing precipitation patterns, glacial melt, and other weather extremes, has led to interest and research into the global correlation, or teleconnection, between these events. Examination of historical climate records, proxies, and observations is leading to the formulation of hypotheses of climate dynamics, with modeling and simulation used to test these hypotheses as well as to make projections. Ocean currents are believed to be an important factor in climate change, with thermohaline circulation (THC) fluctuations implicated in past cycles of abrupt change. Freshwater discharge into high-latitude oceans, particularly the North Atlantic, attributed to changing precipitation patterns and glacial melt, has also been associated with historical abrupt climate changes and is believed to have inhibited or shut down the THC overturning mechanism by diluting saline surface waters transported from the tropics. Here we analyze the outputs of general circulation model (GCM) simulations parameterized by different levels of freshwater flux (no flux (control), 0.1 Sverdrup (Sv), and 1.0 Sv) with respect to tropical cyclone-like vortices (TCLVs) to determine any trend in simulated tropical storm frequency, duration, and location relative to flux level, and to consider the applicability of GCMs for tropical weather research. Increasing flux levels produced fewer storms and storm days, increased storm duration, a southerly and westerly shift in geographic distribution (more pronounced for the 0.1 Sv level), and increased activity near the African coast (more pronounced for the 1.0 Sv level). Storm intensities and tracks were not realistic compared to observational (real-life) values, which is attributed to the GCM resolution not being fine enough to realistically simulate storm (microscale) dynamics.
-
Date Issued
-
2007
-
Identifier
-
CFE0001810, ucf:47339
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001810
-
-
Title
-
CREATING GEO-SPECIFIC ROAD DATABASES FROM AERIAL PHOTOS FOR DRIVING SIMULATION.
-
Creator
-
Guo, Dahai, Klee, Harold, University of Central Florida
-
Abstract / Description
-
Geo-specific road database development is important to a driving simulation system and is a very labor-intensive process. Road databases for driving simulation need high resolution and accuracy. Even though commercial software is available on the market, a lot of manual work still has to be done when the road cross-sectional profile is not uniform. This research deals with geo-specific road database development, especially for roads with non-uniform cross sections. In this research, United States Geological Survey (USGS) road information is used with aerial photos to accurately extract road boundaries, using image segmentation and data compression techniques. Image segmentation plays an important role in extracting road boundary information, and numerous methods have been developed for it. Six methods were tried for the purpose of road image segmentation. The major problems with road segmentation are due to the large variety of road appearances and the many linear features in roads. A method that does not require a database of sample images is desired; furthermore, this method should be able to handle the complexity of road appearances. The proposed method for road segmentation is based on the mean-shift clustering algorithm and yields high accuracy. In the phase of building road databases and visual databases from the road segmentation results, the Linde-Buzo-Gray (LBG) vector quantization algorithm is used to identify repeatable cross-section profiles. In the texture-mapping phase, five major uniform textures are considered (pavement, white marker, yellow marker, concrete, and grass) and are automatically mapped to polygons. In the results chapter, snapshots of the road and visual databases are presented.
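The LBG step mentioned above can be illustrated compactly: cluster cross-section profiles so that repeated profiles collapse onto a small codebook, using the classic split-then-Lloyd-refine scheme. The synthetic "flat" and "crowned" four-sample elevation profiles, codebook size, and noise level below are assumptions for illustration, not the thesis's road data.

```python
import numpy as np

rng = np.random.default_rng(7)

def lbg(vectors, codebook_size, iters=20, eps=1e-3):
    """Linde-Buzo-Gray VQ: start from the global mean, repeatedly split
    each codevector into a perturbed pair, then refine with Lloyd steps."""
    codebook = vectors.mean(axis=0, keepdims=True)
    while len(codebook) < codebook_size:
        codebook = np.concatenate([codebook * (1 + eps),
                                   codebook * (1 - eps)])   # split
        for _ in range(iters):                              # Lloyd refine
            dists = ((vectors[:, None, :] - codebook[None]) ** 2).sum(-1)
            labels = dists.argmin(axis=1)
            for j in range(len(codebook)):
                members = vectors[labels == j]
                if len(members):
                    codebook[j] = members.mean(axis=0)
    return codebook

# Two distinct "cross-section profiles" repeated with small noise.
flat  = np.array([0.0, 0.0, 0.0, 0.0])
crown = np.array([0.0, 0.2, 0.2, 0.0])
data = np.concatenate([flat + rng.normal(0, 0.01, (50, 4)),
                       crown + rng.normal(0, 0.01, (50, 4))])
codebook = lbg(data, 2)
```

In the road-database context, each codevector stands for one repeatable cross-section profile, so long stretches of road can be stored as codebook indices rather than raw per-section geometry.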
-
Date Issued
-
2005
-
Identifier
-
CFE0000591, ucf:46472
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000591