Current Search: randomness
- Title
- BEHAVIOR OF VARIABLE-LENGTH GENETIC ALGORITHMS UNDER RANDOM SELECTION.
- Creator
-
Stringer, Harold, Wu, Annie, University of Central Florida
- Abstract / Description
-
In this work, we show how a variable-length genetic algorithm naturally evolves populations whose mean chromosome length grows shorter over time. A reduction in chromosome length occurs even when selection is absent from the GA. Specifically, we divide the mating space into five distinct areas and provide a probabilistic and empirical analysis of the ability of matings in each area to produce children shorter than the parent generation's average size. Diversity of size within a GA's population is shown to be a necessary condition for a reduction in mean chromosome length to take place. We show how a finite variable-length GA under random selection uses (1) diversity of size within the population, (2) over-production of shorter-than-average individuals, and (3) the imperfect nature of random sampling during selection to naturally reduce the average size of individuals within a population from one generation to the next. In addition to these findings, this work provides GA researchers and practitioners with (1) a number of mathematical tools for analyzing possible size reductions for various matings and (2) new ideas to explore in the area of bloat control.
- Date Issued
- 2007
- Identifier
- CFE0001652, ucf:47249
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001652
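The drift described in this abstract is easy to reproduce with a toy simulation. The sketch below is a hypothetical illustration, not the authors' code: it tracks only chromosome lengths, selects parents uniformly at random (no fitness), and splices the head of one parent onto the tail of the other so that child length need not equal either parent's.

```python
import random

def evolve_lengths(pop, generations, rng):
    """Evolve a population of chromosome lengths under random selection.

    Each child takes a random-length head from one parent and a
    random-length tail from the other, so sizes can shrink or grow.
    """
    history = [sum(pop) / len(pop)]
    for _ in range(generations):
        next_pop = []
        for _ in range(len(pop)):
            p1, p2 = rng.choice(pop), rng.choice(pop)  # random selection
            cut1 = rng.randint(0, p1)                  # cut point in parent 1
            cut2 = rng.randint(0, p2)                  # cut point in parent 2
            next_pop.append(max(1, cut1 + (p2 - cut2)))
        pop = next_pop
        history.append(sum(pop) / len(pop))
    return history

rng = random.Random(2007)
initial = [rng.randint(5, 50) for _ in range(200)]  # diversity of size
means = evolve_lengths(initial, 100, rng)
```

Note that the expected child length equals the parents' average; any per-run reduction comes from size diversity plus the sampling effects the abstract analyzes, so individual trajectories vary.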
- Title
- EFFECTS OF POLARIZATION AND COHERENCE ON THE PROPAGATION AND THE DETECTION OF STOCHASTIC ELECTROMAGNETIC BEAMS.
- Creator
-
Salem, Mohamed, Rolland, Jannick, University of Central Florida
- Abstract / Description
-
Most physically realizable optical sources radiate in a random manner, owing to the random nature of the radiation of the large number of atoms that constitute the source. Many natural and synthetic materials also fluctuate randomly. Hence the optical fields one encounters in most applications are fluctuating and must be treated using random, or stochastic, functions. Within the framework of scalar coherence theory, one can describe changes in the properties of any stochastic field, such as the spectral density and the spectral degree of coherence, on propagation in any linear medium, deterministic or random. One frequently encountered random medium is the turbulent atmosphere, whose fluctuating refractive index severely degrades any signal propagating through it; in particular, it causes intensity fades of the signal. The use of stochastic beams at the transmitter instead of deterministic ones was suggested some time ago to suppress the severe intensity fluctuations caused by atmospheric turbulence. In this dissertation, we study the use of partially coherent beams in long-path propagation through the turbulent atmosphere, such as one frequently encounters in remote sensing, in communication systems, and in guiding. The detection scheme at the receiver is also important for quantifying the received signal efficiently; hence we compare the performance of incoherent (direct) detection versus coherent (heterodyne) detection at the receiver of a communication system whose beams propagate in the turbulent atmosphere, and in particular we evaluate the signal-to-noise ratio (SNR) for each case. Scalar coherence theory ignores the vector nature of stochastic fields, which should be taken into account for applications that depend on changes in the polarization of the field.
Recently, a generalization of scalar coherence theory that includes the vector aspects of stochastic beams has been formulated; it is known as the unified theory of coherence and polarization of stochastic beams. This theory makes it possible to study both the coherence and the polarization properties of stochastic electromagnetic beams on propagation in any linear medium. The central quantity in the theory is a 2 × 2 matrix that describes the statistical ensemble of any stochastic electromagnetic beam in the space-frequency domain, or its Fourier transform in the space-time domain. In this dissertation we derive the conditions that the cross-spectral density matrix of a so-called planar, secondary, electromagnetic Gaussian Schell-model source must satisfy in order to generate a beam propagating in vacuum. Based on the unified theory of coherence and polarization, we also investigate the subtle relationship between coherence and polarization under general circumstances. In addition, we show the effects of the turbulent atmosphere on the degree of polarization and the polarization state of a partially coherent electromagnetic beam propagating through it, and we compare with propagation in vacuum. The detection of optical signals is important because it affects the fidelity of the communication system. We therefore present a general analysis of optical heterodyne detection of stochastic electromagnetic beams and derive an expression for the SNR when two stochastic electromagnetic beams are mixed coherently on a detector surface, in terms of the space-time-domain representation of the beams, the beam coherence-polarization matrices. We also evaluate the heterodyne efficiency of a heterodyne detection system for stochastic beams propagating in vacuum and discuss how the heterodyne efficiency of the detection process depends on changes in the beam parameters as the beam propagates in free space.
- Date Issued
- 2007
- Identifier
- CFE0001932, ucf:47445
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001932
- Title
- Solution of linear ill-posed problems using overcomplete dictionaries.
- Creator
-
Gupta, Pawan, Pensky, Marianna, Swanson, Jason, Zhang, Teng, Foroosh, Hassan, University of Central Florida
- Abstract / Description
-
In this dissertation, we consider the application of overcomplete dictionaries to the solution of general ill-posed linear inverse problems. In the context of regression problems, an enormous amount of effort has gone into recovering an unknown function using such dictionaries. While some research on the subject has already been carried out, many gaps remain. In particular, one of the most popular methods, lasso, and its variants are based on minimizing the empirical likelihood and, unfortunately, require stringent assumptions on the dictionary, the so-called compatibility conditions. Though compatibility conditions are hard to satisfy, it is well known that they can be met by using random dictionaries. In the first part of the dissertation, we show how one can apply random dictionaries to the solution of ill-posed linear inverse problems with Gaussian noise. We put a theoretical foundation under the suggested methodology and study its performance via simulations and a real-data example. In the second part of the dissertation, we investigate the application of lasso to linear ill-posed problems with non-Gaussian noise. We develop a theoretical background for the application of lasso to such problems and study its performance via simulations.
- Date Issued
- 2019
- Identifier
- CFE0007811, ucf:52345
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007811
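The lasso-with-a-random-dictionary idea in this abstract can be demonstrated in a few lines. The sketch below is an illustrative toy, not the dissertation's methodology: it implements plain coordinate-descent lasso and recovers a sparse coefficient vector from a randomly generated Gaussian dictionary, which is the standard way to obtain a well-conditioned (compatible) design.

```python
import random

def soft_threshold(z, t):
    return z - t if z > t else z + t if z < -t else 0.0

def lasso_cd(X, y, lam, iters=50):
    """Coordinate-descent lasso: minimize (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            rho, norm_j = 0.0, 0.0
            for i in range(n):
                # partial residual with feature j held out
                r = y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                rho += X[i][j] * r
                norm_j += X[i][j] ** 2
            beta[j] = soft_threshold(rho / n, lam) / (norm_j / n)
    return beta

rng = random.Random(0)
n, p = 60, 8
X = [[rng.gauss(0.0, 1.0) for _ in range(p)] for _ in range(n)]  # random dictionary
true = [3.0] + [0.0] * (p - 1)                                   # sparse truth
y = [sum(x * b for x, b in zip(row, true)) + rng.gauss(0.0, 0.1) for row in X]
beta = lasso_cd(X, y, lam=0.1)
```

With an i.i.d. Gaussian dictionary the columns are nearly orthogonal, so the support is recovered and the nonzero coefficient is estimated up to the usual soft-thresholding shrinkage of about `lam`.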
- Title
- Gradient based MRF learning for image restoration and segmentation.
- Creator
-
Samuel, Kegan, Tappen, Marshall, Da Vitoria Lobo, Niels, Foroosh, Hassan, Li, Xin, University of Central Florida
- Abstract / Description
-
The undirected graphical model, or Markov Random Field (MRF), is one of the more popular models used in computer vision and is the type of model with which this work is concerned. MRF-based models have proven particularly useful in low-level vision systems and have led to state-of-the-art results. The research presented here describes a new discriminative training algorithm and its implementation. The MRF model is trained by optimizing its parameters so that the minimum-energy solution of the model is as similar as possible to the ground truth. While previous work has relied on time-consuming iterative approximations or stochastic approximations, this work demonstrates how implicit differentiation can be used to analytically differentiate the overall training loss with respect to the MRF parameters. This framework leads to an efficient, flexible learning algorithm that can be applied to a number of different models. The effectiveness of the proposed learning method is then demonstrated by learning the parameters of two related models applied to the task of denoising images. The experimental results demonstrate that the proposed learning algorithm is comparable to, and at times better than, previous training methods applied to the same tasks. A new segmentation model is also introduced and trained using the proposed learning method. The proposed segmentation model is based on an energy-minimization framework that is novel in how it incorporates priors on the size of the segments in a way that is straightforward to implement. While other methods, such as normalized cuts, tend to produce segmentations of similar sizes, this method is able to overcome that problem and produce more realistic segmentations.
- Date Issued
- 2012
- Identifier
- CFE0004595, ucf:49207
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004595
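The implicit-differentiation trick summarized in this abstract can be shown on a toy problem. The sketch below is not the dissertation's implementation: it uses an assumed two-pixel quadratic "denoising MRF" with energy E(x; w) = ½ xᵀ(I + wL)x − xᵀy, whose minimizer solves (I + wL)x* = y. Differentiating that optimality condition gives (I + wL) dx*/dw = −L x*, which yields the training-loss gradient analytically.

```python
def solve2(A, b):
    """Cramer's rule for a 2x2 linear system A x = b."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

L = [[1.0, -1.0], [-1.0, 1.0]]  # smoothness term (Laplacian of a 2-pixel graph)

def denoise(y, w):
    """Minimum-energy solution of E(x; w): solve (I + wL) x = y."""
    A = [[1 + w * L[0][0], w * L[0][1]],
         [w * L[1][0], 1 + w * L[1][1]]]
    return A, solve2(A, y)

def loss_and_grad(y, t, w):
    """Training loss 1/2 ||x* - t||^2 and dLoss/dw via implicit differentiation:
    (I + wL) dx*/dw = -L x*, then dLoss/dw = (x* - t) . dx*/dw."""
    A, x = denoise(y, w)
    loss = 0.5 * sum((xi - ti) ** 2 for xi, ti in zip(x, t))
    Lx = [L[0][0] * x[0] + L[0][1] * x[1],
          L[1][0] * x[0] + L[1][1] * x[1]]
    dx = solve2(A, [-Lx[0], -Lx[1]])
    grad = sum((xi - ti) * di for xi, ti, di in zip(x, t, dx))
    return loss, grad

loss, grad = loss_and_grad([1.0, 0.0], [0.6, 0.4], 0.5)
```

The point of the technique is that no iterative inner-loop approximation is needed: one extra linear solve against the same matrix gives the exact gradient of the loss with respect to the MRF parameter.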
- Title
- Mesoscale Light-Matter Interactions.
- Creator
-
Douglass, Kyle, Dogariu, Aristide, Abouraddy, Ayman, Hagan, David, Sugaya, Kiminobu, University of Central Florida
- Abstract / Description
-
Mesoscale optical phenomena occur when light interacts with a number of different types of materials, such as biological and chemical systems and fabricated nanostructures. As a framework, mesoscale optics unifies the interpretations of the interaction of light with complex media when the outcome depends significantly upon the scale of the interaction. Most importantly, it guides the process of designing an optical sensing technique by focusing on the nature and amount of information that can be extracted from a measurement. Different aspects of mesoscale optics are addressed in this dissertation, leading to the solution of a number of problems in complex media. Dynamical and structural information from complex fluids, such as colloidal suspensions and biological fluids, was obtained by controlling the size of the interaction volume with low-coherence interferometry. With this information, material properties such as particle sizes, optical transport coefficients, and viscoelastic characteristics of polymer solutions and blood were determined in natural, realistic conditions that are inaccessible to conventional techniques. The same framework also enabled the development of new, scale-dependent models for several important physical and biological systems. These models were then used to explain the results of some unique measurements. For example, the transport of light in disordered photonic lattices was interpreted as a scale-dependent, diffusive process to explain the anomalous behavior of photon path-length distributions through these complex structures. In addition, it was demonstrated how specialized optical measurements and models at the mesoscale enable solutions to fundamental problems in cell biology. Specifically, it was found for the first time that the nature of cell motility changes markedly with the curvature of the substrate that the cells move on.
This particular work addresses increasingly important questions concerning the nature of cellular responses to external forces and the mechanical properties of their local environment. Besides sensing properties and modeling behaviors of complex systems, mesoscale optics encompasses the control of material systems as a result of the light-matter interaction. Specific modifications to a material's structure can occur due not only to an exchange of energy between radiation and a material, but also to a transfer of momentum. Based on the mechanical action of multiply scattered light on colloidal particles, an optically controlled active medium that did not require specially tailored particles was demonstrated for the first time. The coupling between the particles and the random electromagnetic field affords new possibilities for controlling mesoscale systems and observing nonequilibrium thermodynamic phenomena.
- Date Issued
- 2013
- Identifier
- CFE0004990, ucf:49606
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004990
- Title
- Remote Sensing of Coastal Wetlands: Long term vegetation stress assessment and data enhancement technique.
- Creator
-
Tahsin, Subrina, Medeiros, Stephen, Singh, Arvind, Mayo, Talea, University of Central Florida
- Abstract / Description
-
Apalachicola Bay in the Florida panhandle is home to a rich variety of saltwater and freshwater wetlands, but it is also subject to a wide range of extreme hydrologic events such as hurricanes and droughts, which continuously threaten the area. The impact of hurricanes and drought on both fresh and salt water wetlands was investigated over the period from 2000 to 2015 in Apalachicola Bay using spatio-temporal changes in Landsat-based NDVI. Results indicate that saltwater wetlands were more resilient than freshwater wetlands. Results also suggest that, in response to hurricanes, the coastal wetlands took almost a year to recover, while recovery following a drought period was observed after only a month. This analysis was successful and provided excellent insights into coastal wetland health. Such a long-term study is heavily dependent on optical sensors, which are subject to data loss due to cloud coverage. Therefore, a novel method is proposed and demonstrated to recover the information contaminated by cloud. Cloud contamination is a hindrance to long-term environmental assessment using information derived from satellite imagery acquired in the visible and infrared spectral ranges. The Normalized Difference Vegetation Index (NDVI) is a widely used index for monitoring vegetation and land-use change. NDVI can be retrieved from publicly available data repositories of optical sensors such as Landsat and the Moderate Resolution Imaging Spectroradiometer (MODIS), as well as several commercial satellites. Landsat has an ongoing high-resolution NDVI record starting from 1984. Unfortunately, the time-series NDVI data suffer from cloud contamination. Though simple to complex computational methods for data interpolation have been applied to recover cloudy data, all of these techniques are subject to many limitations.
In this work, a novel Optical Cloud Pixel Recovery (OCPR) method is proposed to repair cloudy pixels from the time-space-spectrum continuum with the aid of a machine-learning tool, namely a random forest (RF) trained and tested using multi-parameter hydrologic data. The RF-based OCPR model was compared with a simple linear regression (LR) based OCPR model to understand the potential of the model. A case study in Apalachicola Bay is presented to evaluate the performance of OCPR in repairing cloudy NDVI reflectance for two specific dates. The RF-based OCPR method achieves a root mean squared error of 0.0475 sr⁻¹ between predicted and observed NDVI reflectance values; the LR-based OCPR method achieves a root mean squared error of 0.1257 sr⁻¹. The findings suggest that the RF-based OCPR method is effective at repairing cloudy values and provides continuous and quantitatively reliable imagery for further analysis in environmental applications.
- Date Issued
- 2016
- Identifier
- CFE0006546, ucf:51331
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006546
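The gap-filling idea behind OCPR can be sketched with its simpler LR baseline. The code below is a hypothetical illustration, not the thesis's model: cloud-masked NDVI samples (marked `None`) are filled by regressing the clear-sky samples on one co-located hydrologic covariate; the RF variant would swap the line fit for a random-forest regressor over multiple covariates.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def repair_cloudy(ndvi, covariate):
    """Fill cloud-masked (None) NDVI samples: fit on clear-sky pairs,
    then predict the gaps from the covariate."""
    clear = [(c, v) for c, v in zip(covariate, ndvi) if v is not None]
    a, b = fit_line([c for c, _ in clear], [v for _, v in clear])
    return [v if v is not None else a + b * c
            for v, c in zip(ndvi, covariate)]

water_level = [0.2, 0.5, 0.9, 1.4, 1.8, 2.3]       # hypothetical covariate
ndvi_series = [0.31, 0.37, None, 0.55, None, 0.73]  # None = cloudy pixel
repaired = repair_cloudy(ndvi_series, water_level)
```

Clear-sky observations pass through untouched; only the masked entries are replaced by predictions, so the repaired series stays consistent with the original record.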
- Title
- ROLLBACK-ABLE RANDOM NUMBER GENERATORS FOR THE SYNCHRONOUS PARALLEL ENVIRONMENT FOR EMULATION AND DISCRETE-EVENT SIMULATION (SPEEDES).
- Creator
-
Narayanan, Ramaswamy Karthik, Rabelo, Luis, University of Central Florida
- Abstract / Description
-
Random numbers form the heart and soul of a discrete-event simulation system. There are few situations where the actions of the entities in the process being simulated can be completely predicted in advance; real-world processes are more probabilistic than deterministic. Hence, such uncertainty is represented in the system using various statistical models, such as random number generators. These random number generators can be used to represent a variety of factors, such as the length of a queue. However, simulations have grown in size and are sometimes required to run on multiple machines, which share the various methods or events in the simulation among themselves. These machines can be distributed across a LAN or even the Internet. In such cases, to preserve the validity of the simulation model, we need rollback-able random number generators. This thesis is an effort to develop such rollback-able random number generators for the Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) developed by NASA. These rollback-able random number generators also add several statistical distribution models to the already rich SPEEDES library.
- Date Issued
- 2005
- Identifier
- CFE0000328, ucf:46292
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000328
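The rollback requirement described in this abstract boils down to being able to checkpoint and restore generator state so that, after an optimistic simulator rolls back an event, replayed events draw exactly the same random numbers. The sketch below is an illustrative design, not SPEEDES code; the Park-Miller LCG parameters are just a convenient choice.

```python
class RollbackRNG:
    """A linear congruential generator whose state can be checkpointed
    and rolled back, as optimistic parallel discrete-event simulators
    require when an event is undone and replayed."""

    M, A = 2**31 - 1, 48271  # Park-Miller parameters (illustrative)

    def __init__(self, seed=1):
        self.state = seed % self.M or 1
        self._checkpoints = []

    def next_uniform(self):
        """Advance the state and return a uniform draw in (0, 1)."""
        self.state = (self.A * self.state) % self.M
        return self.state / self.M

    def checkpoint(self):
        """Save the current state; return a token for rollback."""
        self._checkpoints.append(self.state)
        return len(self._checkpoints) - 1

    def rollback(self, token):
        """Restore the state saved at `token`, discarding later checkpoints."""
        self.state = self._checkpoints[token]
        del self._checkpoints[token + 1:]

rng = RollbackRNG(seed=42)
before = [rng.next_uniform() for _ in range(3)]
tok = rng.checkpoint()
first_run = [rng.next_uniform() for _ in range(5)]
rng.rollback(tok)
replay = [rng.next_uniform() for _ in range(5)]  # identical to first_run
```

Because the entire generator state is one integer, checkpoints are cheap, which is what makes per-event saving practical in an optimistic simulator.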
- Title
- USING LOW-COHERENCE INTERFEROMETRY TO MONITOR CELL INVASION IN AN IN-VITRO MODEL SYSTEM.
- Creator
-
Davoudi Nasab, Behnaz, Dogariu, Aristide, Andl, Claudia, University of Central Florida
- Abstract / Description
-
In an optically random system, such as naturally occurring and man-made media, light undergoes pronounced multiple scattering. This phenomenon has shown remarkable potential for characterizing complex materials. In this regime, scattering occurs at each individual scattering center, and independent scattering events lead to multiple light scattering. This process is often described as a random walk of photons and can be modeled in terms of a diffusion equation based on radiative transfer theory. In this thesis, we used optical path-length spectroscopy (OPS), an experimental method for obtaining the path-length probability density of light propagating in multiple-scattering media, with a low-coherence optical field to investigate the distribution of photon path lengths in a skin cell model system. This method is capable of measuring the transport mean free path of light in a highly scattering medium and depth-resolved profiles of the backscattered light. Our OPS experimental configuration is based on a fiber-optic Michelson interferometer geometry using single-mode optical fibers. We performed OPS based on low-coherence interferometry (LCI) on three-dimensional organotypic models of esophageal cell invasion by measuring the optical path-length distribution of backscattered light in normal and invasive conditions. The optical path-length distribution of light waves inside the cell samples provides information on how a change in the extracellular matrix affects the invasiveness of the esophageal cells and the induction of signaling pathways. We also demonstrated the method's suitability for studying structural changes in in vitro cell samples over a two-week period.
- Date Issued
- 2017
- Identifier
- CFH2000219, ucf:45955
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH2000219
- Title
- SIMULATION OF RANDOM SET COVERING PROBLEMS WITH KNOWN OPTIMAL SOLUTIONS AND EXPLICITLY INDUCED CORRELATIONS AMONG COEFFICIENTS.
- Creator
-
Sapkota, Nabin, Reilly, Charles, University of Central Florida
- Abstract / Description
-
The objective of this research is to devise a procedure to generate random Set Covering Problem (SCP) instances with known optimal solutions and correlated coefficients. The procedure presented in this work can generate a virtually unlimited number of SCP instances with known optimal solutions and realistic characteristics, thereby facilitating testing of the performance of SCP heuristics and algorithms. A four-phase procedure based on the Karush-Kuhn-Tucker (KKT) conditions is proposed to generate such instances. Given randomly generated values for the objective function coefficients and the sum of the binary constraint coefficients for each variable, and a randomly selected optimal solution, the procedure: (1) calculates the range for the number of possible constraints, (2) generates constraint coefficients for the variables with value 1 in the optimal solution, (3) assigns values to the dual variables, and (4) generates constraint coefficients for variables with value 0 in the optimal solution so that the KKT conditions are satisfied. A computational demonstration of the procedure is provided. A total of 525 SCP instances are simulated under seven correlation levels and three levels for the number of constraints. Each of these instances is solved using three simple heuristic procedures. The performance of the heuristics on the generated SCP instances is summarized and analyzed. The performance of the heuristics generally worsens as the expected correlation between the coefficients increases and as the number of constraints increases. The results provide strong evidence of the benefits of the procedure for generating SCP instances with correlated coefficients, and in particular SCP instances with known optimal solutions.
- Date Issued
- 2006
- Identifier
- CFE0001416, ucf:47037
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001416
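The "simple heuristic procedures" benchmarked against such generated instances are typically variants of the classic greedy rule for set covering. The sketch below is a generic illustration, not one of the three heuristics from the dissertation: at each step it picks the column with the lowest cost per newly covered row.

```python
def greedy_set_cover(costs, columns, n_rows):
    """Greedy SCP heuristic: repeatedly choose the column minimizing
    cost per newly covered row until all rows are covered."""
    uncovered = set(range(n_rows))
    chosen, total = [], 0.0
    while uncovered:
        best = min(
            (j for j in range(len(columns)) if columns[j] & uncovered),
            key=lambda j: costs[j] / len(columns[j] & uncovered),
        )
        chosen.append(best)
        total += costs[best]
        uncovered -= columns[best]
    return chosen, total

# Tiny instance: rows 0-2; column j covers the rows in columns[j].
costs = [2.0, 2.0, 5.0, 1.0]
columns = [{0, 1}, {1, 2}, {0, 1, 2}, {2}]
cover, cost = greedy_set_cover(costs, columns, n_rows=3)  # picks {0,1} then {2}
```

On instances generated with a known optimum, comparing `cost` against the planted optimal objective value gives exactly the kind of heuristic performance gap the dissertation measures.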
- Title
- CONTRIBUTIONS TO AUTOMATIC PARTICLE IDENTIFICATION IN ELECTRON MICROGRAPHS: ALGORITHMS, IMPLEMENTATION, AND APPLICATIONS.
- Creator
-
Singh, Vivek, Marinescu, Dan, University of Central Florida
- Abstract / Description
-
Three-dimensional reconstruction of large macromolecules such as viruses at resolutions below 8 Å to 10 Å requires a large set of projection images, and the particle identification step becomes a bottleneck. Several automatic and semi-automatic particle detection algorithms have been developed over the years. We present a general technique designed to automatically identify the projection images of particles. The method utilizes Markov random field modelling of the projected images and involves preprocessing of electron micrographs followed by image segmentation and post-processing for boxing of the particle projections. Due to the typically extensive computational requirements of extracting hundreds of thousands of particle projections, parallel processing becomes essential. We present parallel algorithms and load-balancing schemes for our algorithms. The lack of a standard benchmark for relative performance analysis of particle identification algorithms has prompted us to develop a benchmark suite. Further, we present a collection of metrics for the relative performance analysis of particle identification algorithms on the micrograph images in the suite, and discuss the design of the benchmark suite.
- Date Issued
- 2005
- Identifier
- CFE0000705, ucf:46610
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000705
- Title
- CONTROLLING RANDOMNESS: USING PROCEDURAL GENERATION TO INFLUENCE PLAYER UNCERTAINTY IN VIDEO GAMES.
- Creator
-
Fort, Travis, McDaniel, Rudy, University of Central Florida
- Abstract / Description
-
As video games increase in complexity and length, the use of automatic, or procedural, content generation has become a popular way to reduce the stress on game designers. However, the use of procedural generation has certain consequences; in many instances, what the computer generates is uncertain to the designer. The intent of this thesis is to demonstrate how procedural generation can be used to intentionally affect the embedded randomness of a game system, enabling game designers to influence the level of uncertainty a player experiences in a nuanced way. This affords game designers direct control over complex problems like dynamic difficulty adjustment, pacing, and accessibility. Game design is examined from the perspective of uncertainty and how procedural generation can be used to intentionally add or reduce it. Various procedural generation techniques are discussed alongside the types of uncertainty, using case studies to demonstrate the principles in action.
- Date Issued
- 2015
- Identifier
- CFH0004772, ucf:45386
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004772
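One common way to give a designer a dial over player-facing uncertainty is to interpolate between authored weights and pure randomness. The sketch below is a hypothetical illustration of that idea, not a technique from the thesis: an `uncertainty` parameter blends designer-authored drop weights (assumed normalized to sum to 1) with a uniform distribution.

```python
import random

def pick_item(items, weights, uncertainty, rng):
    """Blend authored weights with a uniform distribution.

    uncertainty = 0.0 -> deterministic: always the top-weighted item;
    uncertainty = 1.0 -> uniform random choice among all items.
    Weights are assumed normalized so the blend stays a distribution.
    """
    if uncertainty <= 0.0:
        return max(zip(items, weights), key=lambda pair: pair[1])[0]
    uniform = 1.0 / len(items)
    blended = [(1.0 - uncertainty) * w + uncertainty * uniform
               for w in weights]
    return rng.choices(items, weights=blended, k=1)[0]

rng = random.Random(99)
loot = ["common", "rare", "epic"]
drop_weights = [0.7, 0.2, 0.1]
certain = pick_item(loot, drop_weights, 0.0, rng)  # always "common"
chaotic = pick_item(loot, drop_weights, 1.0, rng)  # any of the three
```

Exposing `uncertainty` as a single tunable value is one way a generator could support the difficulty, pacing, or accessibility adjustments the abstract mentions.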
- Title
- RESPONSE SENSITIVITY OF HIGHWAY BRIDGES TO RANDOM MULTI-COMPONENT EARTHQUAKE EXCITATION.
- Creator
-
Cronin, Kyle, Mackie, Kevin, University of Central Florida
- Abstract / Description
-
Highway bridges provide a critical lifeline during extreme seismic events and must maintain serviceability under a large range of earthquake intensities. Consequently, the advent of more computational power has allowed more advanced analysis approaches for predicting performance and vulnerability of highway bridges under these seismic loads. In traditional two-dimensional finite element analyses, it has been demonstrated that the incidence angle of the ground motion can play a significant...
Show moreHighway bridges provide a critical lifeline during extreme seismic events and must maintain serviceability under a large range of earthquake intensities. Consequently, the advent of more computational power has allowed more advanced analysis approaches for predicting performance and vulnerability of highway bridges under these seismic loads. In traditional two-dimensional finite element analyses, it has been demonstrated that the incidence angle of the ground motion can play a significant role in structural response. As three-dimensional nonlinear time history analyses are used more frequently in practice, ground motions are still usually applied along a single bridge axis. It is unknown how three orthogonal components of ground motion excitation should be applied to the structure to best represent the true response. In this study, the fundamental behavior of three-dimensional ground motion was studied using single-degree-of-freedom elastic spectra. Mean spectra computed from various orientation techniques were found indistinguishable when the orthogonal components were combined. The effect of incidence angle on the nonlinear structural response of highway bridges was then investigated through extensive statistical simulation. Three different bridge models were employed for this study implementing a suite of 180 multi-component ground motion records of various magnitude-distance-soil bins. Probabilistic seismic demand models for various response parameters are presented comparing the effects of random incidence angle to that of recorded directions. Although there are instances where the angle of incidence can significantly amplify response, results indicated that incidence angle had negligible effect on average ensemble response. This is consistent with results from the spectral analysis, although existing literature has emphasized incidence angle as a significant parameter of multi-component analysis.
- Date Issued
- 2009
- Identifier
- CFE0002933, ucf:47973
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002933
- Title
- COHERENCE PROPERTIES OF OPTICAL NEAR-FIELDS.
- Creator
-
Apostol, Adela, Dogariu, Aristide, University of Central Florida
- Abstract / Description
-
Next generation photonics-based technologies will ultimately rely on novel materials and devices. For this purpose, phenomena at subwavelength scales are being studied to advance both fundamental knowledge and experimental capabilities. In this dissertation, concepts specific to near-field optics and experimental capabilities specific to near-field microscopy are used to investigate various aspects of the statistical properties of random electromagnetic fields in the vicinity of optically inhomogeneous media which emit or scatter radiation. The properties of such fields are characterized within the framework of coherence theory. While successful in describing the far-field properties of optical fields, the fundamental results of conventional coherence theory disregard the contribution of short-range evanescent waves. Nonetheless, the specific features of random fields at subwavelength distances from interfaces of real media are influenced by the presence of evanescent waves because, in this case, both propagating and nonpropagating components contribute to the detectable properties of the radiation. In our studies, we have fully accounted for both contributions and, as a result, different surface and subsurface characteristics of inhomogeneous media could be explored. We investigated different properties of random optical near-fields which exhibit either Gaussian or non-Gaussian statistics. We have demonstrated that characteristics of optical radiation such as the first- and second-order statistics of intensity and the spectral density in the vicinity of random media are determined by both the contribution of evanescent waves and the statistical properties of the physical interface.
For instance, we quantified the subtle differences which exist between the near- and far-field spectra of radiation, and we provided the first experimental evidence that, contrary to the predictions of conventional coherence theory, the values of the coherence length in the near field depend on the distance from the interface and, moreover, can be smaller than the wavelength of light. The results included in this dissertation demonstrate that the statistical properties of the electromagnetic fields which exist in the close proximity of inhomogeneous media can be used to extract structural information. They also suggest the possibility of adjusting the coherence properties of the emitted radiation by modifying the statistical properties of the interfaces. Understanding the random interference phenomena in the near field could also lead to new possibilities for surface and subsurface diagnostics of inhomogeneous media. In addition, controlling the statistical properties of radiation at subwavelength scales should be of paramount importance in the design of miniaturized optical sources, detectors and sensors.
- Date Issued
- 2005
- Identifier
- CFE0000408, ucf:46410
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000408
- Title
- ANALYZING THE COMMUNITY STRUCTURE OF WEB-LIKE NETWORKS: MODELS AND ALGORITHMS.
- Creator
-
Cami, Aurel, Deo, Narsingh, University of Central Florida
- Abstract / Description
-
This dissertation investigates the community structure of web-like networks (i.e., large, random, real-life networks such as the World Wide Web and the Internet). Recently, it has been shown that many such networks have a locally dense and globally sparse structure, with certain small, dense subgraphs occurring much more frequently than they do in the classical Erdős–Rényi random graphs. This peculiarity--which is commonly referred to as community structure--has been observed in seemingly unrelated networks such as the Web, email networks, citation networks, biological networks, etc. The pervasiveness of this phenomenon has led many researchers to believe that such cohesive groups of nodes might represent meaningful entities. For example, in the Web such tightly-knit groups of nodes might represent pages with a common topic, geographical location, etc., while in neural networks they might represent evolved computational units. The notion of community has emerged in an effort to formalize the empirical observation of the locally dense, globally sparse structure of web-like networks. In the broadest sense, a community in a web-like network is defined as a group of nodes that induces a dense subgraph which is sparsely linked with the rest of the network. Due to a wide array of envisioned applications, ranging from crawlers and search engines to network security and network compression, there has recently been widespread interest in finding efficient community-mining algorithms. In this dissertation, the community structure of web-like networks is investigated by a combination of analytical and computational techniques: First, we consider the problem of modeling web-like networks. In recent years, many new random graph models have been proposed to account for some recently discovered properties of web-like networks that distinguish them from the classical random graphs.
The vast majority of these random graph models take into account only the addition of new nodes and edges. Yet, several empirical observations indicate that deletion of nodes and edges occurs frequently in web-like networks. Inspired by such observations, we propose and analyze two dynamic random graph models that combine node and edge addition with a uniform and a preferential deletion of nodes, respectively. In both cases, we find that the random graphs generated by such models follow power-law degree distributions (in agreement with the degree distribution of many web-like networks). Second, we analyze the expected density of certain small subgraphs--such as defensive alliances on three and four nodes--in various random graph models. Our findings show that while in the binomial random graph the expected density of such subgraphs is very close to zero, in some dynamic random graph models it is much larger. These findings converge with our results obtained by computing the number of communities in some Web crawls. Next, we investigate the computational complexity of the community-mining problem under various definitions of community. Assuming the definition of community as a global defensive alliance or a global offensive alliance, we prove--using transformations from the dominating set problem--that finding optimal communities is an NP-complete problem. These and other similar complexity results, coupled with the fact that many web-like networks are huge, indicate that it is unlikely that fast, exact sequential algorithms for mining communities can be found. To handle this difficulty we adopt an algorithmic definition of community and a simpler version of the community-mining problem, namely: find the largest community to which a given set of seed nodes belongs.
We propose several greedy algorithms for this problem: The first proposed algorithm starts out with a set of seed nodes--the initial community--and then repeatedly selects some nodes from the community's neighborhood and pulls them into the community. In each step, the algorithm uses the clustering coefficient--a parameter that measures the fraction of the neighbors of a node that are neighbors themselves--to decide which nodes from the neighborhood should be pulled into the community. This algorithm has time complexity of order O(nd), where n denotes the number of nodes visited by the algorithm and d is the maximum degree encountered. Thus, assuming a power-law degree distribution, this algorithm is expected to run in near-linear time. The proposed algorithm achieved good accuracy when tested on some real and computer-generated networks: the fraction of community nodes classified correctly is generally above 80% and often above 90%. A second algorithm, based on a generalized clustering coefficient where not only the first neighborhood is taken into account but also the second, the third, etc., is also proposed. This algorithm achieves better accuracy than the first one but also runs slower. Finally, a randomized version of the second algorithm, which improves the time complexity without significantly affecting the accuracy, is proposed. The main target application of the proposed algorithms is focused crawling--the selective search for web pages that are relevant to a pre-defined topic.
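The first greedy algorithm described above can be sketched as follows (the adjacency list, the threshold value, and the stopping rule are invented for illustration; the dissertation's exact selection criterion may differ):

```python
from itertools import combinations

def clustering_coefficient(adj, v):
    """Fraction of a node's neighbor pairs that are themselves adjacent."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(sorted(nbrs), 2) if b in adj[a])
    return 2.0 * links / (k * (k - 1))

def grow_community(adj, seeds, threshold=0.5):
    """Greedily pull frontier nodes with a high clustering coefficient into the community."""
    community = set(seeds)
    while True:
        frontier = {n for v in community for n in adj[v]} - community
        pulled = {n for n in frontier if clustering_coefficient(adj, n) >= threshold}
        if not pulled:
            return community
        community |= pulled

# toy graph: a 4-clique {1, 2, 3, 4} loosely attached to a path 4-5-6
adj = {1: {2, 3, 4}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {1, 2, 3, 5},
       5: {4, 6}, 6: {5}}
found = grow_community(adj, {1})   # expands to the clique, leaves the path out
```

The loop visits each pulled node's neighborhood once and scores each frontier node against its own neighbor list, which is the source of the degree factor in the running-time bound.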
- Date Issued
- 2005
- Identifier
- CFE0000900, ucf:46726
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000900
- Title
- A deep learning approach to diagnosing schizophrenia.
- Creator
-
Barry, Justin, Valliyil Thankachan, Sharma, Gurupur, Varadraj, Jha, Sumit Kumar, Ewetz, Rickard, University of Central Florida
- Abstract / Description
-
In this article, the investigators present a new method for diagnosing schizophrenia using a deep learning approach. In the experiment presented, the investigators used a secondary dataset provided by the National Institutes of Health. The experimentation involves analyzing this dataset for the existence of schizophrenia using traditional machine learning approaches such as logistic regression, support vector machines, and random forests. This is followed by the application of deep learning techniques using three hidden layers in the model. The results obtained indicate that deep learning provides state-of-the-art accuracy in diagnosing schizophrenia. Based on these observations, there is a possibility that deep learning may provide a paradigm shift in diagnosing schizophrenia.
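A minimal sketch of the kind of network the abstract describes, with three hidden layers (the input width, layer sizes, weight initialization, and activations are assumptions for illustration; the study's actual architecture and training procedure are not specified here):

```python
import math
import random

random.seed(0)

def layer(n_in, n_out):
    """Random weight matrix for a fully connected layer (no biases, for brevity)."""
    return [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_out)]

def forward(x, weights):
    """Forward pass: ReLU on the hidden layers, sigmoid on the output."""
    h = x
    for i, w in enumerate(weights):
        z = [sum(wij * hj for wij, hj in zip(row, h)) for row in w]
        if i < len(weights) - 1:
            h = [max(0.0, v) for v in z]                 # ReLU hidden units
        else:
            h = [1.0 / (1.0 + math.exp(-v)) for v in z]  # sigmoid output unit
    return h

# three hidden layers (16, 8, 4 units) between input and output,
# matching the "three hidden layers" of the abstract; sizes are illustrative
net = [layer(10, 16), layer(16, 8), layer(8, 4), layer(4, 1)]
prob = forward([0.1] * 10, net)[0]   # untrained "probability" of a positive diagnosis
```

In practice the weights would be learned by backpropagation on the labeled dataset rather than drawn at random.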
- Date Issued
- 2019
- Identifier
- CFE0007429, ucf:52737
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007429
- Title
- NEAR-FIELD OPTICAL INTERACTIONS AND APPLICATIONS.
- Creator
-
Haefner, David, Dogariu, Aristide, University of Central Florida
- Abstract / Description
-
The propagation symmetry of electromagnetic fields is affected by encounters with material systems. The effects of such interactions, for example, modifications of intensity, phase, polarization, angular spectrum, frequency, etc. can be used to obtain information about the material system. However, the propagation of electromagnetic waves imposes a fundamental limit to the length scales over which the material properties can be observed. In the realm of near-field optics, this limitation is overcome only through a secondary interaction that couples the high-spatial-frequency (but non-propagating) field components to propagating waves that can be detected. The available information depends intrinsically on this secondary interaction, which constitutes the topic of this study. Quantitative measurements of material properties can be performed only by controlling the subtle characteristics of these processes. This dissertation discusses situations where the effects of near-field interactions can be (i) neglected in certain passive testing techniques, (ii) exploited for active probing of static or dynamic systems, or (iii) statistically isolated when considering optically inhomogeneous materials. This dissertation presents novel theoretical developments, experimental measurements, and numerical results that elucidate the vectorial aspects of the interaction between light and nano-structured material for use in sensing applications.
- Date Issued
- 2010
- Identifier
- CFE0003095, ucf:48318
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003095
- Title
- USING LANDSCAPE GENETICS TO ASSESS POPULATION CONNECTIVITY IN A HABITAT GENERALIST.
- Creator
-
Hether, Tyler, Hoffman, Eric, University of Central Florida
- Abstract / Description
-
Understanding the nature of genetic variation in natural populations is an underlying theme of population genetics. In recent years, population genetics has benefited from the incorporation of landscape and environmental data into pre-existing models of isolation by distance (IBD) to elucidate features influencing spatial genetic variation. Many of these landscape genetics studies have focused on populations separated by discrete barriers (e.g., mountain ridges) or on species with specific habitat requirements (i.e., habitat specialists). One difficulty in using a landscape genetics approach for taxa with less stringent habitat requirements (i.e., generalists) is the lack of obvious barriers to gene flow and of preferences for specific habitats. My study attempts to fill this information gap and to understand the mechanisms underlying population subdivision in generalists, using the squirrel treefrog (Hyla squirella) and a system for classifying 'terrestrial ecological systems' (i.e., habitat types). I evaluate this dataset with microsatellite markers and a recently introduced method based on ensemble learning (Random Forest) to identify whether spatial distance, habitat types, or both have influenced genetic connectivity among 20 H. squirella populations. Next, I hierarchically subset the populations included in the analysis based on (1) genetic assignment tests and (2) Mantel correlograms to determine the relative role of spatial distance in shaping landscape genetic patterns. Assignment tests show evidence of two genetic clusters that separate populations in Florida's panhandle (Western cluster) from those in peninsular Florida and southern Georgia (Eastern cluster). Mantel correlograms suggest a patch size of approximately 150 km. Landscape genetic analyses at all three spatial scales yielded improved model fit relative to isolation by distance when habitat types were included.
A hierarchical effect was identified whereby spatial distance (km) was the strongest predictor of patterns of genetic differentiation above the scale of the genetic patch. Below the genetic patch, spatial distance was still an explanatory variable but was only approximately 30% as relevant as mesic flatwoods or upland oak hammocks. Thus, it appears that habitat types largely influence patterns of population genetic connectivity at local scales, but the signal of IBD becomes the dominant driver of regional connectivity. My results highlight some habitats as highly relevant to increased genetic connectivity at all spatial scales (e.g., upland oak hammocks) while others show no association (e.g., silviculture) or scale-specific associations (e.g., pastures only at global scales). Given these results, it appears that treating habitat as a binary metric (suitable/non-suitable) may be overly simplistic for generalist species in which gene flow probably occurs across a spectrum of habitat suitability. The overall pattern of spatial genetic and landscape genetic structure identified here provides insight into the evolutionary history and patterns of population connectivity for H. squirella and improves our understanding of the role of matrix composition for habitat generalists.
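Mantel correlograms build on the Mantel test, which correlates the entries of two distance matrices and assesses significance by permuting the row/column labels of one of them. A minimal sketch, with tiny toy matrices invented to mimic isolation by distance:

```python
import random

def _offdiag(m):
    """Upper-triangle entries of a symmetric distance matrix, as a flat list."""
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def _pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def mantel(d_gen, d_geo, perms=999, seed=0):
    """Mantel test: observed correlation plus a label-permutation p-value."""
    rng = random.Random(seed)
    r_obs = _pearson(_offdiag(d_gen), _offdiag(d_geo))
    n = len(d_geo)
    hits = 0
    for _ in range(perms):
        p = list(range(n))
        rng.shuffle(p)
        permuted = [[d_geo[p[i]][p[j]] for j in range(n)] for i in range(n)]
        if _pearson(_offdiag(d_gen), _offdiag(permuted)) >= r_obs:
            hits += 1
    return r_obs, (hits + 1) / (perms + 1)

# toy 4-population example: genetic distance tracks geographic distance (pure IBD)
d_geo = [[0, 1, 2, 3], [1, 0, 1, 2], [2, 1, 0, 1], [3, 2, 1, 0]]
d_gen = [[0, 2, 4, 6], [2, 0, 2, 4], [4, 2, 0, 2], [6, 4, 2, 0]]
r, p = mantel(d_gen, d_geo)
```

A correlogram repeats this test within successive geographic-distance classes, which is how a genetic patch size such as the ~150 km reported above can be read off.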
- Date Issued
- 2010
- Identifier
- CFE0003204, ucf:48580
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003204
- Title
- Analysis of Driving Behavior at Expressway Toll Plazas using Driving Simulator.
- Creator
-
Saad, Moatz, Abdel-Aty, Mohamed, Eluru, Naveen, Lee, JaeYoung, University of Central Florida
- Abstract / Description
-
The objective of this study is to analyze driving behavior at toll plazas by examining multiple scenarios in a driving simulator, studying the effect of different options, including different path decisions, various signs, arrow markings, traffic conditions, and extended auxiliary lanes before and after the toll plaza, on driving behavior. This study also focuses on investigating the effect of drivers' characteristics on dangerous driving behavior (e.g., speed variation, sudden lane changes, driver confusion). Safety and efficiency are the fundamental goals that transportation engineering always seeks in the design of highways. Transportation agencies face a crucial and challenging task in achieving traffic safety, particularly at locations that have been identified as crash hotspots. In fact, toll plaza locations are among the most critical and challenging areas that expressway agencies must pay attention to, because of the increase in crashes near toll plazas over the past years. Drivers are required to make many decisions at expressway toll plazas, which results in driver confusion, speed variation, and abrupt lane change maneuvers. These crucial decisions are mainly influenced by three factors. First, the limited distance between toll plazas and the merging areas at the on-ramps before the toll plazas, in addition to the limited distance between toll plazas and the diverging areas at the off-ramps after the toll plazas. Second, the location and configuration of signage and pavement markings. Third, the different lane configurations and tolling systems, which can cause driver confusion and stress. Nevertheless, limited studies have explored the factors that influence driving behavior and safety at toll plazas.
There are three main toll plaza systems: the traditional mainline toll plaza (TMTP), the hybrid mainline toll plaza (HMTP), and all-electronic toll collection (AETC). Recently, in order to improve the safety and efficiency of toll plazas, most traditional mainline toll plazas have been converted to hybrid toll plazas or all-electronic toll collection plazas. This study assessed driving behavior at a section including a toll plaza on one of the main expressways in Central Florida. The toll plaza is located between a close on-ramp and a nearby off-ramp; these close distances have a significant effect on increasing driver confusion and unexpected lane changes before and after the toll plaza. Driving simulator experiments were used to study driving behavior at, before, and after the toll plaza. The details of the section and the plaza were accurately replicated in the simulator. In the driving simulator experiment, seventy-two drivers from different age groups participated. Subsequently, each driver performed three separate scenarios out of a total of twenty-four scenarios. Seven risk indicators were extracted from the driving simulator data using MATLAB software. These variables are average speed, standard deviation of speed, standard deviation of lane deviation, acceleration rate, standard deviation of acceleration (acceleration noise), deceleration rate, and standard deviation of deceleration (braking action variation). Moreover, various scenario variables were tested in the driving simulator, including different paths, signage, pavement markings, traffic conditions, and extended auxiliary lanes before and after the toll plaza. Drivers' individual characteristics were collected with a questionnaire before the experiment. Also, drivers filled out a questionnaire after each scenario to check for simulator sickness or discomfort.
Nine variables representing individual characteristics were extracted from the simulation questionnaire, including age, gender, education level, annual income, crash experience, professional drivers, ETC-tag use, driving frequency, and novice international drivers. A series of mixed linear models with random effects, included to account for multiple observations from the same participant, were developed to reveal the contributing factors that affect driving behavior at toll plazas. The results uncovered that drivers who drove through the open road tolling (ORT) lanes showed higher speed and lower speed variation, lane deviation, and acceleration noise than drivers who navigated through the tollbooth. Also, the results revealed that providing adequate signage and pavement markings is effective in reducing risky driving behavior at toll plazas. Drivers tend to drive with less lane deviation and acceleration noise before the toll plaza when arrow pavement markings are installed. Adding a dynamic message sign (DMS) at the on-ramp has a significant effect on reducing speed variation before the toll plaza. Likewise, removing the third overhead sign before the toll plaza has a considerable influence on reducing aggressive driving behavior before and after the toll plaza. This result may reflect drivers' desire to be less confused by excessive signs and markings. In addition, extending auxiliary lanes by 660 feet (0.125 miles) before or after the toll plaza increases the average speed and reduces lane deviation and speed variation at and before the toll plaza; it also increases the acceleration noise and braking action variation after the toll plaza. Finally, it was found that in congested conditions, participants drive with lower speed variation and lane deviation before the toll plaza but with higher acceleration noise after the toll plaza.
On the other hand, understanding drivers' characteristics is particularly important for exploring their effect on risky driving behavior. Young drivers (18-25) and older drivers (older than 50 years) consistently showed higher-risk behavior than middle-aged drivers (35 to 50). Also, it was found that male drivers are riskier than female drivers at toll plazas. Drivers with a high education level, drivers with high income, ETC-tag users, and drivers whose driving frequency is less than three trips per day are more cautious and tend to drive at a lower speed.
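The role of the random effects in those mixed linear models can be illustrated with a small simulation (all numbers are invented): each participant drives several scenarios, so observations share a driver-specific speed offset, and a per-driver random intercept is what lets the model separate that habitual offset from fixed effects such as the lane type.

```python
import random
import statistics

random.seed(1)

# simulate 50 drivers, 3 scenarios each: one ORT pass and two tollbooth passes
rows = []
for driver in range(50):
    u = random.gauss(0, 5)        # random intercept: driver's habitual speed offset (mph)
    for scenario in range(3):
        ort = (scenario == 0)     # fixed effect: open road tolling raises speed by 8 mph
        speed = 55 + (8 if ort else 0) + u + random.gauss(0, 2)
        rows.append((driver, ort, speed))

# because every driver appears in both conditions, the random intercepts cancel
# in the between-condition contrast, recovering the fixed ORT effect
ort_mean = statistics.mean(s for _, o, s in rows if o)
booth_mean = statistics.mean(s for _, o, s in rows if not o)
ort_effect = ort_mean - booth_mean   # close to the simulated 8 mph
```

A full mixed model generalizes this balanced-contrast intuition to unbalanced designs and several fixed effects at once, estimating the driver-level variance explicitly.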
- Date Issued
- 2016
- Identifier
- CFE0006492, ucf:51391
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006492
- Title
- ANALYSES OF CRASH OCCURRENCE AND INJURY SEVERITIES ON MULTI-LANE HIGHWAYS USING MACHINE LEARNING ALGORITHMS.
- Creator
-
Das, Abhishek, Abdel-Aty, Mohamed A., University of Central Florida
- Abstract / Description
-
Reduction of crash occurrence at various roadway locations (mid-block segments; signalized intersections; un-signalized intersections) and the mitigation of injury severity in the event of a crash are the major concerns of transportation safety engineers. Multi-lane arterial roadways (excluding freeways and expressways) account for forty-three percent of fatal crashes in the state of Florida. Significant contributing causes fall under the broad categories of aggressive driver behavior; adverse weather and environmental conditions; and roadway geometric and traffic factors. The objective of this research was the implementation of innovative, state-of-the-art analytical methods to identify the contributing factors for crashes and injury severity. Advances in computational methods enable the use of modern statistical and machine learning algorithms. Even though most of the contributing factors are known a priori, advanced methods unearth changing trends. Heuristic evolutionary processes such as genetic programming; sophisticated data mining methods like the conditional inference tree; and mathematical treatments in the form of sensitivity analyses outline the major contributions of this research. Application of traditional statistical methods like simultaneous ordered probit models, and the identification and resolution of crash data problems, are also key aspects of this study. In order to eliminate the use of an unrealistic uniform intersection influence radius of 250 ft, heuristic rules were developed for assigning crashes to roadway segments, signalized intersections and access points using parameters such as 'site location', 'traffic control' and node information. Use of the Conditional Inference Forest instead of Classification and Regression Trees to identify variables of significance for the injury severity analysis removed the bias towards the selection of continuous variables or variables with a large number of categories.
For the injury severity analysis of crashes on highways, the corridors were clustered into four optimum groups. The optimum number of clusters was found using the Partitioning Around Medoids algorithm. Concepts from evolutionary biology, like crossover and mutation, were implemented to develop models for classification and regression analyses based on the highest hit rate and minimum error rate, respectively. A low crossover rate and a higher mutation rate reduce the chances of genetic drift and bring novelty to the model development process. Annual daily traffic; friction coefficient of pavements; on-street parking; curbed medians; surface and shoulder widths; and alcohol/drug usage are some of the significant factors that played a role in both crash occurrence and injury severities. Relative sensitivity analyses were used to identify the effect of continuous variables on the variation of crash counts. This study improved the understanding of the significant factors that could play an important role in designing better safety countermeasures on multi-lane highways, and hence enhance their safety by reducing the frequency of crashes and the severity of injuries. Educating young people about the abuse of alcohol and drugs, specifically at high schools and colleges, could potentially lead to lower driver aggression. Unilateral removal of on-street parking from high-speed arterials could result in a likely drop in the number of crashes. Widening of shoulders could give drivers greater maneuvering space. Improving pavement conditions for a better friction coefficient will lead to improved crash recovery. Addition of lanes to alleviate problems arising from increased ADT, and restriction of trucks to the slower right lanes, would not only reduce crash occurrences but also result in lower injury severity levels.
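The crossover and mutation operators mentioned above can be sketched on a toy bit-string problem (the population size, rates, elitism scheme, and one-max objective are invented for illustration; the dissertation evolved crash classification and regression models, not bit-strings):

```python
import random

random.seed(42)

GENOME = 12  # toy objective: evolve a bit-string of all ones

def fitness(ind):
    return sum(ind)

def evolve(pop_size=30, gens=60, p_cross=0.3, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(GENOME)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                      # elitism: keep the two best unchanged
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:10], 2)   # select parents from the fitter half
            child = a[:]
            if random.random() < p_cross:       # low crossover rate, as in the abstract
                cut = random.randrange(1, GENOME)
                child = a[:cut] + b[cut:]       # one-point crossover
            # per-bit mutation flips genes independently
            child = [g ^ 1 if random.random() < p_mut else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = evolve()
```

Elitism makes the best fitness monotone across generations, so the run converges toward the all-ones string even with modest crossover.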
- Date Issued
- 2009
- Identifier
- CFE0002928, ucf:48007
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002928
- Title
- Applying Machine Learning Techniques to Analyze the Pedestrian and Bicycle Crashes at the Macroscopic Level.
- Creator
-
Rahman, Md Sharikur, Abdel-Aty, Mohamed, Eluru, Naveen, Hasan, Samiul, Yan, Xin, University of Central Florida
- Abstract / Description
-
This thesis presents different data mining/machine learning techniques to analyze vulnerable road users' (i.e., pedestrian and bicycle) crashes by developing crash prediction models at the macro level. In this study, we developed a data mining approach (i.e., decision tree regression (DTR) models) for both pedestrian and bicycle crash counts. To the authors' knowledge, this is the first application of DTR models at the macro level in the growing traffic safety literature. The empirical analysis is based on Statewide Traffic Analysis Zone (STAZ) level crash count data for both pedestrians and bicycles from the state of Florida for the years 2010 to 2012. The model results highlight the most significant predictor variables for pedestrian and bicycle crash counts in terms of three broad categories: traffic, roadway, and socio-demographic characteristics. Furthermore, spatial predictor variables of neighboring STAZs were utilized along with the targeted STAZ variables in order to improve the prediction accuracy of both DTR models. The DTR model considering spatial predictor variables (the spatial DTR model) was compared with the model without spatial predictor variables (the aspatial DTR model), and the comparison clearly showed that the spatial DTR model is superior in terms of prediction accuracy. Finally, this study contributed to the safety literature by applying three ensemble techniques (Bagging, Random Forest, and Boosting) in order to improve the prediction accuracy of the weak learner (DTR models) for macro-level crash counts. The models' estimation results revealed that all the ensemble techniques performed better than the DTR model, and the gradient boosting technique outperformed the other competing ensemble techniques in the macro-level crash prediction model.
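A decision tree regressor and the Bagging ensemble named above can be sketched in miniature (a one-split "stump" tree and invented step-function data; real DTR models grow much deeper trees on crash covariates):

```python
import random
import statistics

def fit_stump(xs, ys):
    """One-split decision tree regressor: pick the threshold minimizing squared error."""
    best = None
    for t in sorted(set(xs))[1:]:
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        ml, mr = statistics.mean(left), statistics.mean(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x < t else mr

def bagged(xs, ys, n_trees=25, seed=0):
    """Bagging: average many stumps, each fit on a bootstrap resample of the data."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(xs)) for _ in xs]
        stumps.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(s(x) for s in stumps) / n_trees

# noise-free toy data: the response steps from 1 to 9 at x = 5
xs = list(range(10))
ys = [1.0] * 5 + [9.0] * 5
predict = bagged(xs, ys)
```

Random Forest additionally subsamples the predictor variables at each split, and boosting fits each new tree to the residuals of the current ensemble instead of averaging independent trees.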
- Date Issued
- 2018
- Identifier
- CFE0007358, ucf:52103
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007358