Current Search: bayesian analysis
- Title
- Macroscopic Crash Analysis and Its Implications for Transportation Safety Planning.
- Creator
-
Siddiqui, Chowdhury, Abdel-Aty, Mohamed, Uddin, Nizam, Huang, Helai, University of Central Florida
- Abstract / Description
-
Incorporating safety into the transportation planning stage, often termed transportation safety planning (TSP), relies on the vital interplay between zone characteristics and zonal traffic crashes. Although a few safety studies have made efforts toward integrating safety and planning, several problems remain unresolved and a complete TSP framework is still absent from the literature. This research examines the suitability of the current traffic-related zoning process within a newly proposed planning method that incorporates safety measures. To accomplish this broader goal, the study defined its objectives along the following directions toward establishing a TSP framework: i) exploring the existing key determinants of traditional transportation planning (e.g., trip generation/distribution data, land use types, demographics) in order to develop an effective and efficient TSP framework; ii) investigating the Modifiable Areal Unit Problem (MAUP) in the context of macro-level crash modeling to assess the effect of zone size and boundary; iii) understanding the neighborhood influence of crashes at or near zonal boundaries; and iv) developing crash-specific safety measures within the four-step transportation planning process.
This research was conducted using spatial data from the counties of West Central Florida. Analyses of crash data per spatial unit were performed using nonparametric approaches (e.g., data mining and random forests), classical statistical methods (e.g., negative binomial models), and Bayesian statistical techniques. In addition, comprehensive Geographic Information System (GIS) based tools were utilized for spatial data analysis and representation.
Exploring the significant variables related to specific types of crashes is vital in the planning stages of a transportation network.
This study identified and examined important variables associated with total crashes and severe crashes per traffic analysis zone (TAZ) by applying nonparametric statistical techniques to trip-related variables and road-traffic-related factors. Since a macro-level analysis, by definition, involves aggregating crashes per spatial unit, spatial dependence or autocorrelation may arise when a variable in one geographic region is affected by the same variable in neighboring regions. To date, few safety studies have examined crashes at TAZs, and none has explicitly considered the spatial effects of the crashes occurring in them. To obtain a clear picture of spatial autocorrelation, this study investigated its effect in modeling pedestrian and bicycle crashes in TAZs. Additionally, the study examined pedestrian crashes at Environmental Justice (EJ) TAZs, identified in accordance with ongoing practices of Metropolitan Planning Organizations (MPOs) and previous research. Minority population and low income are two important criteria by which EJ areas are identified, and these areal characteristics have been of particular interest to traffic safety analysts investigating the contributing factors of pedestrian crashes in these deprived areas. Pedestrian and bicycle crashes were estimated as functions of roadway characteristics and various demographic and socio-economic factors. Significant differences were found between the predictor sets for pedestrian and bicycle crashes. In all cases, the models accounting for spatial correlation performed better than those that did not, implying that spatial correlation should be considered when modeling pedestrian and bicycle crashes at the aggregate or macro-level.
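The spatial autocorrelation tested for here is commonly quantified with Moran's I over a zone adjacency (weight) matrix. A minimal sketch in plain NumPy, using a hypothetical four-zone toy example rather than the study's Florida data:

```python
import numpy as np

def morans_i(y, W):
    """Moran's I for zone values y under spatial weight matrix W."""
    y = np.asarray(y, dtype=float)
    z = y - y.mean()              # deviations from the mean
    n = len(y)
    num = n * (z @ W @ z)         # neighbor cross-products
    den = W.sum() * (z @ z)       # total weight times variance
    return num / den

# Toy example: four zones in a row; high-crash zones cluster together
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
y = [10, 9, 2, 1]                 # similar neighbors -> positive I
print(round(morans_i(y, W), 3))
```

A positive value signals the kind of clustering that motivates the spatial terms in the crash models; shuffling the high and low zones drives I negative.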
Spatial autocorrelation was also found significant in the total and severe crash analyses and was accounted for in the respective modeling techniques. Given this affirmative evidence for including spatial autocorrelation in the safety performance functions, the research next sought the appropriate spatial entity on which to build the TSP framework. A wide array of spatial units has been explored in previous macro-level crash modeling research, and with the advancement of GIS, safety analysts are able to analyze crashes for various geographic units. However, no clear guideline exists on which geographic entity a modeler should choose. The preference may vary with the dependent variable of the model; alternatively, for a specific dependent variable, models may be invariant across spatial units, producing similar goodness-of-fit. This problem is closely related to the Modifiable Areal Unit Problem, a common issue in spatial data analysis. The study developed three crash models (total, severe, and pedestrian) for TAZs, block groups (BGs), and census tracts (CTs) using various roadway characteristics and census variables (e.g., land use and socio-economic factors), and compared them on multiple goodness-of-fit measures.
Based on mean absolute deviation (MAD) and mean squared prediction error (MSPE), the total, severe, and pedestrian crash models for TAZs and BGs had similar fits, both better than those developed for CTs. This indicated that these models are affected by the size of the spatial units rather than by their zoning configurations. TAZs have been the base spatial units for developing travel demand models, and metropolitan planning organizations widely use them in developing their long range transportation plans (LRTPs).
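The MAD and MSPE comparisons referred to above reduce to simple formulas over observed and predicted zonal counts. A sketch with hypothetical numbers (the crash counts and predictions below are illustrative, not the study's):

```python
import numpy as np

def mad(obs, pred):
    """Mean absolute deviation between observed and predicted counts."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.mean(np.abs(obs - pred))

def mspe(obs, pred):
    """Mean squared prediction error between observed and predicted counts."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.mean((obs - pred) ** 2)

# Hypothetical observed vs. predicted zonal crash counts for two scales
obs      = [12, 5, 8, 20, 3]
pred_taz = [10, 6, 9, 18, 4]      # e.g., a TAZ-scale model
pred_ct  = [15, 2, 4, 26, 7]      # e.g., a census-tract-scale model
print(mad(obs, pred_taz), mspe(obs, pred_taz))   # lower is better
print(mad(obs, pred_ct), mspe(obs, pred_ct))
```

The scale whose model yields the lower MAD/MSPE is preferred, which is how the TAZ/BG models were judged against the CT models.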
Therefore, considering practical application, it was concluded that TAZs hold a relative advantage over block groups and census tracts as the geographic unit of analysis.
Once TAZs were selected as the base spatial unit of the TSP framework, the TAZ delineations were carefully inspected. Traffic analysis zones are often delineated by the existing street network, which can place a considerable number of crashes on or near zonal boundaries. While the traditional macro-level crash modeling approach assigns zonal attributes to all crashes occurring within a zone's boundary, this research acknowledged the inaccuracy of relating crashes on or near the boundary solely to the attributes of that zone. A novel approach was proposed to account for the spatial influence of neighboring zones on crashes occurring on or near zonal boundaries. Predictive models for pedestrian crashes per zone were developed in a hierarchical Bayesian framework, with separate predictor sets for boundary and interior (non-boundary) crashes. These models, which treat boundary and interior crashes separately, showed better goodness-of-fit than models with no specific consideration for crashes located at or near zone boundaries. Additionally, they captured some unique predictors associated explicitly with interior and boundary-related crashes.
For example, the variables 'total roadway length with a 35 mph posted speed limit' and 'long-term parking cost' were not significantly different from zero in the interior crash model but were significant at the 95% level in the boundary crash model.
Although a single layer of adjacent traffic analysis zones was defined for pedestrian crashes, and boundary pedestrian crashes were modeled on the characteristics of those adjacent zones, this was not considered reasonable for bicycle-related crashes because the average roaming area of bicyclists is usually greater than that of pedestrians; in smaller TAZs, a bicyclist may even cross the entire zone. To account for this greater coverage, boundary bicycle crashes were modeled on two layers of adjacent zones. The goodness-of-fit measures showed that the single-layer and two-layer models both outperformed the models without layering, although the two were comparable with each other.
Motor vehicle crashes (total and severe) were classified as 'on-system' and 'off-system' crashes, and two sub-models were fitted to calibrate the safety performance function for each. On-system and off-system roads represent two different roadway hierarchies: on-system (state-maintained) roads typically have higher speed limits and carry traffic from distant TAZs, whereas off-system roads are mostly local roads with relatively low speed limits. Because of these distinct characteristics, on-system crashes were modeled with only the population and total employment variables of a zone in addition to the roadway and traffic variables, disregarding all other zonal variables; for off-system crashes, by contrast, all zonal variables were considered.
Comparing this on-/off-system sub-model framework with the other candidate models showed that it provided superior goodness-of-fit for both total and severe crashes.
Based on the safety performance functions developed for pedestrian, bicycle, total, and severe crashes, the study proposed a novel and complete framework for assessing the safety of these crash types simultaneously, in parallel with the four-step transportation planning process and with no additional data requirements on the practitioners' side.
- Date Issued
- 2012
- Identifier
- CFE0004191, ucf:49009
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004191
- Title
- Integrating the macroscopic and microscopic traffic safety analysis using hierarchical models.
- Creator
-
Cai, Qing, Abdel-Aty, Mohamed, Eluru, Naveen, Hasan, Samiul, Lee, JaeYoung, Yan, Xin, University of Central Florida
- Abstract / Description
-
Crash frequency analysis is a crucial tool for investigating traffic safety problems. With the objective of revealing hazardous factors that affect crash occurrence, crash frequency analysis has been undertaken at both the macroscopic and microscopic levels. At the macroscopic level, crashes within a spatial aggregation (such as a traffic analysis zone or county) are considered to quantify the impacts of socioeconomic and demographic characteristics, transportation demand, and network attributes, so as to provide countermeasures from a planning perspective. At the microscopic level, crashes on a segment or intersection are analyzed to identify the influence of geometric design, lighting, and traffic flow characteristics, with the objective of offering engineering solutions (such as installing sidewalks and bike lanes, or adding lighting). Although numerous traffic safety studies have been conducted, critical limitations remain at both levels. In this dissertation, several methodologies are proposed to address limitations in macro- and micro-level safety research, and an innovative method is then suggested to analyze crashes at the two levels simultaneously.
At the macro-level, the viability of dual-state models (i.e., zero-inflated and hurdle models) was explored for traffic-analysis-zone-based pedestrian and bicycle crash analysis. Additionally, spatial spillover effects were explored by employing exogenous variables from neighboring zones. A conventional single-state model (negative binomial) and dual-state models (zero-inflated negative binomial and hurdle negative binomial), with and without spatial effects, were developed. The model comparison results for pedestrian and bicycle crashes revealed that the models considering observed spatial effects perform better than those that do not.
Across the models with spatial spillover effects, the dual-state models, especially the zero-inflated negative binomial model, offered better performance than the single-state models. Moreover, the model results clearly highlighted the importance of various traffic, roadway, and sociodemographic characteristics of a TAZ, as well as of its neighboring TAZs, for pedestrian and bicycle crash frequency.
Next, the modifiable areal unit problem for macro-level crash analysis was addressed. Macro-level traffic safety analysis has been undertaken at different spatial configurations, yet clear guidelines for selecting an appropriate zonal system for safety analysis are unavailable. A comparative analysis was conducted to determine the optimal zonal system for macroscopic crash modeling, considering census tracts (CTs), traffic analysis zones (TAZs), and a newly developed traffic-related zone system labeled traffic analysis districts (TADs). Poisson lognormal models for three crash types (total, severe, and non-motorized mode crashes) were developed for the three zonal systems, with and without consideration of spatial autocorrelation. The study proposed a method to compare the modeling performance of the three geographic units at different spatial configurations through a grid-based framework: the study region was partitioned into grids of various sizes, and the prediction accuracy of the macro models was evaluated within those grids. The comparison results for all crash types indicated that the models based on TADs consistently offer better performance than the others, and that the models considering spatial autocorrelation outperformed those that did not.
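The dual-state idea behind the zero-inflated negative binomial model can be illustrated directly from its probability mass function: a point mass at zero (zones that essentially cannot generate such crashes) mixed with an ordinary NB count process. A self-contained sketch with hypothetical parameter values:

```python
from math import exp, lgamma, log

def nb_pmf(y, mu, r):
    """Negative binomial pmf with mean mu and dispersion r (NB2 form)."""
    return exp(lgamma(y + r) - lgamma(r) - lgamma(y + 1)
               + r * log(r / (r + mu)) + y * log(mu / (mu + r)))

def zinb_pmf(y, mu, r, pi):
    """Zero-inflated NB: with probability pi the zone is in the zero state."""
    p = (1 - pi) * nb_pmf(y, mu, r)
    return p + (pi if y == 0 else 0.0)

# A zero-inflation probability pi > 0 shifts mass toward zero-crash zones
print(nb_pmf(0, 2.0, 1.0))             # plain NB probability of zero
print(zinb_pmf(0, 2.0, 1.0, 0.3))      # inflated probability of zero
```

Fitting pi and the NB parameters jointly is what distinguishes the dual-state models compared in the study from the single-state NB.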
Finally, based on the modeling results, it is recommended to adopt TADs for transportation safety planning.
After determining the optimal zonal system for traffic safety analysis, further analysis was conducted for non-motorist crashes (pedestrian and bicycle crashes). This study contributed to the literature on pedestrian and bicyclist safety by building on conventional count regression models to explore exogenous factors affecting pedestrian and bicyclist crashes at the macroscopic level. Traditional count models investigate the effects of exogenous factors on non-motorist crashes directly. However, vulnerable road users' crashes are collisions between vehicles and non-motorists, so exogenous factors can act through both the non-motorists and the vehicle drivers. To accommodate these potentially different impacts, the non-motorist crash count was expressed as the product of the total crash count and the proportion of non-motorist crashes, and a joint model combining a negative binomial (NB) model and a logit model was formulated to handle the two parts, respectively. The joint model was estimated using non-motorist crash data for the Traffic Analysis Districts (TADs) in Florida; the traditional NB model was also estimated for comparison. The results indicated that the joint model provides a better fit to the data and identifies more significant variables. Subsequently, a novel joint screening method based on the proposed model was suggested to identify hot zones for non-motorist crashes. The identified hot zones were divided into three types: those with a more dangerous driving environment only, those with more hazardous walking and cycling conditions only, and those with both.
At the microscopic level, crash modeling analysis was conducted for road facilities.
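The joint decomposition described above, non-motorist crashes as total crashes times a non-motorist proportion, can be sketched with a log-link NB mean and a logistic share. The coefficients and covariates below are invented for illustration, not estimates from the TAD data:

```python
import numpy as np

def nb_mean(x, beta):
    """Expected total crashes from the NB part (log link)."""
    return np.exp(x @ beta)

def nonmotor_share(x, g):
    """Proportion of non-motorist crashes from the logit part."""
    return 1.0 / (1.0 + np.exp(-(x @ g)))

# Hypothetical TAD covariates: [intercept, log(VMT), sidewalk density]
x     = np.array([1.0, 2.0, 0.5])
beta  = np.array([0.2, 0.8, -0.1])   # assumed NB coefficients
g     = np.array([-2.0, 0.1, 1.5])   # assumed logit coefficients

total = nb_mean(x, beta)
share = nonmotor_share(x, g)
print(total, share, total * share)   # expected non-motorist crashes
```

The screening method then flags a zone as a "driving environment" hot zone, a "walking/cycling" hot zone, or both, depending on which factor (total or share) is excessive.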
This study first explored the potential macro-level effects that previous studies have typically excluded or omitted. A Bayesian hierarchical model was proposed to analyze crashes on segments and intersections incorporating macro-level data, which included both explanatory variables and the total crashes of all segments and intersections. In addition, a joint modeling structure was adopted to capture the potential spatial autocorrelation between segments and their connected intersections. The proposed model was compared with three others: a model considering micro-level factors only, a hierarchical model considering macro-level effects with random terms only, and a hierarchical model considering macro-level effects with explanatory variables. The results indicated that the models considering macro-level effects outperformed the model with micro-level factors only, supporting the idea of considering macro-level effects in micro-level crash analysis; the proposed model further enhanced the micro-level models. Finally, significant spatial correlation was found between segments and their adjacent intersections, supporting the joint modeling structure for analyzing crashes across different types of road facilities.
In addition to the separate analyses at the macro- and micro-levels, an integrated approach was proposed to examine traffic safety problems at the two levels simultaneously. If conducted in the same study area, the macro- and micro-level crash analyses investigate the same crashes, merely aggregated at different levels. Hence, the crash counts at the two levels should be correlated, and integrating macro- and micro-level crash frequency analyses in one modeling structure may better explain crash occurrence by capturing the effects of both macro- and micro-level factors.
This study proposed a Bayesian integrated spatial crash frequency model that links the macro- and micro-level crash counts through their spatial interaction. The proposed model also considered the spatial autocorrelation of different types of road facilities (segments and intersections) at the micro-level with a joint modeling structure. Two independent, non-integrated models for the macro- and micro-levels were estimated separately and compared with the integrated model. The results indicated that the integrated model provides better performance in estimating macro- and micro-level crash counts, validating the concept of integrating the models for the two levels. The integrated model also provides more valuable insights into crash occurrence by revealing both macro- and micro-level factors. Subsequently, a novel hotspot identification method was suggested that detects hotspots at both the macro- and micro-levels using comprehensive information from the two. It is expected that the proposed integrated model and hotspot identification method can help practitioners implement more reasonable transportation safety plans and more effective engineering treatments to proactively enhance safety.
- Date Issued
- 2017
- Identifier
- CFE0006724, ucf:51891
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006724
- Title
- Selective Multivariate Applications in Forensic Science.
- Creator
-
Rinke, Caitlin, Sigman, Michael, Campiglia, Andres, Yestrebsky, Cherie, Kuebler, Stephen, Richardson, Martin, University of Central Florida
- Abstract / Description
-
A 2009 report published by the National Research Council addressed the need for improvements in the field of forensic science, emphasizing more rigorous scientific analysis within many forensic disciplines and the establishment of limitations and error rates through statistical analysis. This research focused on multivariate statistical techniques for the analysis of spectral data obtained for multiple forensic applications, including samples from automobile float glasses and paints, bones, metal transfers, ignitable liquids and fire debris, and organic compounds including explosives. The statistical techniques were used for two types of data analysis: classification and discrimination. Statistical methods including linear discriminant analysis and a novel soft classification method were used to classify forensic samples against a compiled library. The soft classification method combined three statistical steps, Principal Component Analysis (PCA), Target Factor Analysis (TFA), and Bayesian Decision Theory (BDT), to provide classification based on posterior probabilities of class membership. These posterior probabilities give a statistical probability of classification that can aid a forensic analyst in reaching a conclusion. The second analytical approach applied nonparametric methods to discriminate between samples. Nonparametric methods are performed as hypothesis tests and do not assume a normal distribution of the analytical figures of merit. The nonparametric permutation test was applied to determine the similarity between two samples and provide discrimination rates. Both the classification and discrimination methods were applied to data acquired from multiple instrumental methods.
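The posterior-probability step of the soft classification can be illustrated in a stripped-down form: Bayes' rule applied to class-conditional Gaussian likelihoods on a single principal-component score. The full PCA/TFA pipeline is omitted here, and all numbers are hypothetical:

```python
import numpy as np

def posterior(score, means, sds, priors):
    """Posterior class probabilities for a 1-D PCA score, assuming
    Gaussian class-conditional likelihoods (a simplification of BDT)."""
    means, sds, priors = map(np.asarray, (means, sds, priors))
    lik = np.exp(-0.5 * ((score - means) / sds) ** 2) / sds  # unnormalized
    post = lik * priors
    return post / post.sum()      # normalize so probabilities sum to 1

# Hypothetical library of two sample classes summarized on one PC score
p = posterior(score=0.9, means=[0.0, 1.0], sds=[0.5, 0.5], priors=[0.5, 0.5])
print(p)                          # soft membership probabilities
```

The analyst-facing output is exactly this kind of probability vector rather than a hard class label, which is what "soft" classification refers to.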
The instrumental methods included Laser-Induced Breakdown Spectroscopy (LIBS), Fourier Transform Infrared Spectroscopy (FTIR), Raman spectroscopy, and Gas Chromatography-Mass Spectrometry (GC-MS). Some of these methods are already applied in forensic practice, such as GC-MS for the analysis of ignitable liquid and fire debris samples; others introduce instrumental analysis to areas of forensic science that currently lack it, such as LIBS for the analysis of metal transfers. This research investigates the combination of these instrumental techniques and multivariate statistical methods in new approaches to forensic applications, to assist in improving the field of forensic science.
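The nonparametric permutation test mentioned above is straightforward to sketch: pool the two samples, repeatedly re-split them at random, and count how often the permuted mean difference is at least as extreme as the observed one. The data below are invented peak intensities, not measurements from the study:

```python
import numpy as np

def permutation_test(a, b, n_perm=5000, seed=0):
    """Two-sample permutation test on the difference of means.
    Returns a two-sided p-value (with add-one correction)."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        diff = abs(perm[:len(a)].mean() - perm[len(a):].mean())
        count += diff >= observed
    return (count + 1) / (n_perm + 1)

# Hypothetical peak intensities from two samples
same = permutation_test([1.0, 1.1, 0.9, 1.05], [1.02, 0.95, 1.08, 0.98])
diff = permutation_test([1.0, 1.1, 0.9, 1.05], [2.0, 2.2, 1.9, 2.1])
print(same, diff)   # large p ~ indistinguishable, small p ~ discriminated
```

No normality assumption is needed, which is precisely why the test suits analytical figures of merit with unknown distributions.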
- Date Issued
- 2012
- Identifier
- CFE0004628, ucf:49942
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004628
- Title
- Reliable Spectrum Hole Detection in Spectrum-Heterogeneous Mobile Cognitive Radio Networks via Sequential Bayesian Non-parametric Clustering.
- Creator
-
Zaeemzadeh, Alireza, Rahnavard, Nazanin, Vosoughi, Azadeh, Qi, GuoJun, University of Central Florida
- Abstract / Description
-
In this work, the problem of detecting radio spectrum opportunities in spectrum-heterogeneous cognitive radio networks is addressed. Spectrum opportunities are the frequency channels that are underutilized by the primary licensed users; by enabling unlicensed users to detect and utilize them, we can improve the efficiency, reliability, and flexibility of radio spectrum usage. The main objective of this work is to discover spectrum opportunities in the time, space, and frequency domains through a low-cost and practical framework. Spectrum-heterogeneous networks are networks in which different sensors experience different spectrum opportunities; sensing data from different sensors therefore cannot simply be combined to reach a consensus on the available spectrum. Moreover, unreliable data, caused by noise or malicious attacks, deteriorates the performance of the decision-making process. The problem becomes even more challenging when the locations of the sensors are unknown. In this work, a probabilistic model is proposed to cluster the sensors based on their readings, without requiring any knowledge of sensor locations. The complexity of the model, i.e., the number of clusters, is inferred automatically from the sensing data. The processing node, also referred to as the base station or fusion center, infers the probability distributions of cluster memberships, channel availabilities, and device reliability in an online manner: after each chunk of sensing data is received, the probability distributions are updated without repeating the computations on previous data. All update rules are derived mathematically by employing Bayesian data analysis techniques and variational inference.
Furthermore, the inferred probability distributions are employed to assign unique spectrum opportunities to each of the sensors.
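The chunk-by-chunk updating described above can be illustrated with a much simpler conjugate stand-in for the paper's variational machinery: a Beta-Bernoulli belief over whether a channel is free, updated online from binary sensing reports:

```python
import numpy as np

class ChannelBelief:
    """Online Beta-Bernoulli belief that a channel is free.
    Each chunk of reports updates the posterior; old chunks are never revisited."""
    def __init__(self, alpha=1.0, beta=1.0):
        self.alpha, self.beta = alpha, beta   # Beta(1, 1) = uniform prior

    def update(self, reports):
        """reports: iterable of 1 (sensed free) / 0 (sensed busy)."""
        reports = np.asarray(reports)
        self.alpha += reports.sum()
        self.beta += len(reports) - reports.sum()

    def p_free(self):
        """Posterior mean probability that the channel is free."""
        return self.alpha / (self.alpha + self.beta)

belief = ChannelBelief()
belief.update([1, 1, 0, 1])   # first chunk of sensing data
belief.update([1, 1, 1])      # next chunk: no recomputation on the first
print(belief.p_free())
```

The conjugate form makes the "no recomputation" property explicit: the sufficient statistics (alpha, beta) summarize everything seen so far.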
To avoid interference among the sensors, physically adjacent devices should not utilize the same channels. Since the locations of the devices are unknown, cluster membership information is used as a measure of adjacency, based on the assumption that the devices' measurements are spatially correlated: adjacent devices, which experience similar spectrum opportunities, belong to the same cluster. The channel assignment problem is then mapped into an energy minimization problem and solved via graph cuts. The goal of this graph-theory-based method is to assign each device an available channel while avoiding interference among neighboring devices. Numerical simulations illustrate the effectiveness of the proposed methods compared to existing frameworks.
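The interference-avoiding assignment can be sketched as a greedy graph coloring over the adjacency inferred from cluster membership; this is a simple stand-in for the graph-cut energy minimization actually used, and the device names and adjacency are hypothetical:

```python
def assign_channels(adjacency, channels):
    """Greedy assignment: give each device a channel not used by any
    already-assigned neighbor (returns None if no channel is left)."""
    assignment = {}
    for device, neighbors in adjacency.items():
        used = {assignment[n] for n in neighbors if n in assignment}
        free = [c for c in channels if c not in used]
        assignment[device] = free[0] if free else None
    return assignment

# Hypothetical adjacency inferred from shared cluster membership
adjacency = {
    "s1": ["s2", "s3"],
    "s2": ["s1"],
    "s3": ["s1"],
    "s4": [],
}
print(assign_channels(adjacency, channels=["ch1", "ch2"]))
```

The graph-cut formulation in the work optimizes a global energy instead of assigning greedily, but the constraint being enforced is the same: no shared channel across an adjacency edge.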
- Date Issued
- 2017
- Identifier
- CFE0006963, ucf:51639
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006963
- Title
- Real-time traffic safety evaluation models and their application for variable speed limits.
- Creator
-
Yu, Rongjie, Abdel-Aty, Mohamed, Radwan, Ahmed, Madani Larijani, Kaveh, Ahmed, Mohamed, Wang, Xuesong, University of Central Florida
- Abstract / Description
-
Traffic safety has become a primary concern in the transportation field, as crashes cause extensive human and economic losses. With the objective of reducing crash occurrence and alleviating crash injury severity, major efforts have been dedicated to revealing the hazardous factors that affect crash occurrence, at both the aggregate level (targeting crash frequency per segment, intersection, etc.) and the disaggregate level (analyzing each crash event). Aggregate traffic safety studies, mainly the development of safety performance functions (SPFs), are conducted to unveil crash-contributing factors at locations of interest; their results can be used to identify crash hot spots, calculate crash modification factors (CMFs), and improve geometric characteristics. Aggregate analyses mainly focus on discovering the hazardous factors related to the frequency of total crashes, of a specific crash type, or of each crash severity level. Disaggregate studies, in contrast, benefit from reliable surveillance systems that provide detailed real-time traffic and weather data, which help capture micro-level influences of the hazardous factors that might lead to a crash. Disaggregate traffic safety models, also called real-time crash risk evaluation models, can be used to monitor crash risk as real-time field data are fed in. One potential use of real-time crash risk evaluation models is to support Variable Speed Limits (VSL) as part of a freeway management system. Models have been developed to predict crash occurrence in order to proactively improve traffic safety and prevent crashes.
In this study, first, aggregate safety performance functions were estimated to unveil the different risk factors affecting crash occurrence on a mountainous freeway section.
Then, disaggregate real-time crash risk evaluation models were developed for total crashes with both machine learning and hierarchical Bayesian models. Considering the need to analyze both the aggregate and disaggregate aspects of traffic safety, systematic multi-level traffic safety studies were conducted for single- and multi-vehicle crashes, and for weekday and weekend crashes. Finally, the feasibility of utilizing a VSL system to improve traffic safety on freeways was investigated.
This research was based on data from a 15-mile mountainous freeway section of I-70 in Colorado, comprising historical crash data, roadway geometric characteristics, real-time weather data, and real-time traffic data. The real-time weather data were recorded by six weather stations installed along the freeway section, while the real-time traffic data were obtained from Remote Traffic Microwave Sensor (RTMS) radars and Automatic Vehicle Identification (AVI) systems. Different datasets were formulated from these sources and prepared for the multi-level traffic safety studies.
In the aggregate investigation, safety performance functions were developed to identify hazardous factors for crash occurrence; for the first time, real-time weather and traffic data were used in SPFs. An ordinary Poisson model and random effects Poisson models, estimated in a Bayesian inference framework, were employed to reveal the effects of weather- and traffic-related variables on crash occurrence. Two scenarios were considered: a season-based case and a crash-type-based case. The Deviance Information Criterion (DIC) was utilized as the comparison criterion, and the correlated random effects Poisson models outperformed the others. Results indicate that weather condition variables, especially precipitation, play a key role in the safety performance functions.
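The DIC comparison used above combines the posterior mean deviance with an effective-parameter penalty, DIC = D-bar + pD with pD = D-bar minus the deviance at the posterior mean. A sketch for a Poisson likelihood with hypothetical posterior draws (not the study's actual posteriors):

```python
import numpy as np
from math import lgamma

def poisson_deviance(y, lam):
    """-2 x Poisson log-likelihood of counts y at rate(s) lam."""
    y, lam = np.asarray(y, float), np.asarray(lam, float)
    ll = np.sum(y * np.log(lam) - lam - np.array([lgamma(v + 1) for v in y]))
    return -2.0 * ll

def dic(y, lam_samples):
    """DIC = mean deviance + effective number of parameters pD."""
    devs = np.array([poisson_deviance(y, lam) for lam in lam_samples])
    d_bar = devs.mean()                                  # posterior mean deviance
    d_hat = poisson_deviance(y, np.mean(lam_samples, axis=0))  # at posterior mean
    return d_bar + (d_bar - d_hat)

# Hypothetical posterior draws of a common crash rate for 3 segments
y = [2, 3, 1]
lam_samples = [[2.0, 2.0, 2.0], [2.2, 2.2, 2.2], [1.8, 1.8, 1.8]]
print(round(dic(y, lam_samples), 2))   # lower DIC = preferred model
```

Because the deviance is convex in the rate, pD is nonnegative here, so richer models are penalized for their extra effective parameters, which is how the competing Poisson variants were ranked.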
Moreover, for comparison with the correlated random effects Poisson model, a multivariate Poisson model and a multivariate Poisson-lognormal model were estimated. Conclusions indicate that, instead of assuming identical random effects for homogeneous segments, accounting for the correlation between the two count variables results in better model fit. Results from the aggregate analyses shed light on policy implications for reducing crash frequencies. For the studied roadway section, crash occurrence in the snow season has clear trends associated with adverse weather conditions (poor visibility and large amounts of precipitation); weather warning systems can be employed to improve road safety during the snow season. Furthermore, different traffic management strategies should be developed according to the distinct seasonal influence factors. In particular, sites with steep slopes need more attention from the traffic management center and operators, especially during snow seasons, to control the excess crash occurrence. Moreover, distinct freeway management strategies should be designed to address the differences between single- and multi-vehicle crash characteristics. In addition to developing safety performance functions with various modeling techniques, this study also investigates four different approaches to developing informative priors for the independent variables. The Bayesian inference framework provides a complete and coherent way to balance empirical data and prior expectations; the merits of these informative priors were tested with two types of Bayesian hierarchical models (Poisson-gamma and Poisson-lognormal). The Deviance Information Criterion, R-square values, and coefficients of variation of the estimates were utilized as evaluation measures to select the best model(s). Comparisons across the models indicate that the Poisson-gamma model is superior, with a better model fit, and that it is much more robust with the informative priors.
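One simple way to see the staged-updating idea behind informative priors is the conjugate Poisson-gamma pair: a Gamma prior on a crash rate is updated on an initial data set, and the resulting posterior serves as the informative prior for the next stage. The numbers below are illustrative, not from the study:

```python
# Conjugate Poisson-gamma updating: a Gamma(alpha, beta) prior on a
# segment crash rate, updated with observed yearly crash counts.
# All numbers are illustrative, not the study's data.

def update_gamma_prior(alpha, beta, counts):
    """Posterior hyperparameters after observing Poisson counts."""
    return alpha + sum(counts), beta + len(counts)

# Stage 1: a prior built from, e.g., an initial data set
alpha0, beta0 = 2.0, 1.0            # prior mean = 2 crashes/year
a1, b1 = update_gamma_prior(alpha0, beta0, [3, 1, 2])

# Stage 2: the stage-1 posterior becomes the informative prior for new data
a2, b2 = update_gamma_prior(a1, b1, [4, 2])

print(a2 / b2)  # posterior mean crash rate
```

The study's two-stage updating operates on regression coefficients within hierarchical models rather than on a single rate, but the mechanism of carrying one posterior forward as the next prior is the same.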
Moreover, the two-stage Bayesian updating informative priors provided the best goodness-of-fit and coefficient estimation accuracy. In addition to the aggregate analyses, real-time crash risk evaluation models were developed to identify crash contributing factors at the disaggregate level. Support Vector Machine (SVM) models, a recently proposed statistical learning technique, and hierarchical Bayesian logistic regression models were introduced to evaluate real-time crash risk. A classification and regression tree (CART) model was developed to select the most important explanatory variables. Based on the variable selection results, Bayesian logistic regression models and SVM models with different kernel functions were developed. Model comparisons based on receiver operating characteristic (ROC) curves demonstrate that the SVM model with a radial basis kernel function outperforms the others. Results from the models demonstrated that crashes are likely to happen during congestion periods (especially when the queuing area has propagated from the downstream segment); high variation of occupancy and/or volume would increase the probability of crash occurrence. Moreover, the effects of microscopic traffic, weather, and roadway geometric factors on the occurrence of specific crash types were investigated. Crashes were categorized as rear-end, sideswipe, and single-vehicle crashes. AVI segment average speed, real-time weather data, and roadway geometric characteristics were utilized as explanatory variables. Conclusions from this study imply that different active traffic management (ATM) strategies should be designed for three- and two-lane roadway sections, and that seasonal effects should also be considered. Based on the abovementioned results, real-time crash risk evaluation models were developed separately for multi-vehicle and single-vehicle crashes, and for weekday and weekend crashes.
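The disaggregate pipeline described, CART to screen explanatory variables followed by an RBF-kernel SVM compared via ROC, can be sketched with scikit-learn on synthetic data. The features, effect sizes, and sample sizes below are hypothetical stand-ins for the real-time traffic variables:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical real-time features (e.g., speed, occupancy variation, volume)
X = rng.normal(size=(n, 5))
# Crash indicator driven mainly by the first two features (assumed)
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] + 1.0 * X[:, 1])))
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: CART highlights the most important explanatory variables
cart = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
keep = np.argsort(cart.feature_importances_)[-2:]   # top-2 variables

# Step 2: RBF-kernel SVM on the selected variables, scored by ROC AUC
svm = SVC(kernel="rbf", probability=True, random_state=0)
svm.fit(X_tr[:, keep], y_tr)
auc = roc_auc_score(y_te, svm.predict_proba(X_te[:, keep])[:, 1])
print(f"ROC AUC: {auc:.3f}")
```

In the study the competing models (Bayesian logistic regression, SVMs with other kernels) would each be scored on the same held-out ROC curve to reproduce the comparison.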
Hierarchical Bayesian logistic regression models (random effects and random parameter logistic regression models) were introduced to address seasonal variations, crash-unit-level diversity, and unobserved heterogeneity caused by geometric characteristics. For multi-vehicle crashes, congested conditions downstream contribute to an increased likelihood of multi-vehicle crashes; such crashes are more likely to occur under poor visibility conditions and when a turbulent area exists downstream. Drivers who are unable to reduce their speeds in time are prone to causing rear-end crashes. For single-vehicle crashes, slow-moving traffic platoons at the detector downstream of the crash location increase the probability of single-vehicle crashes; large variations of occupancy downstream also increase the likelihood of single-vehicle crash occurrence. Substantial efforts have been dedicated in this study to revealing the hazardous factors that affect crash occurrence at both the aggregate and disaggregate levels; however, findings and conclusions from this research need to be transferred into applications for roadway design and freeway management. This study therefore further investigates the feasibility of utilizing a Variable Speed Limits (VSL) system, one key part of ATM, to improve traffic safety on freeways. A proactive traffic safety improvement VSL control algorithm is proposed. First, an extension of the traffic flow model METANET was employed to predict traffic flow while considering VSL's impacts on the flow-density diagram; a real-time crash risk evaluation model was then estimated for the purpose of quantifying crash risk; finally, the optimal VSL control strategies were obtained by employing an optimization technique that minimizes the total predicted crash risk along the VSL implementation area.
Constraints were set up to limit the increase in average travel time and the differences between posted speed limits temporally and spatially. The proposed VSL control strategy was tested for a mountainous freeway bottleneck area in the microscopic simulation software VISSIM. Safety impacts of the VSL system were quantified as crash risk improvements and speed homogeneity improvements. Moreover, three different driver compliance levels were modeled in VISSIM to assess the sensitivity of VSL's safety impacts to driver compliance. Conclusions demonstrate that the proposed VSL system could effectively improve traffic safety by decreasing crash risk, enhancing speed homogeneity, and reducing travel time under both high and moderate driver compliance levels, while the VSL system does not have significant effects on traffic safety under the low compliance scenario. Future implementations of VSL control strategies and related research topics were also discussed.
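The optimization step at the heart of the VSL algorithm, minimizing total predicted crash risk subject to travel-time and posted-speed-difference constraints, can be illustrated with scipy. The risk function below is a simple stand-in for the study's estimated crash risk model, and the segment layout and bounds are invented:

```python
import numpy as np
from scipy.optimize import minimize

seg_len = np.array([1.0, 1.0, 1.0])      # miles; hypothetical 3-segment area
v0 = np.array([65.0, 65.0, 65.0])        # current posted limits (mph)

def crash_risk(v):
    # Stand-in risk model: risk grows with speed and spatial speed variance
    return np.sum(0.001 * v**2) + 0.5 * np.var(v)

def travel_time(v):
    return np.sum(seg_len / v)            # hours

base_tt = travel_time(v0)

cons = [
    # Average travel time may grow by at most 10%
    {"type": "ineq", "fun": lambda v: 1.10 * base_tt - travel_time(v)},
    # Adjacent posted limits differ by at most 10 mph (spatial constraint)
    {"type": "ineq", "fun": lambda v: 10.0 - np.max(np.abs(np.diff(v)))},
]
res = minimize(crash_risk, v0, bounds=[(45.0, 65.0)] * 3, constraints=cons)
print(np.round(res.x, 1))  # optimized speed limits per segment
```

With this toy risk surface the optimizer lowers all limits together until the travel-time constraint binds; the study's algorithm does the same kind of constrained minimization but with a METANET-predicted flow state feeding the estimated risk model.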
Show less - Date Issued
- 2013
- Identifier
- CFE0005283, ucf:50556
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005283
- Title
- HISTORICAL RESPONSES OF MARINE TURTLES TO GLOBAL CLIMATE CHANGE AND JUVENILE LOGGERHEAD RECRUITMENT IN FLORIDA.
- Creator
-
Reece, Joshua, Parkinson, Christopher, University of Central Florida
- Abstract / Description
-
Marine turtle conservation is most successful when it is based on sound data incorporating life history, historical population stability, and gene flow among populations. This research attempts to provide that information through two studies. In chapter I, I identify historical patterns of gene flow, population sizes, and contraction/expansion during major climatic shifts. In chapter II, I reveal a life history characteristic of loggerhead turtles previously undocumented. I identify a pattern...
Show moreMarine turtle conservation is most successful when it is based on sound data incorporating life history, historical population stability, and gene flow among populations. This research attempts to provide that information through two studies. In chapter I, I identify historical patterns of gene flow, population sizes, and contraction/expansion during major climatic shifts. In chapter II, I reveal a life history characteristic of loggerhead turtles previously undocumented. I identify a pattern of juvenile recruitment to foraging grounds proximal to their natal nesting beach. This pattern results in a predictable recruitment pattern from juvenile foraging ground aggregations to local rookeries. This research will provide crucial information to conservation managers by demonstrating how sensitive marine turtles are to global climate change. In the second component of my research, I demonstrate how threats posed to juvenile foraging grounds will have measurable effects on rookeries proximal to those foraging grounds. The addition of this basic life history information will have dramatic effects on marine turtle conservation in the future, and will serve as the basis for more thorough, forward-looking recovery plans.
Show less - Date Issued
- 2005
- Identifier
- CFE0000341, ucf:46281
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000341
- Title
- SINBAD AUTOMATION OF SCIENTIFIC PROCESS: FROM HIDDEN FACTOR ANALYSIS TO THEORY SYNTHESIS.
- Creator
-
KURSUN, OLCAY, Favorov, Oleg V., University of Central Florida
- Abstract / Description
-
Modern science is turning to progressively more complex and data-rich subjects, which challenges the existing methods of data analysis and interpretation. Consequently, there is a pressing need for development of ever more powerful methods of extracting order from complex data and for automation of all steps of the scientific process. Virtual Scientist is a set of computational procedures that automate the method of inductive inference to derive a theory from observational data dominated by...
Show moreModern science is turning to progressively more complex and data-rich subjects, which challenges the existing methods of data analysis and interpretation. Consequently, there is a pressing need for development of ever more powerful methods of extracting order from complex data and for automation of all steps of the scientific process. Virtual Scientist is a set of computational procedures that automate the method of inductive inference to derive a theory from observational data dominated by nonlinear regularities. The procedures utilize SINBAD, a novel computational method of nonlinear factor analysis that is based on the principle of maximization of mutual information among non-overlapping sources (Imax), yielding higher-order features of the data that reveal hidden causal factors controlling the observed phenomena. One major advantage of this approach is that it is not dependent on a particular choice of learning algorithm to use for the computations. The procedures build a theory of the studied subject by finding inferentially useful hidden factors, learning interdependencies among its variables, reconstructing its functional organization, and describing it by a concise graph of inferential relations among its variables. The graph is a quantitative model of the studied subject, capable of performing elaborate deductive inferences and explaining behaviors of the observed variables by behaviors of other such variables and discovered hidden factors. The set of Virtual Scientist procedures is a powerful analytical and theory-building tool designed to be used in research of complex scientific problems characterized by multivariate and nonlinear relations.
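The Imax principle rests on measuring mutual information between non-overlapping sources: features that track the same hidden factor share high mutual information. A toy estimate with scikit-learn, where two discretized "views" of the data share a hidden cause (the setup is illustrative, not SINBAD's actual learning procedure):

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
n = 5000

# A hidden factor drives two non-overlapping "sensor" views of the data
hidden = rng.integers(0, 4, size=n)
view_a = (hidden + rng.integers(0, 2, size=n)) % 4   # noisy copy
view_b = (hidden + rng.integers(0, 2, size=n)) % 4   # independent noisy copy
unrelated = rng.integers(0, 4, size=n)               # no shared cause

# Views sharing a hidden factor carry high mutual information (in nats);
# this is the signal an Imax-style criterion would maximize
print(round(mutual_info_score(view_a, view_b), 3),
      round(mutual_info_score(view_a, unrelated), 3))
```

The first value is well above zero while the second is near zero, which is exactly the contrast an Imax-driven feature search exploits when hunting for hidden causal factors.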
Show less - Date Issued
- 2004
- Identifier
- CFE0000043, ucf:46124
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000043
- Title
- Safety investigation of traffic crashes incorporating spatial correlation effects.
- Creator
-
Alkahtani, Khalid, Abdel-Aty, Mohamed, Radwan, Essam, Eluru, Naveen, Lee, JaeYoung, Zheng, Qipeng, University of Central Florida
- Abstract / Description
-
One main interest in crash frequency modeling is to predict crash counts over a spatial domain of interest (e.g., traffic analysis zones (TAZs)). The macro-level crash prediction models can assist transportation planners with a comprehensive perspective to consider safety in the long-range transportation planning process. Most of the previous studies that have examined traffic crashes at the macro-level are related to high-income countries, whereas there is a lack of similar studies among...
Show moreOne main interest in crash frequency modeling is to predict crash counts over a spatial domain of interest (e.g., traffic analysis zones (TAZs)). Macro-level crash prediction models can give transportation planners a comprehensive perspective for considering safety in the long-range transportation planning process. Most of the previous studies that have examined traffic crashes at the macro-level concern high-income countries, whereas there is a lack of similar studies for lower- and middle-income countries, where most road traffic deaths (90%) occur. This includes Middle Eastern countries, necessitating a thorough investigation and diagnosis of the issues and factors instigating traffic crashes in the region in order to reduce these serious crashes. Since pedestrians are more vulnerable to traffic crashes than other road users, especially in this region, a safety investigation of pedestrian crashes is crucial to improving traffic safety. Riyadh, Saudi Arabia, one of the largest Middle East metropolises, is used as an example to represent these countries' characteristics; Saudi Arabia has a rather distinct situation in that it is considered a high-income country, and yet it has the highest rate of traffic fatalities among its high-income counterparts. Therefore, in this research, several statistical methods are used to investigate the association between traffic crash frequency and contributing factors in crash data, which are characterized by 1) geographical referencing (i.e., observation at specific locations) or spatial variation over geographic units when modeled; 2) correlation between different response variables (e.g., crash counts by severity or type); and 3) temporal correlation. A Bayesian multivariate spatial model is developed for predicting crash counts by severity and type.
Therefore, based on the findings of this study, policy makers would be able to suggest appropriate safety countermeasures for each type of crash in each zone.
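The core idea of a multivariate count model, a shared latent zonal effect inducing correlation between crash counts of different severities (as in a multivariate Poisson-lognormal formulation), can be illustrated by simulation. All parameters below are arbitrary, not estimates from the study:

```python
import numpy as np

rng = np.random.default_rng(3)
n_zones = 5000

# Correlated zone-level latent effects induce correlation between the
# crash counts of different severities (multivariate Poisson-lognormal idea)
cov = np.array([[0.3, 0.2],
                [0.2, 0.3]])
eps = rng.multivariate_normal([0.0, 0.0], cov, size=n_zones)

mu_severe = np.exp(0.5 + eps[:, 0])   # expected severe crashes per zone
mu_minor = np.exp(1.5 + eps[:, 1])    # expected minor crashes per zone
y_severe = rng.poisson(mu_severe)
y_minor = rng.poisson(mu_minor)

# The latent correlation shows up in the observed counts
corr = np.corrcoef(y_severe, y_minor)[0, 1]
print(round(corr, 2))
```

The full model in the dissertation additionally gives the latent effects a spatial (e.g., CAR) structure across adjacent zones; this sketch only shows why modeling severities jointly, rather than with independent univariate models, captures information that would otherwise be lost.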
Show less - Date Issued
- 2018
- Identifier
- CFE0007148, ucf:52324
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007148
- Title
- A GIS SAFETY STUDY AND A COUNTY-LEVEL SPATIAL ANALYSIS OF CRASHES IN THE STATE OF FLORIDA.
- Creator
-
Darwiche, Ali, Abdel-Aty, Mohamed, University of Central Florida
- Abstract / Description
-
The research conducted in this thesis consists of a Geographic Information Systems (GIS) based safety study and a spatial analysis of vehicle crashes in the State of Florida. The GIS safety study comprises a county- and roadway-level GIS analysis of multilane corridors. The spatial analysis investigated the use of county-level vehicle crash models, taking spatial effects into account. The GIS safety study examines the locations of high trends of severe crashes (including incapacitating...
Show moreThe research conducted in this thesis consists of a Geographic Information Systems (GIS) based safety study and a spatial analysis of vehicle crashes in the State of Florida. The GIS safety study comprises a county- and roadway-level GIS analysis of multilane corridors. The spatial analysis investigated the use of county-level vehicle crash models, taking spatial effects into account. The GIS safety study examines the locations of high trends of severe crashes (including incapacitating and fatal crashes) on multilane corridors in the State of Florida at two levels, the county level and the roadway level. The GIS tool, which is used frequently in traffic safety research, was utilized to visually display those locations. At the county level, several maps of crash trends were generated. It was found that counties with high population and large metropolitan areas tend to have more crash occurrences. It was also found that the most severe crashes occurred in counties with more urban than rural roads. The neighboring counties of Pasco, Pinellas and Hillsborough had high severe crash rates per mile. At the roadway level, seven counties were chosen for the analysis based on their high severe crash trends, metropolitan size and geographical location. Several GIS maps displaying the safety level of multilane corridors in the seven counties were generated. The GIS maps were based on a ranking methodology developed in research that evaluated the safety condition of road segments and signalized intersections separately. The GIS maps were supported by Excel tables which provided details on the most hazardous locations on the roadways. The roadway-level analysis found that the worst corridors were located in Pasco, Pinellas and Hillsborough Counties. Also, a sliding window approach was developed and applied to the ten most hazardous corridors of the seven counties. The results were graphs locating the most dangerous 0.5-mile stretch of each corridor.
For the spatial analysis of crashes, the exploratory Moran's I statistic revealed that crash-related spatial clustering existed at the county level. For crash modeling, a full Bayesian (FB) hierarchical model is proposed to account for possible spatial correlation in crash occurrence among adjacent counties. The spatial correlation is realized by specifying a Conditional Autoregressive (CAR) prior for the residual term of the link function in standard Poisson regression. Two FB models were developed, one for total crashes and one for severe crashes. The variables used include traffic-related factors and socio-economic factors. Counties with higher road congestion levels, higher densities of arterials and intersections, a higher percentage of population in the 15-24 age group, and higher income levels have increased crash risk. Road congestion and higher education levels, however, were negatively correlated with the risk of severe crashes. The analysis revealed that crash-related spatial correlation existed among the counties. The FB models were found to fit the data better than traditional methods such as the Negative Binomial model, primarily due to the existence of spatial correlation. Overall, this study provides transportation agencies with specific information on where improvements must be implemented to achieve better safety conditions on the roads of Florida. The study also shows that neighboring counties are more likely to have similar crash trends than more distant ones.
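Global Moran's I, used in the exploratory step above, is I = (n/S0) * (z'Wz)/(z'z), where z are the mean-centered values, W is the spatial weight matrix, and S0 is the sum of all weights. A direct implementation on a toy adjacency (the county layout and crash rates are invented):

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I for values on areal units with weight matrix W."""
    z = values - values.mean()
    n = len(values)
    s0 = W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)

# Toy example: 4 counties on a line (rook adjacency), hypothetical crash rates
W = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

clustered = np.array([10.0, 9.0, 2.0, 1.0])  # similar neighbors -> positive I
dispersed = np.array([10.0, 1.0, 9.0, 2.0])  # alternating values -> negative I
print(round(morans_i(clustered, W), 3), round(morans_i(dispersed, W), 3))
```

A significantly positive I, as found at the county level here, is what motivates adding the CAR spatial term to the Poisson model rather than treating counties as independent.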
Show less - Date Issued
- 2009
- Identifier
- CFE0002623, ucf:48204
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002623
- Title
- Chemical Analysis, Databasing, and Statistical Analysis of Smokeless Powders for Forensic Application.
- Creator
-
Dennis, Dana-Marie, Sigman, Michael, Campiglia, Andres, Yestrebsky, Cherie, Fookes, Barry, Ni, Liqiang, University of Central Florida
- Abstract / Description
-
Smokeless powders are a set of energetic materials, known as low explosives, which are typically utilized for reloading ammunition. There are three types, which differ in their primary energetic materials: single base powders contain nitrocellulose as their primary energetic material, double and triple base powders contain nitroglycerin in addition to nitrocellulose, and triple base powders also contain nitroguanidine. Additional organic compounds, while not proprietary to specific...
Show moreSmokeless powders are a set of energetic materials, known as low explosives, which are typically utilized for reloading ammunition. There are three types, which differ in their primary energetic materials: single base powders contain nitrocellulose as their primary energetic material, double and triple base powders contain nitroglycerin in addition to nitrocellulose, and triple base powders also contain nitroguanidine. Additional organic compounds, while not proprietary to specific manufacturers, are added to the powders in varied ratios during the manufacturing process to optimize the ballistic performance of the powders. The additional compounds function as stabilizers, plasticizers, flash suppressants, deterrents, and opacifiers. Of the three smokeless powder types, single and double base powders are commercially available, and have been heavily utilized in the manufacture of improvised explosive devices. Forensic smokeless powder samples are currently analyzed using multiple analytical techniques. Combined microscopic, macroscopic, and instrumental techniques are used to evaluate the sample, and the information obtained is used to generate a list of potential distributors. Gas chromatography-mass spectrometry (GC-MS) is arguably the most useful of the instrumental techniques since it distinguishes single and double base powders, and provides additional information about the relative ratios of all the analytes present in the sample. However, forensic smokeless powder samples are still limited to being classified as either single or double base powders, based on the absence or presence of nitroglycerin, respectively. In this work, the goal was to develop statistically valid classes, beyond the single and double base designations, based on multiple organic compounds that are commonly encountered in commercial smokeless powders.
Several chemometric techniques were applied to smokeless powder GC-MS data for determination of the classes, and for assignment of test samples to these novel classes. The total ion spectrum (TIS), which is calculated from the GC-MS data for each sample, is obtained by summing the intensities for each mass-to-charge (m/z) ratio across the entire chromatographic profile. A TIS matrix comprising data for 726 smokeless powder samples was subjected to agglomerative hierarchical cluster (AHC) analysis, and six distinct classes were identified. Within each class, a single m/z ratio had the highest intensity for the majority of samples, though the m/z ratio was not always unique to the specific class. Based on these observations, a new classification method known as the Intense Ion Rule (IIR) was developed and used for the assignment of test samples to the AHC designated classes. Discriminant models were developed for assignment of test samples to the AHC designated classes using k-Nearest Neighbors (kNN) and linear and quadratic discriminant analyses (LDA and QDA, respectively). Each of the models was optimized using leave-one-out (LOO) and leave-group-out (LGO) cross-validation, and the performance of the models was evaluated by calculating correct classification rates for assignment of the cross-validation (CV) samples to the AHC designated classes. The optimized models were utilized to assign test samples to the AHC designated classes. Overall, the QDA LGO model achieved the highest correct classification rates for assignment of both the CV samples and the test samples to the AHC designated classes. In forensic application, the goal of an explosives analyst is to ascertain the manufacturer of a smokeless powder sample. In addition, knowledge about the probability of a forensic sample being produced by a specific manufacturer could potentially decrease the time invested by an analyst during investigation by providing a shorter list of potential manufacturers.
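The AHC step, clustering samples by their total ion spectra, can be sketched with scikit-learn on synthetic spectra in which each class is dominated by one intense ion. The m/z bins, intensities, and class count below are invented, not the study's 726-sample data:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(1)

# Synthetic stand-in for total ion spectra: each row is a sample's summed
# intensity across m/z bins, dominated by one class-characteristic ion
def make_class(n, dominant_bin, n_bins=50):
    X = rng.random((n, n_bins)) * 0.1   # low-level background intensities
    X[:, dominant_bin] += 1.0           # intense class-associated ion
    return X

X = np.vstack([make_class(20, 5), make_class(20, 17), make_class(20, 31)])

ahc = AgglomerativeClustering(n_clusters=3, linkage="ward")
labels = ahc.fit_predict(X)
# Samples sharing a dominant ion should land in the same cluster
print([len(set(labels[i:i + 20].tolist())) for i in (0, 20, 40)])
```

The Intense Ion Rule described above then follows naturally: once each cluster is characterized by its dominant m/z ratio, a test spectrum can be assigned by its own most intense ion.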
In this work, Bayes' Theorem and Bayesian Networks were investigated as additional tools to be utilized in forensic casework. Bayesian Networks were generated and used to calculate posterior probabilities of a test sample belonging to specific manufacturers. The networks were designed to include manufacturer-controlled powder characteristics such as shape, color, and dimension, as well as the relative intensities of the class-associated ions determined from cluster analysis. Samples were predicted to belong to a manufacturer based on the highest posterior probability. Overall percent correct rates were determined by calculating the percentage of correct predictions; that is, where the known and predicted manufacturer were the same. The initial overall percent correct rate was 66%. The dimensions of the smokeless powders were added to the network as average diameter and average length nodes. Addition of average diameter and length resulted in an overall prediction rate of 70%.
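In its simplest form, the posterior calculation that the Bayesian networks perform reduces to Bayes' theorem over the candidate manufacturers. A toy single-characteristic example with invented manufacturer names, priors, and likelihoods:

```python
# Posterior probability of each manufacturer given one observed powder
# characteristic, via Bayes' theorem. All names and numbers are illustrative.

priors = {"MfrA": 0.5, "MfrB": 0.3, "MfrC": 0.2}
# Hypothetical likelihoods P(observed shape = "disk" | manufacturer)
likelihood_disk = {"MfrA": 0.7, "MfrB": 0.2, "MfrC": 0.1}

evidence = sum(priors[m] * likelihood_disk[m] for m in priors)
posterior = {m: priors[m] * likelihood_disk[m] / evidence for m in priors}

# Predict the manufacturer with the highest posterior probability
best = max(posterior, key=posterior.get)
print(best, round(posterior[best], 3))
```

A full network chains several such conditional tables (shape, color, dimensions, class-associated ion intensities), which is why adding the diameter and length nodes could raise the correct-prediction rate.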
Show less - Date Issued
- 2015
- Identifier
- CFE0005784, ucf:50059
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005784
- Title
- Development of Traffic Safety Zones and Integrating Macroscopic and Microscopic Safety Data Analytics for Novel Hot Zone Identification.
- Creator
-
Lee, JaeYoung, Abdel-Aty, Mohamed, Radwan, Ahmed, Nam, Boo Hyun, Kuo, Pei-Fen, Choi, Keechoo, University of Central Florida
- Abstract / Description
-
Traffic safety has been considered one of the most important issues in the transportation field. With consistent efforts of transportation engineers, Federal, State and local government officials, both fatalities and fatality rates from road traffic crashes in the United States have steadily declined from 2006 to 2011. Nevertheless, fatalities from traffic crashes slightly increased in 2012 (NHTSA, 2013). We lost 33,561 lives from road traffic crashes in the year 2012, and the road traffic...
Show moreTraffic safety has been considered one of the most important issues in the transportation field. With the consistent efforts of transportation engineers and Federal, State, and local government officials, both fatalities and fatality rates from road traffic crashes in the United States steadily declined from 2006 to 2011. Nevertheless, fatalities from traffic crashes slightly increased in 2012 (NHTSA, 2013). We lost 33,561 lives to road traffic crashes in 2012, and road traffic crashes are still one of the leading causes of death, according to the Centers for Disease Control and Prevention (CDC). In recent years, efforts have been made to incorporate traffic safety into transportation planning, an effort termed transportation safety planning (TSP). The Safe, Accountable, Flexible, Efficient Transportation Equity Act: A Legacy for Users (SAFETEA-LU), which is codified in the United States Code, compels the United States Department of Transportation to consider traffic safety in the long-term transportation planning process. Although considerable macro-level studies have been conducted to facilitate the implementation of TSP, there remain critical limitations in macroscopic safety studies that need to be investigated and remedied. First, the TAZ (Traffic Analysis Zone), which is most widely used in travel demand forecasting, has crucial shortcomings for macro-level safety modeling. Moreover, macro-level safety models have accuracy problems. The low predictive power of such models may be caused by crashes that occur near zone boundaries, high levels of aggregation, and neglect of spatial autocorrelation. In this dissertation, several methodologies are proposed to alleviate these limitations in macro-level safety research. The TSAZ (Traffic Safety Analysis Zone) is developed as a new zonal system for macroscopic safety analysis, and a nested structured modeling method is suggested to improve model performance.
Also, a multivariate statistical modeling method for multiple crash types is proposed in this dissertation. In addition, a novel screening methodology integrating the two levels is suggested. The integrated screening method is intended to overcome the shortcomings of zonal-level screening, since zonal-level screening cannot take specific high-risk sites into consideration. It is expected that the integrated screening approach can provide a comprehensive perspective by balancing the macroscopic and microscopic approaches.
Show less - Date Issued
- 2014
- Identifier
- CFE0005195, ucf:50653
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005195
- Title
- Multi-Level Safety Performance Functions for High Speed Facilities.
- Creator
-
Ahmed, Mohamed, Abdel-Aty, Mohamed, Radwan, Ahmed, Al-Deek, Haitham, Mackie, Kevin, Pande, Anurag, Uddin, Nizam, University of Central Florida
- Abstract / Description
-
High speed facilities are considered the backbone of any successful transportation system; Interstates, freeways, and expressways carry the majority of daily trips on the transportation network. Although these types of roads are considered relatively the safest among road types, they still experience many crashes, many of which are severe; these not only affect human lives but can also have tremendous economic and social impacts. These facts signify the necessity of enhancing the...
Show moreHigh speed facilities are considered the backbone of any successful transportation system; Interstates, freeways, and expressways carry the majority of daily trips on the transportation network. Although these types of roads are considered relatively the safest among road types, they still experience many crashes, many of which are severe; these not only affect human lives but can also have tremendous economic and social impacts. These facts signify the necessity of enhancing the safety of these high speed facilities to ensure better and more efficient operation. Safety problems could be assessed through several approaches that can help in mitigating crash risk on a long- and short-term basis. Therefore, the main focus of the research in this dissertation is to provide a framework of risk assessment to promote safety and enhance mobility on freeways and expressways. Multi-level Safety Performance Functions (SPFs) were developed at the aggregate level using historical crash data and the corresponding exposure and risk factors to identify and rank sites with promise (hot-spots). Additionally, SPFs were developed at the disaggregate level utilizing real-time weather data collected from meteorological stations located at the freeway section as well as traffic flow parameters collected from different detection systems such as Automatic Vehicle Identification (AVI) and Remote Traffic Microwave Sensors (RTMS). These disaggregate SPFs can identify real-time risks due to turbulent traffic conditions and their interactions with other risk factors. In this study, two main datasets were obtained from two different regions.
Those datasets comprise historical crash data, roadway geometric characteristics, and aggregate weather and traffic parameters, as well as real-time weather and traffic data. At the aggregate level, Bayesian hierarchical models with spatial and random effects were compared to Poisson models to examine the safety effects of roadway geometrics on crash occurrence along freeway sections that feature mountainous terrain and adverse weather. At the disaggregate level, a main framework of a proactive safety management system was provided, using traffic data collected from AVI and RTMS, real-time weather, and geometric characteristics. Different statistical techniques were implemented, ranging from classical frequentist classification approaches, which explain the relationship between an event (crash) occurring at a given time and a set of risk factors in real time, to more advanced models. A Bayesian updating approach, which combines prior knowledge with new data to achieve more reliable estimation of parameter behavior, was implemented. Also, a relatively recent and promising machine learning technique (Stochastic Gradient Boosting) was utilized to calibrate several models using different datasets collected from mixed detection systems as well as real-time meteorological stations. The results from this study suggest that both levels of analysis are important: the aggregate level helps provide a good understanding of different safety problems and supports developing policies and countermeasures to reduce the total number of crashes. At the disaggregate level, real-time safety functions help move toward a more proactive traffic management system that will not only enhance the performance of high speed facilities and the whole traffic network but also provide safer mobility for people and goods.
In general, the proposed multi-level analyses are useful in providing roadway authorities with detailed information on where countermeasures must be implemented and when resources should be devoted. The study also proves that traffic data collected from different detection systems could be a useful asset that should be utilized appropriately not only to alleviate traffic congestion but also to mitigate increased safety risks. The overall proposed framework can maximize the benefit of the existing archived data for freeway authorities as well as for road users.
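The Stochastic Gradient Boosting calibration mentioned above can be sketched with scikit-learn's GradientBoostingClassifier, where setting subsample below 1.0 gives the stochastic variant. The detector features, effect sizes, and data below are synthetic stand-ins, not the study's AVI/RTMS data:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000
# Hypothetical detector features: e.g., speed, occupancy, volume, visibility
X = rng.normal(size=(n, 4))
# Crash indicator driven by the first two features (assumed)
p = 1 / (1 + np.exp(-(1.2 * X[:, 0] - 0.8 * X[:, 1])))
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# subsample < 1.0 fits each tree on a random fraction of the training data,
# which is what makes the gradient boosting "stochastic"
gbm = GradientBoostingClassifier(n_estimators=200, subsample=0.7,
                                 max_depth=3, random_state=0)
gbm.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, gbm.predict_proba(X_te)[:, 1])
print(round(auc, 3))
```

In a calibration setting like the study's, the same held-out ROC comparison would be repeated across the datasets from the different detection systems to pick the best-performing model per data source.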
Show less - Date Issued
- 2012
- Identifier
- CFE0004508, ucf:49274
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004508