Current Search: time series
- Title
- Learning Robust Sequence Features via Dynamic Temporal Pattern Discovery.
- Creator
-
Hu, Hao, Wang, Liqiang, Zhang, Shaojie, Liu, Fei, Qi, GuoJun, Zhou, Qun, University of Central Florida
- Abstract / Description
-
As a major type of data, time series possess invaluable latent knowledge for describing the real world and human society. In order to improve the ability of intelligent systems to understand the world and people, it is critical to design sophisticated machine learning algorithms for extracting robust time series features from such latent knowledge. Motivated by the successful applications of deep learning in computer vision, more and more machine learning researchers have turned their attention to applying deep learning techniques to time series data. However, directly employing current deep models in most time series domains can be problematic. A major reason is that the temporal pattern types that current deep models target are very limited and cannot meet the requirement of modeling the different underlying patterns of data coming from various sources. In this study we address this problem by designing different network structures explicitly based on specific domain knowledge, so that we can extract features via the most salient temporal patterns. More specifically, we focus on two types of temporal patterns: order patterns and frequency patterns. For order patterns, which are usually related to brain and human activities, we design a hashing-based neural network layer to globally encode the ordinal pattern information into the resultant features. It is further generalized into a specially designed Recurrent Neural Network (RNN) cell which can learn order patterns in an online fashion. On the other hand, we believe audio-related data such as music and speech can benefit from modeling frequency patterns. Thus, we do so by developing two types of RNN cells. The first type tries to directly learn long-term dependencies in the frequency domain rather than the time domain. The second one aims to dynamically filter out the "noise" frequencies based on temporal contexts. By proposing various deep models based on different domain knowledge and evaluating them on extensive time series tasks, we hope this work can provide inspiration for others and increase the community's interest in applying deep learning techniques to more time series tasks.
- Date Issued
- 2019
- Identifier
- CFE0007470, ucf:52679
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007470
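The record above builds time series features from ordinal (order) patterns. The following is a minimal illustrative sketch of that idea, a generic permutation-pattern histogram over sliding windows; it is a stand-in, not the dissertation's hashing layer or RNN cell, and all data are synthetic.

```python
# Minimal sketch: encode a time series by a histogram of its ordinal (order) patterns.
# Illustrative stand-in for the hashing-based layer described above, not the authors' code.
import numpy as np
from itertools import permutations

def ordinal_pattern_histogram(x, order=3):
    """Normalized histogram of permutation patterns of length `order` over series x."""
    patterns = {p: i for i, p in enumerate(permutations(range(order)))}
    counts = np.zeros(len(patterns))
    for t in range(len(x) - order + 1):
        window = x[t:t + order]
        pattern = tuple(np.argsort(window))      # rank order of the window values
        counts[patterns[pattern]] += 1
    return counts / counts.sum()                 # feature vector summarizing order patterns

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)
print(ordinal_pattern_histogram(series, order=3))
```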
- Title
- Managing IO Resource for Co-running Data Intensive Applications in Virtual Clusters.
- Creator
-
Huang, Dan, Wang, Jun, Zhou, Qun, Sun, Wei, Zhang, Shaojie, Wang, Liqiang, University of Central Florida
- Abstract / Description
-
Today's Big Data computing platforms employ resource management systems such as Yarn, Torque, Mesos, and Google Borg to enable sharing of physical computing resources among many users or applications. Given virtualization and resource management systems, users are able to launch their applications on the same node with low mutual interference and low management overhead for CPU and memory. However, there are still challenges to be addressed before these systems can be fully adopted to manage the IO resources in Big Data File Systems (BDFS) and shared network facilities. In this study, we systematically examine three IO management problems: proportional sharing of block IO in container-based virtualization, network IO contention in MPI-based HPC applications, and data migration overhead in HPC workflows. To improve proportional sharing, we develop a prototype system called BDFS-Container by containerizing BDFS at the Linux block IO level. Central to BDFS-Container, we propose and design a proactive IOPS-throttling mechanism named IOPS Regulator, which improves proportional IO sharing under the BDFS IO pattern by 74.4% on average. For network IO resource management, we exploit virtual switches to facilitate network traffic manipulation and reduce mutual interference on the network for in-situ applications. In order to dynamically allocate network bandwidth when it is needed, we adopt SARIMA-based techniques to analyze and predict MPI traffic issued from simulations. Third, to solve the data migration problem in small- to medium-sized HPC clusters, we propose to construct a side IO path, named SideIO, to explicitly direct analysis data to a BDFS that co-locates computation with data. By experimenting with two real-world scientific workflows, SideIO completely avoids the most expensive data movement overhead and achieves up to 3x speedups compared with current solutions.
- Date Issued
- 2018
- Identifier
- CFE0007195, ucf:52268
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007195
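The record above enforces proportional sharing of block IO among containers. Below is a minimal, hypothetical sketch of weight-proportional IOPS allocation with per-container demand caps; the function, container names, and numbers are invented for illustration and are not the BDFS-Container / IOPS Regulator implementation.

```python
# Sketch: split a node's IOPS budget among containers in proportion to their weights,
# capping each at its demand and redistributing the surplus. Illustrative model only.
def proportional_iops(total_iops, demands, weights):
    """demands/weights: dicts keyed by container id; returns allocated IOPS per container."""
    alloc = {c: 0.0 for c in demands}
    active = set(demands)
    remaining = float(total_iops)
    while active and remaining > 1e-9:
        weight_sum = sum(weights[c] for c in active)
        share = {c: remaining * weights[c] / weight_sum for c in active}
        satisfied = {c for c in active if share[c] >= demands[c] - alloc[c]}
        if not satisfied:                       # no one is capped; hand out the shares
            for c in active:
                alloc[c] += share[c]
            break
        for c in satisfied:                     # cap satisfied containers at their demand
            remaining -= demands[c] - alloc[c]
            alloc[c] = demands[c]
        active -= satisfied
    return alloc

print(proportional_iops(10000, {"c1": 8000, "c2": 2000, "c3": 1500},
                        {"c1": 4, "c2": 1, "c3": 1}))
```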
- Title
- A STUDY OF EQUATORIAL IONOSPHERIC VARIABILITY USING SIGNAL PROCESSING TECHNIQUES.
- Creator
-
Wang, Xiaoni, Eastes, Richard, University of Central Florida
- Abstract / Description
-
The dependence of the equatorial ionosphere on solar irradiances and geomagnetic activity is studied in this dissertation using signal processing techniques. Statistical time series, digital signal processing, and wavelet methods are applied to study the ionospheric variations. The ionospheric data used are the Total Electron Content (TEC) and the critical frequency of the F2 layer (foF2). Solar irradiance data are from recent satellites, the Student Nitric Oxide Explorer (SNOE) satellite and the Thermosphere Ionosphere Mesosphere Energetics Dynamics (TIMED) satellite. The Disturbance Storm-Time (Dst) index is used as a proxy for geomagnetic activity in the equatorial region. The results are summarized as follows. (1) For short-term variations (< 27 days), solar irradiances from the previous three days have a significant correlation with the present-day TEC and may contribute 18% of the total variations in the TEC. The 3-day delay between solar irradiances and TEC suggests the effects of neutral densities on the ionosphere. The correlations between solar irradiances and TEC are significantly higher than those using the F10.7 flux, a conventional proxy for the short-wavelength band of solar irradiances. (2) For variations < 27 days, solar soft X-rays show similar or higher correlations with ionospheric electron densities than the Extreme Ultraviolet (EUV). The correlations between solar irradiances and foF2 decrease from morning (0.5) to afternoon (0.1). (3) Geomagnetic activity plays an important role in the ionosphere for short-term variations < 10 days. The average correlation between TEC and Dst is 0.4 at the 2-3, 3-5, 5-9, and 9-11 day scales, which is higher than that between foF2 and Dst. The correlations between TEC and Dst increase from morning to afternoon. Moderate/quiet geomagnetic activity plays a distinct role in these short-term variations of the ionosphere (~0.3 correlation).
- Date Issued
- 2007
- Identifier
- CFE0001602, ucf:47188
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001602
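The record above reports a roughly 3-day lag between solar irradiance and TEC. The sketch below shows the kind of lagged correlation analysis involved, on synthetic series with an artificial 3-step lag; the data and lag are illustrative only, not the study's measurements.

```python
# Sketch: lagged Pearson correlation between a driver series (solar irradiance)
# and a response series (TEC), illustrating the lag analysis described above.
import numpy as np

def lagged_correlation(driver, response, max_lag=10):
    """Correlation of response[t] with driver[t - lag] for lag = 0..max_lag."""
    out = {}
    for lag in range(max_lag + 1):
        x = driver[:len(driver) - lag] if lag else driver
        y = response[lag:]
        out[lag] = np.corrcoef(x, y)[0, 1]
    return out

rng = np.random.default_rng(2)
irradiance = rng.normal(size=400)
tec = np.roll(irradiance, 3) + 0.5 * rng.normal(size=400)  # response lags driver by 3 steps
corrs = lagged_correlation(irradiance, tec, max_lag=6)
print(max(corrs, key=corrs.get))  # lag with the strongest correlation (expect ~3)
```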
- Title
- AN IMPACT EVALUATION OF U.S. ARMS EXPORT CONTROLS ON THE U.S. DEFENSE INDUSTRIAL BASE: AN INTERRUPTED TIME-SERIES ANALYSIS.
- Creator
-
Condron, Aaron, Sweo, Robert, University of Central Florida
- Abstract / Description
-
The United States Defense Industrial Base (USDIB) is an industry essential to both the economic prosperity of the US and its strategic control over many advanced military systems and technologies. The USDIB, which encompasses the aerospace and defense industries, is a volatile industry, prone to many internal and external factors that cause demand to ebb and flow widely year over year. Among the factors that influence the volume of systems the USDIB delivers to its international customers are the arms export controls of the US. These controls impose a divergence from the historical US foreign policy of furthering an open exchange of ideas and liberalized trade. These controls, imposed by the Departments of Commerce, Defense, and State, rigidly govern all international presence of the industry. The overlapping controls create an inability to conform to rapidly changing realpolitik, leaving these controls in an archaic state. This, in turn, imposes a great deal of anxiety and expense upon managers within and outside of the USDIB. Using autoregressive integrated moving average time-series analyses, this paper confirms that the implementation of, or amendment to, broad arms export controls correlates with significant and near-immediate declines in USDIB export volumes. In the context of the US's share of world arms exports, these controls impose up to a 20% decline in export volume.
- Date Issued
- 2011
- Identifier
- CFH0004064, ucf:44785
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004064
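The record above uses an interrupted time-series (ARIMA) design to estimate the effect of export-control changes. Here is a hedged sketch of that design with statsmodels, an ARIMA model with a step intervention dummy; the series, break point, and model order are invented, not the thesis's data or specification.

```python
# Sketch: interrupted time-series analysis with ARIMA plus a step "intervention" dummy,
# illustrating the before/after design described above (synthetic data, arbitrary order).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
n, break_point = 120, 80
exports = np.cumsum(rng.normal(0.2, 1.0, n)) + 100
exports[break_point:] -= 15                           # simulated post-policy drop
policy = (np.arange(n) >= break_point).astype(float)  # 0 before, 1 after the control change

fit = ARIMA(exports, exog=policy, order=(1, 1, 1)).fit()
print(fit.params)   # the exog coefficient estimates the post-intervention level shift
```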
- Title
- INVESTIGATION OF DAMAGE DETECTION METHODOLOGIES FOR STRUCTURAL HEALTH MONITORING.
- Creator
-
Gul, Mustafa, Catbas, F. Necati, University of Central Florida
- Abstract / Description
-
Structural Health Monitoring (SHM) is employed to track and evaluate damage and deterioration during regular operation as well as after extreme events for aerospace, mechanical, and civil structures. A complete SHM system incorporates performance metrics, sensing, signal processing, data analysis, transmission, and management for decision-making purposes. Damage detection in the context of SHM can be successful by employing a collection of robust and practical damage detection methodologies that can be used to identify, locate, and quantify damage or, in general terms, changes in observable behavior. In this study, different damage detection methods are investigated for global condition assessment of structures. First, different parametric and non-parametric approaches are revisited and further improved for damage detection using vibration data. Modal flexibility, modal curvature, and unscaled flexibility based on the dynamic properties obtained using the Complex Mode Indicator Function (CMIF) are used as parametric damage features. Second, statistical pattern recognition approaches using time series modeling in conjunction with outlier detection are investigated as a non-parametric damage detection technique. Third, a novel methodology using ARX models (Auto-Regressive models with eXogenous input) is proposed for damage identification. By using this new methodology, it is shown that damage can be detected, located, and quantified without the need for external loading information. Next, laboratory studies are conducted on different test structures with a number of different damage scenarios for the evaluation of the techniques in a comparative fashion. Finally, application of the methodologies to real-life data is also presented along with the capabilities and limitations of each approach in light of the analysis results of the laboratory and real-life data.
- Date Issued
- 2009
- Identifier
- CFE0002830, ucf:48069
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002830
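The record above fits ARX (autoregressive with exogenous input) models to vibration data for damage identification. Below is a bare-bones least-squares ARX fit on a synthetic input/output pair; the model orders, data, and residual-as-feature idea are illustrative assumptions, not the dissertation's procedure.

```python
# Sketch: fit a simple ARX(2,2) model y[t] = a1*y[t-1] + a2*y[t-2] + b1*u[t-1] + b2*u[t-2]
# by least squares, a minimal stand-in for the ARX modeling described above.
import numpy as np

def fit_arx(y, u, na=2, nb=2):
    rows, start = [], max(na, nb)
    for t in range(start, len(y)):
        rows.append(np.r_[y[t - na:t][::-1], u[t - nb:t][::-1]])
    X, target = np.array(rows), y[start:]
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    residuals = target - X @ coeffs
    return coeffs, residuals        # residual statistics can serve as damage features

rng = np.random.default_rng(4)
u = rng.normal(size=1000)           # excitation (e.g., measured input)
y = np.zeros(1000)
for t in range(2, 1000):
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + 0.5 * u[t - 1] + 0.1 * u[t - 2]
coeffs, resid = fit_arx(y, u)
print(coeffs)   # should recover approximately [0.6, -0.2, 0.5, 0.1]
```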
- Title
- Leader Psychology and Civil War Behavior.
- Creator
-
Smith, Gary, Schafer, Mark, Kang, Kyungkook, Powell, Jonathan, Walker, Stephen, University of Central Florida
- Abstract / Description
-
How do the psychological characteristics of world leaders affect civil wars? Multiple studies have investigated how the personalities and beliefs of world leaders affect foreign policy preferences and outcomes. However, this research has yet to be applied to the intrastate context, which is problematic given the growing importance of civil wars in the conflict-studies literature. This dissertation project utilizes at-a-distance profiling methods to investigate how leaders and their psychological characteristics can affect the likelihood, severity, and duration of civil conflicts. The findings of this research provide further support for the general hypothesis that leaders can, and often do, matter when trying to explain policy outcomes. More importantly, the findings demonstrate that leaders can influence the likelihood of civil war onset, the severity of civil wars, and their duration. Additionally, this project investigates the effect that civil war severity has on the psychological characteristics of leaders. Contrary to some previous research, however, the findings here indicate that leaders' psychology may not be sensitive to civil conflict severity.
- Date Issued
- 2018
- Identifier
- CFE0007375, ucf:52089
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007375
- Title
- SPATIO-TEMPORAL ANALYSES FOR PREDICTION OF TRAFFIC FLOW, SPEED AND OCCUPANCY ON I-4.
- Creator
-
Chilakamarri Venkata, Srinivasa Ravi Chandra, Al-Deek, Haitham, University of Central Florida
- Abstract / Description
-
Traffic data prediction is a critical aspect of Advanced Traffic Management Systems (ATMS). The utility of traffic data lies in providing information on the evolution of the traffic process that can be passed on to various users (commuters, Regional Traffic Management Centers (RTMCs), the Department of Transportation (DoT), etc.) for user-specific objectives. This information can be extracted from the data collected by various traffic sensors. Loop detectors collect traffic data in the form of flow, occupancy, and speed throughout the nation. Freeway traffic data from I-4 loop detectors has been collected and stored in a data warehouse called the Central Florida Data Warehouse (CFDW) by the University of Central Florida for the periods 1993-1994 and 2000-2003. This data is raw, in the form of time-stamped 30-second aggregated data collected from about 69 stations over a 36-mile stretch of I-4 from Lake Mary in the east to Disney World in the west, and it has to be processed to extract information that can be disseminated to various users. Usually, statistical procedures assume that each individual data point in the sample is independent of other data points. This is not true of traffic data, which are correlated across space and time. The concept of time sequence and the layout of the data collection devices in space therefore introduce autocorrelations in a single variable and cross-correlations across multiple variables. Significant autocorrelations prove that past values of a variable can be used to predict future values of the same variable. Furthermore, significant cross-correlations between variables prove that past values of one variable can be used to predict future values of another variable. Traditional techniques in traffic prediction use univariate time series models that account for autocorrelations but not cross-correlations. These models have neglected the cross-correlations between variables that are present in freeway traffic data due to the way the data are collected. There is a need for statistical techniques that incorporate the effect of these multivariate cross-correlations to predict future values of traffic data. The emphasis in this dissertation is on the multivariate prediction of traffic variables. Unlike traditional statistical techniques that have relied on univariate models, this dissertation explored the cross-correlation between multivariate traffic variables and variables collected across adjoining spatial locations (such as loop detector stations). The analysis in this dissertation proved that there were significant cross-correlations among different traffic variables collected at very close locations at different time scales. The nature of the cross-correlations showed that there was feedback among the variables, and therefore past values can be used to predict future values. Multivariate time series analysis is appropriate for modeling the effect of different variables on each other. In the past, upstream data has been accounted for in time series analysis, but without accounting for feedback effects. Vector Auto Regressive (VAR) models are more appropriate for such data. Although VAR models have been applied to forecast economic time series, they have not been used to model freeway data. VAR models were estimated for speeds and volumes at a sample of two locations, using 5-minute data. Different specifications were fit: estimation of speeds from surrounding speeds; estimation of volumes from surrounding volumes; estimation of speeds from volumes and occupancies at the same location; and estimation of speeds from volumes at surrounding locations (and vice versa). These specifications were compared to univariate models for the respective variables at three levels of data aggregation (5 minutes, 10 minutes, and 15 minutes). For data aggregation levels below 15 minutes, the VAR models outperform the univariate models; at the 15-minute aggregation level, they do not. Since VAR models were used for all traffic variables reported by the loop detectors, this makes the application of VAR a true multivariate procedure for dynamic prediction of the multivariate traffic variables flow, speed, and occupancy. VAR models are generally deemed more complex than univariate models due to the estimation of multiple covariance matrices; however, a VAR model for k variables must be compared to k univariate models, and VAR models compare well with AutoRegressive Integrated Moving Average (ARIMA) models. The added complexity helps model the effect of upstream and downstream variables on the future values of the response variable. This could be useful for ATMS situations, where the effect of traffic redistribution and redirection is not known beforehand with prediction models. The VAR models were tested against more traditional models and their performances were compared under different traffic conditions. These models significantly enhance the understanding of freeway traffic processes and phenomena as well as identifying potential knowledge relating to traffic prediction. Further refinements of the models can yield better forecasts under multiple conditions.
- Date Issued
- 2009
- Identifier
- CFE0002593, ucf:48276
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002593
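The record above compares VAR against univariate models for predicting speed, volume, and occupancy across adjacent stations. Below is a hedged statsmodels VAR sketch on synthetic two-station speed data; the variable names, lag selection, and data are placeholders, not the dissertation's specifications.

```python
# Sketch: a two-variable VAR standing in for the multivariate speed/volume models
# described above (synthetic 5-minute speeds at two adjacent stations).
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(6)
n = 500
up, down = np.zeros(n), np.zeros(n)
for t in range(1, n):                       # downstream speed depends on upstream speed
    up[t] = 0.8 * up[t - 1] + rng.normal(0, 1)
    down[t] = 0.5 * down[t - 1] + 0.3 * up[t - 1] + rng.normal(0, 1)
data = pd.DataFrame({"speed_up": up, "speed_down": down})

fit = VAR(data).fit(maxlags=4, ic="aic")    # lag order chosen by AIC
print(fit.summary())
print(fit.forecast(data.values[-fit.k_ar:], steps=3))   # 3-step-ahead forecasts
```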
- Title
- STRUCTURAL HEALTH MONITORING FOR DAMAGE DETECTION USING WIRED AND WIRELESS SENSOR CLUSTERS.
- Creator
-
Terrell, Thomas, Catbas, Necati, University of Central Florida
- Abstract / Description
-
Sensing and analysis of a structure for the purpose of detecting, tracking, and evaluating damage and deterioration, during both regular operation and extreme events, is referred to as Structural Health Monitoring (SHM). SHM is a multi-disciplinary field, with a complete system incorporating sensing technology, hardware, signal processing, networking, data analysis, and management for interpretation and decision making. However, many of these processes and their subsequent integration into a practical SHM framework are in need of development. In this study, various components of an SHM system are investigated. Particular focus is placed on a previously developed damage detection methodology for global condition assessment of a laboratory structure with a decking system. First, a review of some current SHM applications, which relate to a current UCF Structures SHM study monitoring a full-scale movable bridge, is presented in conjunction with a summary of the critical components of that project. Studies for structural condition assessment of a 4-span bridge-type steel structure using SHM data collected from laboratory-based experiments are then presented. For this purpose, a time series analysis method using ARX models (Auto-Regressive models with eXogenous input) for damage detection with free-response vibration data is expanded upon using both wired and wireless acceleration data. Analysis using wireless accelerometers implements a sensor roaming technique to maintain a dense sensor field while requiring fewer sensors. Using both data types, this ARX-based time series analysis method is shown to be effective for damage detection and localization for this relatively complex laboratory structure. Finally, application of the proposed methodologies to a real-life structure is discussed, along with conclusions and recommendations for future work.
- Date Issued
- 2011
- Identifier
- CFE0003694, ucf:48837
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003694
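The record above tracks changes in ARX-model prediction quality between baseline and possibly damaged states. Below is a hedged sketch of a residual-ratio damage indicator: an AR model fitted on baseline response, with prediction-error variance compared on new data. The AR order, threshold interpretation, and signals are assumptions, not the thesis's method.

```python
# Sketch: simple damage indicator — fit an AR model on baseline (healthy) response,
# then flag damage when prediction-error variance grows on new data. Illustrative only.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(5)
healthy = np.sin(np.linspace(0, 60, 600)) + 0.05 * rng.normal(size=600)
damaged = np.sin(1.15 * np.linspace(0, 60, 600)) + 0.05 * rng.normal(size=600)  # shifted frequency

baseline = np.asarray(AutoReg(healthy, lags=4).fit().params)   # [const, ar.L1..ar.L4]

def residual_variance(series, params, lags=4):
    """Variance of one-step-ahead errors using the baseline AR coefficients."""
    preds = [params[0] + params[1:] @ series[t - lags:t][::-1]
             for t in range(lags, len(series))]
    return np.var(series[lags:] - np.array(preds))

ratio = residual_variance(damaged, baseline) / residual_variance(healthy, baseline)
print(ratio)   # ratios well above 1 suggest a change in the underlying dynamics
```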
- Title
- STRUCTURAL CAUSES OF SOCIAL CONFLICT IN AFRICA.
- Creator
-
Charland, Lucien, Dolan, Thomas, University of Central Florida
- Abstract / Description
-
Social conflict, as opposed to armed conflict, has received less attention in the field of quantitative research. This paper investigates the structural causes of political violence in 35 African states using data from the Social Conflict in Africa dataset and the Beck and Katz panel-corrected standard errors time series regression model. Theoretically, a closed political opportunity structure, combined with a weak state unable to provide public goods, should together produce high levels of social conflict. The independent variables attempt to operationalize these concepts from four different angles. In this analysis, Access to Education and Infrastructure (AEI), Ethno-Linguistic Fractionalization (ELF), Freedom in the World Political Rights (FIW), and National Material Capabilities (NMC) were all significant predictors of social conflict. This study found that as the level of ethnic fractionalization and material capabilities within states rose, the frequency of social conflict events also increased, while as access to infrastructure and political rights declined, the number of social conflict events increased. Wald chi-square and R-square values suggest that the model is complete and has substantial explanatory power.
- Date Issued
- 2014
- Identifier
- CFH0004663, ucf:45314
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004663
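The record above estimates a time-series cross-section regression of conflict events on structural variables. Below is a hedged sketch of a pooled OLS with country-clustered standard errors in statsmodels, a simplified stand-in for the Beck-Katz PCSE estimator named in the abstract; the panel, variable names, and coefficients are invented.

```python
# Sketch: pooled OLS with country-clustered standard errors on a synthetic country-year
# panel — a simplified stand-in for the panel-corrected-standard-errors model cited above.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
countries, years = 35, 20
panel = pd.DataFrame({
    "country": np.repeat(np.arange(countries), years),
    "elf": rng.uniform(0, 1, countries * years),       # ethnic fractionalization (hypothetical)
    "access": rng.uniform(0, 1, countries * years),     # access to education/infrastructure
})
panel["conflict_events"] = 5 + 4 * panel["elf"] - 3 * panel["access"] + rng.normal(0, 1, len(panel))

X = sm.add_constant(panel[["elf", "access"]])
fit = sm.OLS(panel["conflict_events"], X).fit(cov_type="cluster",
                                              cov_kwds={"groups": panel["country"]})
print(fit.summary())
```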
- Title
- Data-Driven Modeling and Optimization of Building Energy Consumption.
- Creator
-
Grover, Divas, Pourmohammadi Fallah, Yaser, Vosoughi, Azadeh, Zhou, Qun, University of Central Florida
- Abstract / Description
-
Sustainability and reduced energy consumption are targets for building operations. The installation of smart sensors and Building Automation Systems (BAS) makes it possible to study facility operations under different circumstances. These technologies generate large amounts of data, which can be scraped and used for analysis. In this thesis, we focus on the process of data-driven modeling and decision making, from scraping the data to simulating the building and optimizing its operation. The City of Orlando has similar goals of sustainability and reduced energy consumption, so it provided us access to its BAS to collect data and study the operation of its facilities. The data scraped from the City's BAS servers can be used to develop statistical and machine learning methods for decision making. We selected a mid-size pilot building to apply these techniques. The process begins with the collection of data from the BAS. An Application Programming Interface (API) is developed to log in to the servers, scrape data for all data points, and store it on a local machine. The data is then cleaned for analysis and modeling. The dataset contains various data points ranging from indoor and outdoor temperature to fan speed inside the Air Handling Unit (AHU), which is operated by a Variable Frequency Drive (VFD). The whole dataset is a time series and is handled accordingly. The cleaned dataset is analyzed to find different patterns and investigate relations between different data points. The analysis helps us choose parameters for the models developed in the next step. Different statistical models are developed to simulate building and equipment behavior. Finally, the models, along with the data, are used to optimize building operation under the equipment constraints, leading to a reduction in energy consumption while maintaining temperature and pressure inside the building.
- Date Issued
- 2019
- Identifier
- CFE0007810, ucf:52335
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007810
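The record above describes cleaning BAS time series and fitting statistical models of building behavior. Below is a minimal sketch of one step of that kind of pipeline: resampling raw sensor readings to hourly means and regressing energy use on outdoor temperature. The column names, sampling rate, and simple linear model are assumptions, and all data are synthetic.

```python
# Sketch: clean a raw BAS time series and fit a simple consumption-vs-temperature model,
# illustrating the data-driven modeling pipeline described above (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
idx = pd.date_range("2019-06-01", periods=7 * 24 * 4, freq="15min")   # 15-minute BAS samples
outdoor_temp = 28 + 5 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 0.5, len(idx))
energy_kwh = 40 + 2.5 * outdoor_temp + rng.normal(0, 3, len(idx))      # cooling-dominated load

raw = pd.DataFrame({"outdoor_temp": outdoor_temp, "energy_kwh": energy_kwh}, index=idx)
hourly = raw.resample("1h").mean().dropna()                            # basic cleaning/aggregation

X = sm.add_constant(hourly["outdoor_temp"])
fit = sm.OLS(hourly["energy_kwh"], X).fit()
print(fit.params)   # slope approximates the kWh sensitivity to outdoor temperature
```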
- Title
- Risk Management in Reservoir Operations in the Context of Undefined Competitive Consumption.
- Creator
-
Salami, Yunus, Nnadi, Fidelia, Wang, Dingbao, Chopra, Manoj, Rowney, Alexander, Divo, Eduardo, University of Central Florida
- Abstract / Description
-
Dams and reservoirs with multiple purposes require effective management to fully realize their purposes and maximize efficiency. For instance, a reservoir intended mainly for flood control and hydropower generation may result in a system with primary objectives that conflict with each other, because higher hydraulic heads are required to achieve the hydropower generation objective while relatively lower reservoir levels are required to fulfill flood control objectives. Protracted imbalances between these two could increase the susceptibility of the system to risks of water shortage or flood, depending on inflow volumes and operational policy effectiveness. The magnitudes of these risks can become even more pronounced when upstream use of the river is unregulated and uncoordinated, so that upstream consumptions and releases are arbitrary. As a result, safe operational practices and risk management alternatives must be structured after an improved understanding of historical and anticipated inflows, actual and speculative upstream uses, and the overall hydrology of catchments upstream of the reservoir. One such system, with an almost yearly occurrence of floods and shortages due to both natural and anthropogenic factors, is the dual reservoir system of Kainji and Jebba in Nigeria. To analyze and manage these risks, a methodology that combines stochastic and deterministic approaches was employed. Using methods outlined by Box and Jenkins (1976), autoregressive integrated moving average (ARIMA) models were developed for forecasting Niger river inflows at Kainji reservoir based on twenty-seven years of historical inflow data (1970-1996). These were then validated using seven-year inflow records (1997-2003). The model with the best correlation was a seasonal multiplicative ARIMA (2,1,1)x(2,1,2)12 model. Supplementary validation of this model was done with discharge rating curves developed for the inlet of the reservoir using in-situ inflows and satellite altimetry data. By comparing net inflow volumes with storage deficit, flood and shortage risk factors at the reservoir were determined based on (a) actual inflows, (b) forecasted inflows (up to 2015), and (c) simulated scenarios depicting undefined competitive upstream consumption. Calculated high-risk years matched actual flood years, again suggesting the reliability of the model. Monte Carlo simulations were then used to prescribe safe outflows and storage allocations in order to reduce future risk factors. The theoretical safety levels achieved indicated risk factors below threshold values and showed that this methodology is a powerful tool for estimating and managing flood and shortage risks in reservoirs with undefined competitive upstream consumption.
- Date Issued
- 2012
- Identifier
- CFE0004593, ucf:49193
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004593
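The record above identifies a seasonal multiplicative ARIMA (2,1,1)x(2,1,2)12 model for monthly reservoir inflows. Below is a hedged sketch fitting that order with statsmodels on synthetic monthly inflows; the data and fitting options are placeholders, not the dissertation's Kainji inflow series.

```python
# Sketch: fit a seasonal ARIMA(2,1,1)x(2,1,2)12 to synthetic monthly inflows and
# forecast one year ahead, mirroring the model order reported above.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(9)
months = np.arange(27 * 12)                                  # 27 years of monthly data
inflow = 1000 + 400 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 50, len(months))

model = SARIMAX(inflow, order=(2, 1, 1), seasonal_order=(2, 1, 2, 12))
fit = model.fit(disp=False)
print(fit.forecast(steps=12))    # next 12 months of predicted inflow
```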
- Title
- Evaluation of crash modification factors and functions including time trends at intersections.
- Creator
-
Wang, Jung-Han, Abdel-Aty, Mohamed, Radwan, Essam, Eluru, Naveen, Lee, JaeYoung, Wang, Chung-Ching, University of Central Florida
- Abstract / Description
-
Traffic demand has increased as the population has increased. The US population reached 313,914,040 in 2012 (US Census Bureau, 2015). Increased travel demand may have a potential impact on roadway safety and the operational characteristics of roadways. Total crashes and injury crashes at intersections accounted for 40% and 44% of traffic crashes, respectively, on US roadways in 2007 according to the Intersection Safety Issue Brief (FHWA, 2009). Traffic researchers and engineers have developed a quantitative measure of the safety effectiveness of treatments in the form of crash modification factors (CMFs). Based on CMFs from multiple studies, the Highway Safety Manual (HSM) Part D (AASHTO, 2010) provides CMFs which can be used to determine the expected crash reduction or increase after treatments are installed. Even though CMFs have been introduced in the HSM, there are still limitations that require investigation. One important potential limitation is that the HSM provides various CMFs as fixed values, rather than CMFs under different configurations. In this dissertation, CMFs were estimated using observational before-after studies to show that CMFs vary across different traffic volume levels when signalizing intersections. Besides examining the effect of traffic volume, previous studies showed that CMFs can vary over time after a treatment is implemented. Thus, in this dissertation, the trends of CMFs for signalization and for adding red light running cameras (RLCs) were evaluated. CMFs for these treatments were measured in each month and in 90-day moving windows using a time series ARMA model. The results for signalization show that the CMFs for rear-end crashes were lower in the early phase after signalization but gradually increased from the 9th month. It was also found that safety effectiveness is significantly worse 18 months after installing RLCs. Although efforts have been made to seek reliable CMFs, the best estimate of CMFs is still widely debated. Since CMFs are non-zero estimates, the population of all CMFs does not follow a normal distribution, and even if it did, the true mean of CMFs at some intersections may differ from that at others. Therefore, a bootstrap method that makes no distributional assumptions was proposed to estimate CMFs. By examining the distribution of CMFs estimated from bootstrapped resamples, a CMF precision rating method is suggested to evaluate the reliability of the estimated CMFs. The results show that the estimated CMF for angle+left-turn crashes after signalization has the highest precision, while estimates of the CMF for rear-end crashes are extremely unreliable. The CMFs for KABCO, KABC, and KAB crashes proved to be reliable for the majority of intersections, but the estimated effect of signalization may not be accurate at some sites. The bootstrap method provides a quantitative measure of the reliability of CMFs; however, CMF transferability is questionable. Since the development of CMFs requires safety performance functions (SPFs), could CMFs be developed using SPFs from other states in the United States? This research applies the empirical Bayes method to develop CMFs using several SPFs from different jurisdictions, adjusted by calibration factors. After examination, it is found that applying SPFs from other jurisdictions is not desirable when developing CMFs. The process of estimating CMFs using before-after studies requires an understanding of multiple statistical principles. In order to simplify the process of CMF estimation and make CMF research reproducible, this dissertation includes an open source statistics package built in R (R, 2013) to make the estimation accessible and reproducible. With this package, authorities are able to estimate reliable CMFs following the procedure suggested by FHWA. In addition, this software package provides a graphical interface that integrates the algorithm for calculating CMFs so that users can perform CMF calculation with minimal programming prerequisites. The expected contributions of this study are to 1) propose methodologies to assess the variation of CMFs with different characteristics among treated sites, 2) suggest new objective criteria for judging the reliability of safety estimates, 3) examine the transferability of SPFs when developing CMFs using before-after studies, and 4) develop statistics software to calculate CMFs. Finally, potential relevant applications beyond the scope of this research, but worth investigating in the future, are also discussed.
- Date Issued
- 2016
- Identifier
- CFE0006413, ucf:51454
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006413
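The record above bootstraps CMF estimates to rate their precision. Below is a hedged sketch of that idea: a naive before/after CMF per treated-site sample, resampled over sites to obtain a bootstrap confidence interval. The simplified estimator (no empirical Bayes correction, exposure, or trend adjustment) and the synthetic crash counts are assumptions, and the dissertation's actual package is in R, not Python.

```python
# Sketch: bootstrap distribution of a naive before/after crash modification factor (CMF),
# resampling treated sites. Simplified illustration only — not the FHWA/EB procedure.
import numpy as np

rng = np.random.default_rng(10)
before = rng.poisson(12, size=40)                      # crashes before treatment, 40 sites
after = rng.poisson(9, size=40)                        # crashes after treatment

def naive_cmf(before, after):
    return after.sum() / before.sum()                  # ignores EB correction, exposure, trends

boot = np.array([
    naive_cmf(before[idx], after[idx])
    for idx in (rng.integers(0, len(before), len(before)) for _ in range(2000))
])
print(naive_cmf(before, after), np.percentile(boot, [2.5, 97.5]))   # estimate and 95% CI
```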
- Title
- COUNTER-TERRORISM: WHEN DO STATES ADOPT NEW ANTI-TERROR LEGISLATION?.
- Creator
-
Clesca, Princelee, Dolan, Thomas, University of Central Florida
- Abstract / Description
-
The intent of this thesis is to research the anti-terror legislation of 15 countries and the history of terrorist incidents within those countries. Both the anti-terror legislation and the history of terrorist incidents will be researched within the time period of 1980 to 2009, a 30-year span. This thesis will seek to establish a relationship between the occurrence of terrorist events and when states change their anti-terror legislation. Legislation enacted can vary greatly; common changes seek to undercut the financing of terrorist organizations, criminalize behaviors, or empower state surveillance capabilities. A quantitative analysis will be performed to establish a relationship between terrorist attacks and legislative changes. A qualitative discussion will follow to analyze specific anti-terror legislation passed by states in response to terrorist events.
- Date Issued
- 2015
- Identifier
- CFH0004851, ucf:45451
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004851
- Title
- Learning Dynamic Network Models for Complex Social Systems.
- Creator
-
Hajibagheri, Alireza, Sukthankar, Gita, Turgut, Damla, Chatterjee, Mainak, Lakkaraju, Kiran, University of Central Florida
- Abstract / Description
-
Human societies are inherently complex and highly dynamic, resulting in rapidly changing social networks containing multiple types of dyadic interactions. Analyzing these time-varying multiplex networks with approaches developed for static, single-layer networks often produces poor results. To address this problem, our approach is to explicitly learn the dynamics of these complex networks. This dissertation focuses on five problems: 1) learning link formation rates; 2) predicting changes in community membership; 3) using time series to predict changes in network structure; 4) modeling coevolution patterns across network layers; and 5) extracting information from negative layers of a multiplex network. To study these problems, we created a rich dataset extracted from observing social interactions in the massively multiplayer online game Travian. Most online social media platforms are optimized to support a limited range of social interactions, primarily focusing on communication and information sharing. In contrast, relations in massively multiplayer online games (MMOGs) are often formed during the course of gameplay and evolve as the game progresses. To analyze the players' behavior, we constructed multiplex networks with link types for raid, communication, and trading. The contributions of this dissertation include 1) extensive experiments on the dynamics of networks formed from diverse social processes; 2) new game-theoretic models for community detection in dynamic networks; and 3) supervised and unsupervised methods for link prediction in multiplex coevolving networks for both positive and negative links. We demonstrate that our holistic approach for modeling network dynamics in coevolving, multiplex networks outperforms factored methods that separately consider temporal and cross-layer patterns.
- Date Issued
- 2017
- Identifier
- CFE0006598, ucf:51306
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006598
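The record above represents player interactions as a multiplex network (raid, communication, trade layers) and predicts links using information across layers. Below is a minimal networkx sketch: each layer is a separate graph, and candidate links in one layer are scored by neighborhood overlap in another. The toy layers, node names, and the Jaccard scorer are illustrative assumptions, not the dissertation's models.

```python
# Sketch: a toy multiplex network as one graph per layer, scoring potential "trade" links
# by Jaccard overlap of neighborhoods in the "communication" layer. Illustrative only.
import networkx as nx

layers = {
    "communication": nx.Graph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")]),
    "trade": nx.Graph([("a", "b")]),
}

def cross_layer_jaccard(src_layer, u, v):
    """Score a candidate (u, v) edge using neighborhoods in src_layer."""
    nu = set(src_layer[u]) if u in src_layer else set()
    nv = set(src_layer[v]) if v in src_layer else set()
    union = nu | nv
    return len(nu & nv) / len(union) if union else 0.0

comm, trade = layers["communication"], layers["trade"]
candidates = [(u, v) for u in comm for v in comm if u < v and not trade.has_edge(u, v)]
scores = {(u, v): cross_layer_jaccard(comm, u, v) for u, v in candidates}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))   # likely new trade ties
```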
- Title
- Constructing and Validating an Integrative Economic Model of Health Care Systems and Health Care Markets: A Comparative Analysis of OECD Countries.
- Creator
-
Helligso, Jesse, Wan, Thomas, Liu, Albert Xinliang, King, Christian, Hamann, Kerstin, University of Central Florida
- Abstract / Description
-
This dissertation argues that there are three basic types of health care systems used in industrial nations: free market (private insurance and provision), universal (public insurance and private provision), and socialized (public insurance and provision). It examines the role of market forces (supply and demand) within these health care systems and their effects on health outcomes by constructing an integrative model of health care markets and policies that is lacking in the scientific and academic literature. The results show that free market systems have decreased access to care, good quality of care, and are economically inefficient, resulting in 2.7 years of life expectancy lost and wasted expenditures (expenditures that do not increase life expectancy) of $3474 per capita ($1.12 trillion per year in the U.S.). Socialized systems are the most economically efficient but have decreased access to care compared to universal systems, increased access to care compared to free market systems, and the lowest quality of care of all three systems, resulting in 3 months of life expectancy lost per capita and a saving of $335 per capita. Universal systems perform better than either of the other two systems on quality and access to care. The models show that health insurance is a Giffen good, a good that defies the law of demand; this study is the first fully demonstrated case of a Giffen good. This investigation shows how the theoretically informed integrative model behaves as predicted and influences health outcomes contingent upon the system type. To test and substantiate this integrative model, regression analysis, time-series cross-section analysis, and structural equation modeling were performed using longitudinal data provided and standardized by the Organization for Economic Cooperation and Development (OECD). The results demonstrate that universal health care systems are superior to the other two systems.
- Date Issued
- 2018
- Identifier
- CFE0007335, ucf:52114
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007335