Title
-
Motivational Factors and Barriers Affecting Seniors' Decision to Relocate to a Senior Living Facility.
-
Creator
-
Chaulagain, Suja, Pizam, Abraham, Wang, Youcheng, Severt, Denver, Oetjen, Reid, University of Central Florida
-
Abstract / Description
-
This study aimed to explore factors affecting seniors' intention to relocate to a senior living facility (SLF). More specifically, the purpose of this study was to examine the influence of push and pull motivational factors and perceived barriers on seniors' intention to relocate to an SLF. In addition, the mediating role of perceived barriers in the relationships between push motivational factors and intention to relocate, and between pull motivational factors and intention to relocate, was explored. The data for the study were collected from 363 seniors, and structural equation modeling (SEM) analysis was conducted to test the study hypotheses. The results indicated that the health-related, social and family/friend-related, and housing and property-related push motivational factors and the facility-related pull motivational factor positively influenced seniors' intention to relocate to SLFs. The study results also revealed that family-related barriers, economic barriers, socio-psychological barriers, and knowledge and information barriers negatively affected seniors' intention to relocate to SLFs. In terms of mediation effects, the results indicated that (1) family-related barriers mediated the positive relationship between the health-related push motivational factor and intention to relocate; (2) economic barriers mediated the positive relationship between the facility-related pull motivational factor and intention to relocate; (3) socio-psychological barriers mediated the positive relationship between the health-related push motivational factor and intention to relocate; and (4) socio-psychological barriers mediated the positive relationship between the facility-related pull motivational factor and intention to relocate. The findings provide valuable theoretical contributions to the senior living literature and important practical implications for SLF operators, health care facilitators, and government agencies.
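Mediation hypotheses of this kind can be tested with any SEM toolkit. Below is a minimal sketch, assuming hypothetical variable names (push_health, barriers, intention) and a hypothetical survey file, of fitting one mediation path with the open-source semopy package; it illustrates the general method, not the authors' actual model.

```python
# Minimal SEM mediation sketch (illustrative; not the study's code).
# Variable names and the CSV file are hypothetical assumptions.
import pandas as pd
import semopy

# Path a: push motivation -> perceived barriers.
# Paths b and c': mediator and direct effect on intention to relocate.
desc = """
barriers ~ push_health
intention ~ barriers + push_health
"""

data = pd.read_csv("senior_survey.csv")  # hypothetical 363-respondent data set
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())  # the indirect effect is the product of paths a and b
```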
-
Date Issued
-
2019
-
Identifier
-
CFE0007611, ucf:52522
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007611
-
-
Title
-
Spatial Models with Specific Error Structures.
-
Creator
-
Adu, Nathaniel, Richardson, Gary, Mohapatra, Ram, Song, Zixia, Lang, Sheau-Dong, University of Central Florida
-
Abstract / Description
-
The purpose of this dissertation is to study the first-order autoregressive model in the spatial context with specific error structures. We begin by supposing that the error structure has long memory in both the i and the j components. Whenever the model parameters alpha and beta equal one, the limiting distribution of the sequence of normalized Fourier coefficients of the spatial process is shown to be a function of a two-parameter fractional Brownian sheet. This result is used to find the limiting distribution of the periodogram ordinate of the spatial process under the null hypothesis that alpha equals one and beta equals one. We then give the limiting distribution of the normalized Fourier coefficients of the spatial process for both a moving average and an autoregressive error structure. Two cases of autoregressive errors are considered: the first error model is autoregressive in one component, and the second is autoregressive in both components. We show that the normalizing factor needed to ensure convergence in distribution of the sequence of Fourier coefficients differs in each of these three cases: the moving average case and the two autoregressive cases. Finally, a specific case of the functional central limit theorem in the spatial setting is stated and proved, with the assumptions placed on the autocovariance functions. We then discuss some specific examples and provide a test statistic based on the periodogram ordinate.
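For concreteness, a common form of the first-order spatial autoregressive model with parameters alpha and beta is sketched below in LaTeX. This is the doubly geometric specification often used in this literature and is an assumption here, since the abstract does not spell out the exact equation.

```latex
% Doubly geometric first-order spatial autoregressive model (assumed form);
% X_{i,j} is the spatial process and \varepsilon_{i,j} the error field.
\[
  X_{i,j} = \alpha X_{i-1,j} + \beta X_{i,j-1} - \alpha\beta X_{i-1,j-1} + \varepsilon_{i,j}
\]
% Unit-root null hypothesis under which the normalized Fourier coefficients
% converge to a functional of a two-parameter fractional Brownian sheet:
\[
  H_0 \colon \alpha = 1,\ \beta = 1
\]
```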
-
Date Issued
-
2019
-
Identifier
-
CFE0007772, ucf:52385
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007772
-
-
Title
-
CO Florida 2012, A MOVES-Based, Near-Road, Screening Model.
-
Creator
-
Ritner, Mark, Cooper, Charles, Radwan, Ahmed, Randall, Andrew, University of Central Florida
-
Abstract / Description
-
Citizens in the United States are fortunate to have an excellent system of roadways and the affluence with which to afford automobiles. The flexibility of travel on demand for most allows for a variety of lifestyles, assists with conducting business, and contributes to the feeling of freedom that most citizens enjoy. The current vehicle fleet, which is primarily powered by internal combustion engines burning fossil fuels, does however contribute to the deterioration of air quality. This effect is particularly significant in metropolitan areas. Motor vehicle exhaust contains several combustion by-products that are harmful to the environment and, in particular, to human health. The United States Environmental Protection Agency (EPA) and the Federal Highway Administration (FHWA) have selected carbon monoxide (CO) as the air pollutant on which to base guidelines for assessing potential air quality impacts from roadway construction (EPA 1992). The design of roadway networks must consider traffic flows, Level of Service (LOS), cost, and National Ambient Air Quality Standards (NAAQS) requirements. In light of these environmental standards, modeling is necessary to estimate potential future near-road concentrations of CO. This modeling has two aspects: first, determining the rate of pollutant emissions, and second, determining how those pollutants disperse near the road. Obtaining a precise, realistic estimate of near-road CO concentrations under a wide variety of weather and traffic patterns is a potentially huge undertaking. With budgetary constraints in mind, the development of a screening model is appropriate. CO Florida 2012 (COFL2012) is such a model; it uses conservative assumptions to predict worst-case, near-road CO concentrations. Projects that pass a COFL2012 model run do not require additional air quality modeling. Projects that fail a COFL2012 model run may still be viable, but will require additional, detailed modeling and possibly project modifications. COFL2012 uses tables of emission factors (EFs) that were derived from numerous runs of the EPA's MOtor Vehicle Emission Simulator (MOVES2010a), the EPA's preferred model for near-road modeling of CO (EPA 2009). COFL2012 then combines the EFs with assumed link configurations, geographical assumptions, and user-entered traffic information into input files that are run through CAL3QHC Version 2.0 (CAL3QHC2), the EPA's approved near-road dispersion model (EPA 1995). COFL2012 is a brand-new Florida CO screening model, written from scratch. The author wrote the computer code for COFL2012 in Visual Basic, using Microsoft Visual Studio 2010 and the .NET Framework 4. COFL2012 is easy to learn, quick to operate, and has been written to allow for simple future updates whenever the EPA releases updates to the databases that feed MOVES2010a.
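The screening workflow the abstract outlines (conservative EF lookup, dispersion, pass/fail against the standard) can be summarized in a few lines. The sketch below is a hypothetical simplification in Python, not COFL2012 itself: the emission-factor table and the single dispersion factor standing in for a CAL3QHC2 run are invented placeholders; only the 35 ppm figure is the actual 1-hour CO NAAQS.

```python
# Hypothetical worst-case CO screening sketch (not COFL2012 itself).
CO_NAAQS_1HR_PPM = 35.0  # 1-hour CO national ambient air quality standard

# Invented worst-case emission factors, g CO per vehicle-mile, keyed by speed (mph).
EF_TABLE = {25: 12.0, 35: 9.0, 45: 7.5, 55: 6.8}

def link_emission_rate(volume_vph: float, speed_mph: int) -> float:
    """Link emission rate in grams of CO per mile per hour."""
    return EF_TABLE[speed_mph] * volume_vph

def screen(volume_vph: float, speed_mph: int, dispersion_factor: float,
           background_ppm: float = 5.0) -> bool:
    """Worst-case screening check: True means the project passes and needs no
    refined modeling. dispersion_factor (ppm per g/mi/hr) is a placeholder
    for a full CAL3QHC2 dispersion run."""
    predicted = link_emission_rate(volume_vph, speed_mph) * dispersion_factor
    return predicted + background_ppm <= CO_NAAQS_1HR_PPM

print(screen(volume_vph=4000, speed_mph=45, dispersion_factor=2.5e-4))
```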
-
Date Issued
-
2012
-
Identifier
-
CFE0004233, ucf:49011
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004233
-
-
Title
-
Modeling Autocorrelation and Sample Weights in Panel Data: A Monte Carlo Simulation Study.
-
Creator
-
Acharya, Parul, Sivo, Stephen, Hahs-Vaughn, Debbie, Witta, Eleanor, Butler, Malcolm, University of Central Florida
-
Abstract / Description
-
This dissertation investigates the interactive or joint influence of autocorrelative processes (autoregressive (AR), moving average (MA), and autoregressive moving average (ARMA)) and sample weights present in a longitudinal panel data set. Specifically, to what extent are the sample estimates influenced when autocorrelation (which is usually present in panel data having correlated observations and errors) and sample weights (a complex sample design feature used in longitudinal data having a multi-stage sampling design) are modeled, versus when they are not modeled or only one of them is taken into account? The current study utilized a Monte Carlo simulation design to vary the type and magnitude of the autocorrelative processes and sample weights as factors incorporated in growth or latent curve models, to evaluate the effect on the sample latent curve estimates (mean intercept, mean slope, intercept variance, slope variance, and intercept-slope correlation). Various latent curve models with or without weights were specified with an autocorrelative process and then fitted to data sets having either the AR, MA, or ARMA process. The relevance and practical importance of the simulation results were ascertained by testing the joint influence of autocorrelation and weights on the Early Childhood Longitudinal Study, Kindergarten Class (ECLS-K) data set, a panel data set having complex sample design features. The results indicate that autocorrelative processes and weights interact with each other as sources of error to a statistically significant degree. Accounting for just the autocorrelative process without weights, or utilizing weights while ignoring the autocorrelative process, may lead to bias in the sample estimates, particularly in large-scale data sets in which these two sources of error are inherently embedded. The mean intercept and mean slope of latent curve models without weights were consistently underestimated when fitted to data sets having an AR, MA, or ARMA process. On the other hand, the intercept variance, slope variance, and intercept-slope correlation were overestimated for latent curve models with weights; however, these three estimates were not accurate, as their associated standard errors were high. In addition, fit indices, AR and MA estimates, parsimony of the model, behavior of sample latent curve estimates, and interaction effects between autocorrelative processes and sample weights should be assessed for all the models before a particular model is deemed most appropriate. If the AR estimate is high and the MA estimate is low for an LCAR model relative to the other models fitted to a data set having sample weights, and the fit indices are in the acceptable cut-off range, then the data set has a higher likelihood of having an AR process between the observations. If the MA estimate is high and the AR estimate is low for an LCMA model relative to the other models, and the fit indices are in the acceptable cut-off range, then the data set has a higher likelihood of having an MA process between the observations. If both the AR and MA estimates are high for an LCARMA model relative to the other models, and the fit indices are in the acceptable cut-off range, then the data set has a higher likelihood of having an ARMA process between the observations.
The results of the current study suggest that biases from both autocorrelation and sample weights need to be simultaneously modeled to obtain accurate estimates. The type of autocorrelation (AR, MA, or ARMA), the magnitude of autocorrelation, and the sample weights all influence the behavior of the estimates, and all three facets should be carefully considered to correctly interpret the estimates, especially in the context of measuring growth or change in the variable(s) of interest over time in large-scale longitudinal panel data sets.
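One cell of a simulation like this is easy to sketch. The snippet below, a minimal illustration and not the dissertation's actual design, generates panel data from a linear latent growth model whose level-1 errors follow an AR(1) process and then recovers the mean intercept and slope with naive pooled OLS; all parameter values are invented.

```python
# One illustrative Monte Carlo cell: latent growth data with AR(1) errors,
# fitted naively (ignoring the error structure and any sample weights).
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_waves = 500, 5
beta0, beta1, phi = 10.0, 2.0, 0.6  # mean intercept, mean slope, AR(1) coefficient

t = np.arange(n_waves)
u = rng.normal(0.0, 1.0, n_subjects)   # random intercepts (latent growth factor)
v = rng.normal(0.0, 0.5, n_subjects)   # random slopes (latent growth factor)

# AR(1) level-1 errors: e_t = phi * e_{t-1} + w_t
e = np.zeros((n_subjects, n_waves))
e[:, 0] = rng.normal(0.0, 1.0, n_subjects)
for k in range(1, n_waves):
    e[:, k] = phi * e[:, k - 1] + rng.normal(0.0, 1.0, n_subjects)

y = (beta0 + u)[:, None] + (beta1 + v)[:, None] * t + e

# Naive pooled OLS on (t, y), ignoring the AR(1) structure and weights.
X = np.column_stack([np.ones(n_subjects * n_waves), np.tile(t, n_subjects)])
coef, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
print("estimated mean intercept/slope:", coef)  # compare with (10.0, 2.0)
```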
-
Date Issued
-
2015
-
Identifier
-
CFE0005914, ucf:50850
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005914
-
-
Title
-
THE DEVELOPMENT OF THE UNIVERSITY OF CENTRAL FLORIDA HOME MOVIE ARCHIVE AND THE HARRIS ROSEN COLLECTION.
-
Creator
-
Niedermeyer, Michael, Gordon, Fon, University of Central Florida
-
Abstract / Description
-
Since the invention of the cinema, people have been taking home movies. The ever-increasing popularity of this activity has produced a hundred years' worth of amateur film culture which is in desperate need of preservation. As film archiving and public history have coalesced in the past thirty years around the idea that every person's history is important, home movies represent a way for those histories to be preserved and studied by communities and researchers alike. The University of Central Florida is in a perfect position to establish an archive of this nature, one that is specifically dedicated to acquiring, preserving, and presenting the home movies of Central Florida residents. This project has resulted in the establishment of The Central Florida Home Movie Archive, and the resulting analysis will show that the archive will be a benefit for researchers from all areas of academic study as well as the residents of Central Florida.
-
Date Issued
-
2010
-
Identifier
-
CFE0003432, ucf:48410
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003432
-
-
Title
-
Near-road Dispersion Modeling of Mobile Source Air Toxics (MSATs) in Florida.
-
Creator
-
Westerlund, Kurt, Cooper, Charles, Radwan, Ahmed, Randall, Andrew, Hall, Steven, University of Central Florida
-
Abstract / Description
-
There is a growing public concern that emissions of mobile source air toxics (MSATs) from motor vehicles may pose a threat to human health. At present, no state or federal agencies require dispersion modeling of these compounds, but many agencies are concerned about potential future requirements. Current air pollution professionals are familiar with Federal Highway Administration (FHWA) and U.S. Environmental Protection Agency (EPA) requirements for dispersion modeling to produce predicted concentrations for comparison with appropriate standards. This research examined a method by which potential near-road concentrations of MSATs can be calculated. By assessing MSATs in much the same way as other pollutants, the model and methods developed in this research could become a standard for those quantifying MSAT concentrations near roadways. This dissertation reports results from short-term (1-hour) and long-term (annual average) MSAT dispersion modeling conducted on seven intersections and seven freeway segments in the state of Florida. To accomplish the modeling, the CAL3QHC model was modified to handle individual MSAT emissions input data and to predict the concentrations of several MSATs around these roadway facilities. Additionally, since the CAL3MSAT model is DOS-based and not user-friendly, time was invested to develop a Windows® graphical user interface (GUI). Real-world data (traffic volumes and site geometry) were gathered, worst-case meteorology was selected, mobile source emission factors (EFs) were obtained from MOVES2010a, and worst-case modeling was conducted. Based on a literature search, maximum acceptable concentrations (MACs) were proposed for comparison with the modeled results, for both a short-term (1-hour) averaging time and a long-term (1-year) averaging time. Results from this CAL3MSAT modeling study indicate that, for all of the intersections and freeway segments, the worst-case 1-hour modeled concentrations of the MSATs were several orders of magnitude below the proposed short-term MACs. The worst-case 1-year modeled concentrations were of the same order of magnitude as the proposed long-term MACs. The 1-year concentrations were first developed by applying a persistence factor to the worst-case 1-hour concentrations. In the interest of comparing the predicted concentrations from the CAL3MSAT persistence-factor approach to other dispersion models, two EPA regulatory models (CAL3QHCR and AERMOD) with the ability to account for yearly meteorology, traffic, and signal timing were used. Both hourly and annual MSAT concentrations were predicted at one large urban intersection and compared across the three different dispersion models. The short-term 1-hour results from CAL3MSAT were higher than those predicted by the two other models due to the worst-case assumptions. Similarly, the CAL3MSAT persistence-factor approach predicted a worst-case annual average concentration on the same order of magnitude as the two other, more refined models. This indicates that the CAL3MSAT model may be useful as a worst-case screening approach.
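The persistence-factor step is simple enough to state in code. The sketch below is a hypothetical illustration, not the dissertation's procedure: the pollutant, the 1-hour concentration, the long-term MAC, and the persistence factor of 0.1 are all invented numbers.

```python
# Hypothetical persistence-factor screening sketch (illustrative values only).
def annual_from_worst_hour(c_1hr_ugm3: float, persistence_factor: float = 0.1) -> float:
    """Annual-average estimate as a fraction of the worst-case 1-hour value."""
    return c_1hr_ugm3 * persistence_factor

c_benzene_1hr = 4.2   # invented worst-case 1-hour benzene concentration, ug/m^3
mac_long_term = 0.5   # invented long-term maximum acceptable concentration, ug/m^3

c_annual = annual_from_worst_hour(c_benzene_1hr)
print(f"annual estimate {c_annual:.2f} ug/m^3 -> "
      f"{'pass' if c_annual <= mac_long_term else 'needs refined modeling'}")
```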
-
Date Issued
-
2013
-
Identifier
-
CFE0004772, ucf:49804
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004772
-
-
Title
-
Environmental Study of Solid Waste Collection.
-
Creator
-
Maimoun, Mousa, Reinhart, Debra, Mccauley Bush, Pamela, Cooper, Charles, University of Central Florida
-
Abstract / Description
-
The growing municipal solid waste generation rates have necessitated more efficient, optimized waste collection facilities. The majority of the US collection fleet is composed of diesel-fueled vehicles, which contribute significant atmospheric emissions, including greenhouse gases. In order to reduce emissions to the atmosphere, more collection agencies are investigating alternative fuel technologies such as natural gas, biofuels (bio-gas and bio-diesel), and hybrid electric technology. This research is an in-depth environmental analysis of potential alternative fuel technologies for waste collection vehicles. The study evaluates the use of alternative fuels by waste collection vehicles: life-cycle emissions, cost, and fuel and energy consumption were evaluated for a wide range of fossil and bio-fuel technologies. Moreover, the energy consumption and tail-pipe emissions of diesel-fueled waste collection vehicles were estimated using MOVES2010a software, and emission factors were calculated for a typical waste collection driving cycle as well as for constant-speed driving. Finally, the selection of fuel type by the waste collection industry requires consideration of environmental, security, financial, operational, and safety issues. In this study, a qualitative comparison between alternative fuels was performed; a multifactorial assessment of these factors was conducted, taking into account the waste collection industry's opinion of the importance of each factor. Liquid-petroleum fuels have higher life-cycle emissions than natural gas, and landfill natural gas has the lowest life-cycle emissions of all the fuel categories considered. Compressed natural gas waste collection vehicles have the lowest fuel cost per collection-vehicle mile traveled of the fuel categories considered. Moreover, the actual driving cycle of waste collection vehicles consists of repetitive stops and starts during collection, which generates more emissions than constant-speed driving. Finally, the multifactorial assessment indicates that natural gas and landfill gas have better environmental, economic, and energy-security performance than current liquid-petroleum fuels.
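A multifactorial assessment of this kind is often implemented as a weighted score across criteria. The sketch below is a generic illustration under invented weights and scores, not the study's actual data or method.

```python
# Generic weighted multi-criteria fuel comparison (invented numbers).
weights = {"environmental": 0.30, "cost": 0.25, "energy_security": 0.20,
           "operational": 0.15, "safety": 0.10}

# Criterion scores on a 1-5 scale (higher is better), hypothetical.
fuels = {
    "diesel":       {"environmental": 2, "cost": 3, "energy_security": 2,
                     "operational": 5, "safety": 4},
    "CNG":          {"environmental": 4, "cost": 4, "energy_security": 4,
                     "operational": 4, "safety": 3},
    "landfill_gas": {"environmental": 5, "cost": 4, "energy_security": 5,
                     "operational": 3, "safety": 3},
}

for fuel, scores in fuels.items():
    total = sum(weights[c] * s for c, s in scores.items())
    print(f"{fuel:13s} weighted score = {total:.2f}")
```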
-
Date Issued
-
2011
-
Identifier
-
CFE0004133, ucf:49115
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004133
-
-
Title
-
EXPLOITING OPPONENT MODELING FOR LEARNING IN MULTI-AGENT ADVERSARIAL GAMES.
-
Creator
-
Laviers, Kennard, Sukthankar, Gita, University of Central Florida
-
Abstract / Description
-
An issue with learning effective policies in multi-agent adversarial games is that the size of the search space can be prohibitively large when the actions of both teammates and opponents are considered simultaneously. Opponent modeling, predicting an opponent's actions in advance of execution, is one approach for selecting actions in adversarial settings, but it is often performed in an ad hoc way. In this dissertation, we introduce several methods for using opponent modeling, in the form of predictions about the players' physical movements, to learn team policies. To explore the problem of decision-making in multi-agent adversarial scenarios, we use our approach for both offline play generation and real-time team response in the Rush 2008 American football simulator. Simultaneously predicting the movement trajectories, future reward, and play strategies of multiple players in real time is a daunting task, but we illustrate how it is possible to divide and conquer this problem with an assortment of data-driven models. By leveraging spatio-temporal traces of player movements, we learn discriminative models of defensive play for opponent modeling. With the reward information from previous play matchups, we use a modified version of UCT (Upper Confidence Bounds applied to Trees) to create new offensive plays and to learn play repairs to counter predicted opponent actions. In team games, players must coordinate effectively to accomplish tasks while foiling their opponents, either in a preplanned or an emergent manner. An effective team policy must generate the necessary coordination, yet considering all possibilities for creating coordinating subgroups is computationally infeasible. Automatically identifying and preserving the coordination between key subgroups of teammates can make search more productive by pruning policies that disrupt these relationships. We demonstrate that combining opponent modeling with automatic subgroup identification can be used to create team policies with a higher average yardage than either the baseline game or domain-specific heuristics.
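At the core of UCT is the UCB1 rule for picking which child of a search node to explore next. The snippet below is a generic illustration of that selection step, not the authors' modified algorithm; the (mean yardage, visit count) pairs are invented.

```python
# Generic UCB1 child selection, the rule underlying UCT (illustrative only).
import math

def ucb1_select(children):
    """children: list of (mean_reward, visit_count) pairs; returns the index
    of the child to explore next."""
    total_visits = sum(n for _, n in children)
    def score(child):
        mean, n = child
        if n == 0:
            return float("inf")  # always try unvisited actions first
        # exploitation term plus exploration bonus
        return mean + math.sqrt(2.0 * math.log(total_visits) / n)
    return max(range(len(children)), key=lambda i: score(children[i]))

# e.g., three candidate plays as (average yardage gained, times tried)
print(ucb1_select([(4.1, 20), (5.0, 5), (3.2, 1)]))  # -> 1
```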
-
Date Issued
-
2011
-
Identifier
-
CFE0003914, ucf:48720
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003914
-
-
Title
-
INTELLIGENT DESIGN.
-
Creator
-
Dudziak, Jillian, Poindexter, Carla, University of Central Florida
-
Abstract / Description
-
As human beings we are designed and created in a fabric that is profound and complex. We are built with a framework where mind and body work in a concerted effort to maintain our lives automatically. A deep and defining part of our existence as humans is not just the innate desire to live, but to live in consistent well-being: emotionally, physically, and mentally. I believe that when we incorporate our knowledge of human physiology into our creative process, we allow ourselves a greater opportunity to create an authentic connection with our intended audience. My work during the past three years has been rooted in the study of these philosophical and scientific principles. I created a series of visual experimentations that aim to assist in my understanding of human beings at an emotional and biological level. Armed with a deep desire to understand humanity, my goal is to create work that fosters positive change and has significant impact in the world. My past and present research has been focused on human emotions, the intuitive creative process, and the relationship between technology and establishing social identity.
-
Date Issued
-
2011
-
Identifier
-
CFE0003693, ucf:48844
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003693
-
-
Title
-
A MODEL INTEGRATED MESHLESS SOLVER (MIMS) FOR FLUID FLOW AND HEAT TRANSFER.
-
Creator
-
Gerace, Salvadore, Kassab, Alain, University of Central Florida
-
Abstract / Description
-
Numerical methods for solving partial differential equations are commonplace in the engineering community, and their popularity can be attributed to the rapid performance improvement of modern workstations and desktop computers. The ubiquity of computer technology has allowed all areas of engineering to have access to detailed thermal, stress, and fluid flow analysis packages capable of performing complex studies of current and future designs. The rapid pace of computer development, however, has begun to outstrip efforts to reduce analysis overhead. As such, most commercially available software packages are now limited by the human effort required to prepare, develop, and initialize the necessary computational models. Primarily due to the mesh-based analysis methods utilized in these software packages, the dependence on model preparation greatly limits the accessibility of these analysis tools. In response, the so-called meshless or mesh-free methods have seen considerable interest, as they promise to greatly reduce the necessary human interaction during model setup. However, despite the success of these methods in areas demanding high degrees of model adaptability (such as crack growth, multi-phase flow, and solid friction), meshless methods have yet to gain acceptance as a viable alternative to more traditional solution approaches in general solution domains. Although this may be due (at least in part) to the relative youth of the techniques, another potential cause is the lack of focus on developing robust methodologies. The failure to approach development from a practical perspective has prevented researchers from obtaining commercially relevant meshless methodologies which reach the full potential of the approach. The primary goal of this research is to present a novel meshless approach called MIMS (Model Integrated Meshless Solver) which establishes the method as a generalized solution technique capable of competing with more traditional PDE methodologies (such as the finite element and finite volume methods). This was accomplished by developing a robust meshless technique as well as a comprehensive model generation procedure. By closely integrating the model generation process into the overall solution methodology, the presented techniques are able to fully exploit the strengths of the meshless approach to achieve levels of automation, stability, and accuracy currently unseen in the area of engineering analysis. Specifically, MIMS implements a blended meshless solution approach which utilizes a variety of shape functions to obtain a stable and accurate iteration process. This solution approach is then integrated with a newly developed, highly adaptive model generation process which employs a quaternary triangular surface discretization for the boundary, a binary-subdivision discretization for the interior, and a unique shadow layer discretization for near-boundary regions. Together, these discretization techniques are able to achieve directionally independent, automatic refinement of the underlying model, allowing the method to generate accurate solutions without need for intermediate human involvement. In addition, by coupling the model generation with the solution process, the presented method is able to address the issue of ill-constructed geometric input (small features, poorly formed faces, etc.) to provide an intuitive, yet powerful approach to solving modern engineering analysis problems.
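To make the meshless idea concrete, the sketch below shows the classic building block such solvers rest on: radial basis function (RBF) collocation, here Kansa's method on a 1-D Poisson problem. It is a generic textbook illustration under assumed parameters, not MIMS's blended shape-function scheme.

```python
# Kansa-style RBF collocation for u''(x) = f(x), u(0) = u(1) = 0
# (generic illustration; not the dissertation's method).
import numpy as np

n, c = 25, 0.2                      # collocation points, multiquadric shape parameter
x = np.linspace(0.0, 1.0, n)
f = lambda x: -np.pi**2 * np.sin(np.pi * x)   # exact solution: sin(pi x)

r2 = (x[:, None] - x[None, :])**2
phi = np.sqrt(r2 + c**2)            # multiquadric RBF
d2phi = c**2 / phi**3               # its second derivative w.r.t. the field point

A = d2phi.copy()                    # interior rows enforce u'' = f
A[0, :], A[-1, :] = phi[0, :], phi[-1, :]   # boundary rows enforce u = 0
b = f(x)
b[0] = b[-1] = 0.0

w = np.linalg.solve(A, b)           # RBF weights
u = phi @ w                         # solution at the collocation points
print("max error:", np.max(np.abs(u - np.sin(np.pi * x))))
```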
-
Date Issued
-
2010
-
Identifier
-
CFE0003299, ucf:48489
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003299
-
-
Title
-
Microscopic Assessment of Transportation Emissions on Limited Access Highways.
-
Creator
-
Abou-Senna, Hatem, Radwan, Ahmed, Abdel-Aty, Mohamed, Al-Deek, Haitham, Cooper, Charles, Johnson, Mark, University of Central Florida
-
Abstract / Description
-
On-road vehicles are a major source of transportation carbon dioxide (CO2) greenhouse gas emissions in all developed countries, and in many developing countries, in the world. Similarly, several criteria air pollutants are associated with transportation, e.g., carbon monoxide (CO), nitrogen oxides (NOx), and particulate matter (PM). The need to accurately quantify transportation-related emissions from vehicles is essential. Transportation agencies and researchers in the past have estimated emissions using one average speed and volume on a long stretch of roadway. With MOVES, there is an opportunity for higher precision and accuracy: integrating a microscopic traffic simulation model (such as VISSIM) with MOVES allows one to obtain precise and accurate emissions estimates. The new United States Environmental Protection Agency (USEPA) mobile source emissions model, MOVES2010a (MOVES), can estimate vehicle emissions on a second-by-second basis, creating the opportunity to develop new software, "VIMIS 1.0" (VISSIM/MOVES Integration Software), to facilitate the integration process. This research presents a microscopic examination of five key transportation parameters (traffic volume, speed, truck percentage, road grade, and temperature) on a 10-mile stretch of Interstate 4 (I-4), an urban limited access highway corridor in Orlando, Florida, used as a test bed prototype. The analysis was conducted utilizing VIMIS 1.0 and an advanced custom design technique, using the D-optimality and I-optimality criteria, to identify active factors and to ensure precision in estimating the regression coefficients as well as the response variable. The analysis of the experiment identified the optimal settings of the key factors and resulted in the development of Micro-TEM (Microscopic Transportation Emissions Meta-Model). The main purpose of Micro-TEM is to serve as a substitute model for predicting transportation emissions on limited access highways to an acceptable degree of accuracy, in lieu of running simulations using a traffic model and integrating the results in an emissions model. Furthermore, significant emission rate reductions were observed from the experiment on the modeled corridor, especially for speeds between 55 and 60 mph, while maintaining up to 80% and 90% of the freeway's capacity. However, vehicle activity characterization in terms of speed was shown to have a significant impact on the emission estimation approach. Four different approaches were further examined to capture the environmental impacts of vehicular operations on the modeled test bed prototype. First (at the most basic level), emissions were estimated for the entire 10-mile section "by hand" using one average traffic volume and average speed. Then, three advanced levels of detail were studied using VISSIM/MOVES to analyze smaller links: average speeds and volumes (AVG), second-by-second link driving schedules (LDS), and second-by-second operating mode distributions (OPMODE). This research analyzed how the various approaches affect predicted emissions of CO, NOx, PM, and CO2. The results demonstrated that obtaining accurate and comprehensive operating mode distributions on a second-by-second basis improves emission estimates. Specifically, emission rates were found to be highly sensitive to stop-and-go traffic and the associated driving cycles of acceleration, deceleration, frequent braking/coasting, and idling.
Using the AVG or LDS approach may overestimate or underestimate emissions, respectively, compared to an operating mode distribution approach. Additionally, model applications and mitigation scenarios were examined on the modeled corridor to evaluate the environmental impacts in terms of vehicular emissions and, at the same time, validate the developed model, Micro-TEM. Mitigation scenarios included the future implementation of managed lanes (ML) along with the general use lanes (GUL) on the I-4 corridor, the currently implemented variable speed limits (VSL) scenario, as well as a hypothetical restricted truck lane (RTL) scenario. Results of the mitigation scenarios showed an overall speed improvement on the corridor, which resulted in an overall reduction in emissions and emission rates when compared to the existing condition (EX) scenario, specifically on a link-by-link basis for the RTL scenario. The proposed emission rate estimation process also can be extended to gridded emissions for ozone modeling, or to localized air quality dispersion modeling, where temporal and spatial resolution of emissions is essential to predict the concentration of pollutants near roadways.
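MOVES assigns each second of vehicle activity to an operating mode bin largely on the basis of speed and vehicle specific power (VSP). The sketch below is a simplified illustration of that binning idea using the commonly cited light-duty VSP approximation; the bin boundaries here are invented and much coarser than MOVES's actual operating-mode definitions.

```python
# Simplified second-by-second operating-mode binning from a speed trace
# (illustrative; not MOVES's full operating-mode definition).
def vsp_kw_per_tonne(v_ms: float, a_ms2: float) -> float:
    """Commonly cited light-duty VSP approximation (v in m/s, a in m/s^2),
    assuming level grade."""
    return v_ms * (1.1 * a_ms2 + 0.132) + 0.000302 * v_ms**3

def op_mode(v_ms: float, a_ms2: float) -> str:
    if v_ms < 0.5:
        return "idle"
    if a_ms2 < -0.9:
        return "braking"
    return "low power" if vsp_kw_per_tonne(v_ms, a_ms2) < 6.0 else "high power"

speeds = [0.0, 3.0, 8.0, 14.0, 14.0, 9.0]   # m/s, one value per second
for k in range(1, len(speeds)):
    a = speeds[k] - speeds[k - 1]           # acceleration over the 1-second step
    print(k, op_mode(speeds[k], a), round(vsp_kw_per_tonne(speeds[k], a), 2))
```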
-
Date Issued
-
2012
-
Identifier
-
CFE0004777, ucf:49788
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004777
-
-
Title
-
Robust Subspace Estimation Using Low-Rank Optimization. Theory and Applications in Scene Reconstruction, Video Denoising, and Activity Recognition.
-
Creator
-
Oreifej, Omar, Shah, Mubarak, Da Vitoria Lobo, Niels, Stanley, Kenneth, Lin, Mingjie, Li, Xin, University of Central Florida
-
Abstract / Description
-
In this dissertation, we discuss the problem of robust linear subspace estimation using low-rank optimization and propose three formulations of it. We demonstrate how these formulations can be used to solve fundamental computer vision problems, and provide superior performance in terms of accuracy and running time. Consider a set of observations extracted from images (such as pixel gray values, local features, trajectories, etc.). If the assumption that these observations are drawn from a linear subspace (or can be linearly approximated) is valid, then the goal is to represent each observation as a linear combination of a compact basis, while maintaining a minimal reconstruction error. One of the earliest, yet most popular, approaches to achieve that is Principal Component Analysis (PCA). However, PCA can only handle Gaussian noise, and thus suffers when the observations are contaminated with gross and sparse outliers. To this end, in this dissertation, we focus on estimating the subspace robustly using low-rank optimization, where the sparse outliers are detected and separated through the L1 norm. The robust estimation has a two-fold advantage: first, the obtained basis better represents the actual subspace because it does not include contributions from the outliers; second, the detected outliers are often of specific interest in many applications, as we show throughout this thesis. We demonstrate four different formulations and applications of low-rank optimization. First, we consider the problem of reconstructing an underwater sequence by removing the turbulence caused by the water waves. The main drawback of most previous attempts to tackle this problem is that they heavily depend on modelling the waves, which in fact is ill-posed since the actual behavior of the waves, along with the imaging process, is complicated and includes several noise components; therefore, their results are not satisfactory. In contrast, we propose a novel approach which outperforms the state of the art. The intuition behind our method is that in a sequence where the water is static, the frames would be linearly correlated. Therefore, in the presence of water waves, we may consider the frames as noisy observations drawn from the subspace of linearly correlated frames. However, the noise introduced by the water waves is not sparse, and thus cannot directly be detected using low-rank optimization. Therefore, we propose a data-driven two-stage approach, where the first stage "sparsifies" the noise and the second stage detects it. The first stage leverages the temporal mean of the sequence to overcome the structured turbulence of the waves through an iterative registration algorithm. The result of the first stage is a high-quality mean and a better-structured sequence; however, the sequence still contains unstructured sparse noise. Thus, we employ a second stage in which we extract the sparse errors from the sequence through rank minimization. Our method converges faster, and drastically outperforms the state of the art on all testing sequences.
Secondly, we consider a closely related situation where an independently moving object is also present in the turbulent video. More precisely, we consider video sequences acquired in desert battlefields, where atmospheric turbulence is typically present in addition to independently moving targets. Typical approaches for turbulence mitigation follow averaging or de-warping techniques. Although these methods can reduce the turbulence, they distort the independently moving objects, which can often be of great interest. Therefore, we address the problem of simultaneous turbulence mitigation and moving object detection. We propose a novel three-term low-rank matrix decomposition approach in which we decompose the turbulence sequence into three components: the background, the turbulence, and the object. We simplify this extremely difficult problem into a minimization of nuclear norm, Frobenius norm, and L1 norm. Our method is based on two observations: first, the turbulence causes dense and Gaussian noise and therefore can be captured by the Frobenius norm, while the moving objects are sparse and thus can be captured by the L1 norm; second, since the object's motion is linear and intrinsically different from the Gaussian-like turbulence, a Gaussian-based turbulence model can be employed to enforce an additional constraint on the search space of the minimization. We demonstrate the robustness of our approach on challenging sequences which are significantly distorted with atmospheric turbulence and include extremely tiny moving objects. In addition to robustly detecting the subspace of the frames of a sequence, we consider using trajectories as observations in the low-rank optimization framework. In particular, in videos acquired by moving cameras, we track all the pixels in the video and use that to estimate the camera motion subspace. This is particularly useful in activity recognition, which typically requires standard preprocessing steps such as motion compensation, moving object detection, and object tracking. The errors from the motion compensation step propagate to the object detection stage, resulting in miss-detections, which further complicate the tracking stage, resulting in cluttered and incorrect tracks. In contrast, we propose a novel approach which does not follow the standard steps, and accordingly avoids the aforementioned difficulties. Our approach is based on Lagrangian particle trajectories, which are a set of dense trajectories obtained by advecting optical flow over time, thus capturing the ensemble motions of a scene. This is done in frames of unaligned video, and no object detection is required. In order to handle the moving camera, we decompose the trajectories into their camera-induced and object-induced components. Having obtained the relevant object motion trajectories, we compute a compact set of chaotic invariant features, which captures the characteristics of the trajectories. Consequently, an SVM is employed to learn and recognize the human actions using the computed motion features. We performed intensive experiments on multiple benchmark datasets, and obtained promising results.
Finally, we consider a more challenging problem referred to as complex event recognition, where the activities of interest are complex and unconstrained. This problem typically poses significant challenges because it involves videos of highly variable content, noise, length, frame size, etc. In this extremely challenging task, high-level features have recently shown a promising direction, as in [53, 129], where core low-level events referred to as concepts are annotated and modeled using a portion of the training data, and each event is then described using its content of these concepts. However, because of the complex nature of the videos, both the concept models and the corresponding high-level features are significantly noisy. In order to address this problem, we propose a novel low-rank formulation which combines the precisely annotated videos used to train the concepts with the rich high-level features. Our approach finds a new representation for each event which is not only low-rank, but also constrained to adhere to the concept annotation, thus suppressing the noise and maintaining a consistent occurrence of the concepts in each event. Extensive experiments on the large-scale real-world TRECVID Multimedia Event Detection 2011 and 2012 datasets demonstrate that our approach consistently improves the discriminativity of the high-level features by a significant margin.
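The low-rank plus sparse separation underlying these applications is commonly solved by principal component pursuit. The sketch below is a generic textbook implementation via a simple inexact augmented Lagrangian iteration, not the dissertation's three-term decomposition; the synthetic test matrix and all parameter choices are illustrative.

```python
# Generic Robust PCA (principal component pursuit) via an inexact augmented
# Lagrangian iteration; illustrative, not the dissertation's formulation.
import numpy as np

def shrink(X, tau):
    """Soft-thresholding: proximal operator of the L1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_shrink(X, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(shrink(s, tau)) @ Vt

def rpca(D, n_iter=200):
    """Decompose D into a low-rank part L plus a sparse part S."""
    m, n = D.shape
    lam = 1.0 / np.sqrt(max(m, n))          # standard sparsity weight
    mu = m * n / (4.0 * np.sum(np.abs(D)))  # standard penalty parameter
    L = np.zeros_like(D); S = np.zeros_like(D); Y = np.zeros_like(D)
    for _ in range(n_iter):
        L = svd_shrink(D - S + Y / mu, 1.0 / mu)
        S = shrink(D - L + Y / mu, lam / mu)
        Y += mu * (D - L - S)               # dual update on the residual
    return L, S

# Synthetic check: rank-2 matrix plus 5% sparse outliers.
rng = np.random.default_rng(1)
L0 = rng.normal(size=(80, 2)) @ rng.normal(size=(2, 80))
S0 = (rng.random((80, 80)) < 0.05) * 10.0
L, S = rpca(L0 + S0)
print("low-rank recovery error:", np.linalg.norm(L - L0) / np.linalg.norm(L0))
```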
-
Date Issued
-
2013
-
Identifier
-
CFE0004732, ucf:49835
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004732