- Title
- An Engineering Analytics Based Framework for Computational Advertising Systems.
- Creator
-
Chen, Mengmeng, Rabelo, Luis, Lee, Gene, Keathley, Heather, Rahal, Ahmad, University of Central Florida
- Abstract / Description
-
Engineering analytics is a multifaceted landscape with a diversity of analytics tools that come from emerging fields such as big data, machine learning, and traditional operations research. Industrial engineering is capable of optimizing complex processes and systems using engineering analytics elements together with traditional components such as total quality management. This dissertation has proven that industrial engineering using engineering analytics can optimize the emerging area of Computational Advertising. The key was to know the different fields well and make the right selection. However, one must first understand, in depth, the flow of the complex application of Computational Advertising and, based on the characteristics of each step, map the right field of engineering analytics or traditional industrial engineering to it, and then build the apparatus and apply it to the respective problem in question. This dissertation consists of four research papers addressing the development of a framework to tame the complexity of computational advertising and improve its usage efficiency from an advertiser's viewpoint. This new framework and its respective systems architecture combine the use of support vector machines, Recurrent Neural Networks, Deep Learning Neural Networks, traditional neural networks, Game Theory/Auction Theory with Generative Adversarial Networks, and Web Engineering to optimize the computational advertising bidding process and achieve a higher rate of return. The system is validated with an actual case study involving commercial providers such as Google AdWords and an advertiser's budget of several million dollars.
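As a rough, hypothetical illustration of the bid-optimization idea summarized in the abstract above, the sketch below bids the expected value of an impression in a second-price auction. The probabilities, dollar values, and function names are invented for illustration; this is not the dissertation's actual framework or its machine-learning models.

```python
# Hypothetical sketch: expected-value bidding in a second-price auction.
# The conversion probability, value per conversion, and bid cap are
# illustrative assumptions, not values from the framework described above.

def expected_value_bid(p_conversion: float, value_per_conversion: float,
                       max_bid: float) -> float:
    """Bid the expected value of the impression, capped by a budget policy."""
    expected_value = p_conversion * value_per_conversion
    return min(expected_value, max_bid)

def second_price_outcome(our_bid: float, competing_bids: list[float]) -> tuple[bool, float]:
    """Return (won, price_paid); the winner pays the highest competing bid."""
    highest_other = max(competing_bids) if competing_bids else 0.0
    won = our_bid > highest_other
    return won, (highest_other if won else 0.0)

if __name__ == "__main__":
    bid = expected_value_bid(p_conversion=0.02, value_per_conversion=40.0, max_bid=1.50)
    won, price = second_price_outcome(bid, competing_bids=[0.55, 0.72, 0.60])
    print(f"bid={bid:.2f}, won={won}, price_paid={price:.2f}")
```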
- Date Issued
- 2018
- Identifier
- CFE0007319, ucf:52118
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007319
- Title
- Systems Geometry: A Methodology for Analyzing Emergent System of Systems Behaviors.
- Creator
-
Bouwens, Christina, Sepulveda, Jose, Karwowski, Waldemar, Xanthopoulos, Petros, Kapucu, Naim, University of Central Florida
- Abstract / Description
-
Recent advancements in technology have led to the increased use of integrated 'systems of systems' (SoS) which link together independently developed and usable capabilities into an integrated system that exhibits new, emergent capabilities. However, the resulting SoS is often not well understood, where secondary and tertiary effects of tying systems together are often unpredictable and present severe consequences. The complexities of the composed system stem not only from system integration, but from a broad range of areas such as the competing objectives of different constituent system stakeholders, mismatched requirements from multiple process models, and architectures and interface approaches that are incompatible on multiple levels. While successful SoS development has proven to be a valuable tool for a wide range of applications, there are significant problems that remain with the development of such systems that need to be addressed during the early stages of engineering development within such environments. The purpose of this research is to define and demonstrate a methodology called Systems Geometry (SG) for analyzing SoS in the early stages of development to identify areas of potential unintended emergent behaviors as candidates for the employment of risk management strategies. SG focuses on three dimensions of interest when planning the development of a SoS: operational, functional, and technical. For Department of Defense (DoD) SoS, the operational dimension addresses the warfighter environment and includes characteristics such as mission threads and related command and control or simulation activities required to support the mission. The functional dimension highlights different roles associated with the development and use of the SoS, which could include a participant warfighter using the system, an analyst collecting data for system evaluation, or an infrastructure engineer working to keep the SoS infrastructure operational to support the users. Each dimension can be analyzed to understand roles, interfaces and activities. Cross-dimensional effects are of particular interest since such effects are less detectable and generally not addressed with conventional systems engineering (SE) methods. The literature review and the results of this study have identified key characteristics or dimensions that should be examined during SoS analysis and design. Although many methods exist for exploring system dimensions, there is a gap in techniques to explore cross-dimensional interactions and their effect on emergent SoS behaviors. The study has resulted in a methodology for capturing dimensional information and recommended analytical methods for intra-dimensional as well as cross-dimensional analysis. A problem-based approach to the system analysis is recommended combined with the application of matrix methods, network analysis and modeling techniques to provide intra- and cross-dimensional insight. The results of this research are applicable to a variety of socio-technical SoS analyses with applications in analysis, experimentation, test and evaluation and training.
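To make the matrix/network-analysis idea in the abstract above concrete, here is a minimal, hand-rolled sketch that flags cross-dimensional dependencies in a toy SoS description. The element names, dimension labels, and edges are invented placeholders, not content from the Systems Geometry methodology itself.

```python
# Hypothetical sketch of cross-dimensional coupling analysis for an SoS.
# Elements, dimensions, and dependencies below are invented for illustration.

dimension = {
    "mission_thread": "operational",
    "data_analyst":   "functional",
    "sim_gateway":    "technical",
    "c2_activity":    "operational",
    "network_link":   "technical",
}

# Directed dependencies between elements (who relies on whom).
edges = [
    ("mission_thread", "c2_activity"),
    ("c2_activity", "sim_gateway"),
    ("data_analyst", "sim_gateway"),
    ("sim_gateway", "network_link"),
]

# Cross-dimensional edges are candidates for emergent-behavior risk review,
# since conventional SE methods tend to analyze each dimension in isolation.
cross_dimensional = [(a, b) for a, b in edges if dimension[a] != dimension[b]]

for a, b in cross_dimensional:
    print(f"{a} ({dimension[a]}) -> {b} ({dimension[b]})")
```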
- Date Issued
- 2013
- Identifier
- CFE0005135, ucf:50696
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005135
- Title
- A New Multidimensional Psycho-Physical Framework for Modeling Car-Following in a Freeway Work Zone.
- Creator
-
Lochrane, Taylor, Al-Deek, Haitham, Radwan, Essam, Oloufa, Amr, Harb, Rami, Uddin, Nizam, University of Central Florida
- Abstract / Description
-
As the United States continues to build and repair the aging highway infrastructure, the bearing of freeway work zones will continue to impact the capacity. To predict the capacity of a freeway work zone, there are several tools available for engineers to evaluate these work zones, but only microsimulation has the ability to simulate driver behavior. One of the limitations of current car-following models is that they only account for one overall behavioral condition. This dissertation hypothesizes that drivers change their driving behavior as they drive through a freeway work zone compared to normal freeway conditions, which has the potential to impact traffic operations and capacity of work zones. Psycho-physical car-following models are widely used in practice for simulating car-following. However, current simulation models may not fully capture car-following driver behavior specific to freeway work zones. This dissertation presents a new multidimensional psycho-physical framework for modeling car-following based on statistical evaluation of work zone and non-work zone driver behavior. This new framework is close in character to the Wiedemann model used in popular traffic simulation software such as VISSIM. This dissertation used two methodologies for collecting data: (1) a questionnaire to collect demographics and work zone behavior data and (2) real-time vehicle data from a field experiment involving human participants. It is hypothesized that the parameters needed to calibrate the multidimensional framework for work zone driver behavior can be derived statistically by using data collected from runs of an Instrumented Research Vehicle (IRV) in a Living Laboratory (LL) along a roadway. The design of this LL included the development of an Instrumented Research Vehicle (IRV) to capture the natural car-following response of a driver when entering and passing through a freeway work zone. The development of a Connected Mobile Traffic Sensing (CMTS) system, which included state-of-the-art ITS technologies, supports the LL environment by providing the connectivity, interoperability and data processing of the natural, real-life setting. The IRV and CMTS system are tools designed to support the concept of a LL which facilitates the experimental environment to capture and calibrate natural driver behavior. The objective is to have these participants drive the instrumented vehicle and collect the relative distance and the relative velocity between the instrumented vehicle and the vehicle in front of it. A Phase I pilot test was conducted with 10 participants to evaluate the experiment and make any adjustments prior to the full Phase II driver test. The Phase II driver test recruited a group of 64 participants to drive the IRV through an LL set up along a work zone on I-95 near Washington, D.C. in order to validate this hypothesis. In this dissertation, a new framework was applied and it demonstrated that there are four different categories of car-following behavior models, each with different parameter distributions. The four categories are divided by traffic condition (congested vs. uncongested) and by roadway condition (work zone vs. non-work zone). The calibrated threshold values are presented for each of these four categories.
By applying this new multidimensional framework, modeling of car-following behavior can enhance vehicle behavior in microsimulation modeling. This dissertation also explored driver behavior through combining vehicle data and survey techniques to augment the model calibrations to improve the understanding of car-following behavior in freeway work zones. The results identify a set of survey questions that can potentially guide the selection of parameters for car-following models. The findings presented in this dissertation can be used to improve the performance of driver behavior models specific to work zones. This in turn will more accurately forecast the impact a work zone design has on capacity during congestion.
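As a rough, hypothetical illustration of the psycho-physical (Wiedemann-style) logic the framework above builds on, the sketch below classifies one leader/follower pair into a driving regime from gap and closing speed. The threshold values and names are invented placeholders, not the calibrated work zone parameters reported in the dissertation.

```python
# Hypothetical Wiedemann-style regime classification.
# Threshold values below are illustrative only, not calibrated work-zone parameters.

def classify_regime(gap_m: float, closing_speed_mps: float,
                    desired_gap_m: float = 30.0,
                    perception_speed_mps: float = 0.5,
                    emergency_gap_m: float = 8.0) -> str:
    """Classify the car-following regime for one leader/follower pair."""
    if gap_m < emergency_gap_m:
        return "emergency braking"
    if gap_m > desired_gap_m and closing_speed_mps < perception_speed_mps:
        return "free driving"          # leader too far away to influence the driver
    if closing_speed_mps >= perception_speed_mps:
        return "approaching"           # driver perceives closing in on the leader
    return "following"                 # small speed difference inside the desired gap

if __name__ == "__main__":
    # gap = leader position - follower position; closing speed = follower speed - leader speed
    print(classify_regime(gap_m=25.0, closing_speed_mps=1.2))   # approaching
    print(classify_regime(gap_m=45.0, closing_speed_mps=-0.3))  # free driving
```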
- Date Issued
- 2014
- Identifier
- CFE0005521, ucf:50326
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005521
- Title
- A LIFE CYCLE SOFTWARE QUALITY MODEL USING BAYESIAN BELIEF NETWORKS.
- Creator
-
Beaver, Justin, Schiavone, Guy, University of Central Florida
- Abstract / Description
-
Software practitioners lack a consistent approach to assessing and predicting quality within their products. This research proposes a software quality model that accounts for the influences of development team skill/experience, process maturity, and problem complexity throughout the software engineering life cycle. The model is structured using Bayesian Belief Networks and, unlike previous efforts, uses widely-accepted software engineering standards and in-use industry techniques to quantify the indicators and measures of software quality. Data from 28 software engineering projects was acquired for this study, and was used for validation and comparison of the presented software quality models. Three Bayesian model structures are explored and the structure with the highest performance in terms of accuracy of fit and predictive validity is reported. In addition, the Bayesian Belief Networks are compared to both Least Squares Regression and Neural Networks in order to identify the technique best suited to modeling software product quality. The results indicate that Bayesian Belief Networks outperform both Least Squares Regression and Neural Networks in terms of producing modeled software quality variables that fit the distribution of actual software quality values, and in accurately forecasting 25 different indicators of software quality. Between the Bayesian model structures, the simplest structure, which relates software quality variables to their correlated causal factors, was found to be the most effective in modeling software quality. In addition, the results reveal that the collective skill and experience of the development team, over process maturity or problem complexity, has the most significant impact on the quality of software products.
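To make the Bayesian Belief Network idea concrete, below is a minimal, hand-rolled sketch with two binary causal factors (team skill and problem complexity) influencing a binary quality node, with inference by enumeration. The structure and probabilities are invented for illustration and are not the validated model from the study.

```python
# Minimal discrete Bayesian network sketch: quality depends on team skill and
# problem complexity. All probabilities are illustrative assumptions.

p_skill_high = 0.6
p_complexity_high = 0.4

# P(quality = good | skill, complexity)
p_good_given = {
    (True,  True):  0.70,
    (True,  False): 0.90,
    (False, True):  0.30,
    (False, False): 0.55,
}

def posterior_skill_given_good_quality() -> float:
    """P(skill = high | quality = good) by full enumeration over the network."""
    joint_good_and_skill = 0.0
    joint_good = 0.0
    for skill in (True, False):
        for complexity in (True, False):
            prior = (p_skill_high if skill else 1 - p_skill_high) * \
                    (p_complexity_high if complexity else 1 - p_complexity_high)
            joint = prior * p_good_given[(skill, complexity)]
            joint_good += joint
            if skill:
                joint_good_and_skill += joint
    return joint_good_and_skill / joint_good

print(f"P(high skill | good quality) = {posterior_skill_given_good_quality():.3f}")
```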
- Date Issued
- 2006
- Identifier
- CFE0001367, ucf:46993
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001367
- Title
- THE EXPLORATION OF ROTATING DETONATION DYNAMICS INCORPORATING A COAL-BASED FUEL MIXTURE.
- Creator
-
Rogan, John P., Ahmed, Kareem, Bhattacharya, Samik, University of Central Florida
- Abstract / Description
-
This investigation explores the detonation dynamics of a rotating detonation engine (RDE). Beginning with the general understanding and characteristics of hydrogen and compressed air as a detonation fuel source, this study further develops the experimental approach to incorporating a coal-based fuel mixture in an RDE. There is insufficient prior research investigating the use of coal as part of a fuel mixture, and little progress has been made toward improving thermal efficiency with deflagration. The U.S. Department of Energy's Office of Fossil Energy awarded the Propulsion and Energy Research Laboratory at the University of Central Florida a grant to lead the investigation on the feasibility of using a coal-based fuel mixture to power rotating detonation engines. Through this study, the developmental and experimental understanding of RDEs has been documented, operability maps have been plotted, and the use of a coal-based fuel mixture in an RDE has been explored. The operability of hydrogen and compressed air has been found, a normalization of all operable space has been developed, and there is evidence indicating coal can be used as part of a fuel mixture to detonate an RDE. The study will continue to investigate coal's use in an RDE. As the most abundant fossil fuel on earth, coal is a popular fuel source in deflagrative combustion for electrical power generation. This study investigates how the combustion of coal can become significantly more efficient.
- Date Issued
- 2018
- Identifier
- CFH2000437, ucf:45741
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH2000437
- Title
- TISSUE ENGINEERED MYELINATION AND THE STRETCH REFLEX ARC SENSORY CIRCUIT: DEFINED MEDIUM FORMULATION, INTERFACE DESIGN AND MICROFABRICATION.
- Creator
-
Rumsey, John, Hickman, James, University of Central Florida
- Abstract / Description
-
The overall focus of this research project was to develop an in vitro tissue-engineered system that accurately reproduced the physiology of the sensory elements of the stretch reflex arc as well as engineer the myelination of neurons in the systems. In order to achieve this goal we hypothesized that myelinating culture systems, intrafusal muscle fibers and the sensory circuit of the stretch reflex arc could be bioengineered using serum-free medium formulations, growth substrate interface design and microfabrication technology. The monosynaptic stretch reflex arc is formed by a direct synapse between motoneurons and sensory neurons and is one of the fundamental circuits involved in motor control. The circuit serves as a proprioceptive feedback system, relaying information about muscle length and stretch to the central nervous system (CNS). It is composed of four elements, which are split into two circuits. The efferent or motor circuit is composed of an α-motoneuron and the extrafusal skeletal muscle fibers it innervates, while the afferent or sensory circuit is composed of a Ia sensory neuron and a muscle spindle. Structurally, the two muscular units are aligned in parallel, which plays a critical role modulating the system's performance. Functionally, the circuit acts to maintain appropriate muscle length during activities as diverse as eye movement, respiration, locomotion, fine motor control and posture maintenance. Myelination of the axons of the neuronal system is a vertebrate adaptation that enables rapid conduction of action potentials without a commensurate increase in axon diameter. In vitro neuronal systems that reproduce these effects would provide a unique modality to study factors influencing sensory neuronal deficits, neuropathic pain, myelination and diseases associated with myelination. In this dissertation, results are presented for defined in vitro culture conditions resulting in myelination of motoneurons by Schwann cells, pattern-controlled myelination of sensory neurons, intrafusal fiber formation, patterned assembly of the mechanosensory complex, and integration of the complex on bio-MEMS cantilever devices. Using these systems, the stretch-sensitive sodium channel BNaC1 and the structural protein PICK1 were identified localized at the sensory neuron terminals associated with the intrafusal fibers, as were the Ca2+ waves associated with sensory neuron electrical activity upon intrafusal fiber stretch on MEMS cantilevers. The knowledge gained through these multi-disciplinary approaches could lead to insights for spasticity-inducing diseases like Parkinson's, demyelinating diseases and spinal cord injury repair. These engineered systems also have application in high-throughput drug discovery. Furthermore, the use of biomechanical systems could lead to improved fine motor control for tissue-engineered prosthetic devices.
- Date Issued
- 2009
- Identifier
- CFE0002904, ucf:48013
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002904
- Title
- Role of Kruppel-like Factor 8 (KLF8) in Cancer and Cardiomyopathy.
- Creator
-
Lahiri, Satadru, Zhao, Jihe, Parthasarathy, Sampath, Masternak, Michal, Siddiqi, Shadab, University of Central Florida
- Abstract / Description
-
Cancer and cardiovascular diseases are two of the most fatal diseases, causing innumerable deaths each year. Understanding the mechanisms underlying these diseases is critical for developing proper therapeutic approaches. Krüppel-like factor 8 (KLF8) is a member of the Krüppel-like family of transcription factors that is overexpressed in many types of cancers. There is no report on the role of KLF8 in cardiovascular diseases to date. KLF8 transcriptionally activates or represses a host of target genes to promote cancer cell proliferation, migration, invasion and epithelial to mesenchymal transition during tumor progression. Studies proposed in this thesis identified a novel posttranslational modification of KLF8 essential for its role in promoting cancer cell migration and discovered a novel function of KLF8 in cardiomyopathy. In our first study, we identified serine 48 (S48) as a novel phosphorylation site on KLF8. Pharmacological and genetic manipulations of various potential kinases further revealed ERK2 as the kinase responsible for this novel phosphorylation. Functional studies indicated that this phosphorylation is crucial for protecting KLF8 protein from degradation in the nucleus and promoting cancer cell migration. Preclinical xenograft models have indicated an important role of KLF8 in tumor progression. To investigate the role of KLF8 in spontaneous tumorigenesis, better recapitulating the pathology in patients, we established the first Cre-regulated conditional KLF8 transgenic mouse model. Upon induction of global expression of the KLF8 transgene, spontaneous mammary and testicular tumors were formed in a small population of the mice by their mid-age, as expected considering the long latency required for tumor progression. Surprisingly, however, nearly 100% of the KLF8 mice died with a significantly enlarged heart, which did not occur in any littermate control mouse. Further characterization of the mice revealed that the global expression of the transgene caused striking systolic dysfunction leading to fatal dilated cardiomyopathy. Importantly, these phenotypes were reproduced in heart-specific KLF8 transgenic mice. A cardiovascular disease PCR array identified a number of genes potentially mediating KLF8-induced cardiac pathology. These results identified a previously unimagined function of KLF8 in the heart, shed new light on the mechanisms of cardiac diseases and provide novel preclinical mouse models for future translational research.
- Date Issued
- 2016
- Identifier
- CFE0006692, ucf:51914
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006692
- Title
- Macroscopic Crash Analysis and Its Implications for Transportation Safety Planning.
- Creator
-
Siddiqui, Chowdhury, Abdel-Aty, Mohamed, Abdel-Aty, Mohamed, Uddin, Nizam, Huang, Helai, University of Central Florida
- Abstract / Description
-
Incorporating safety into the transportation planning stage, often termed transportation safety planning (TSP), relies on the vital interplay between zone characteristics and zonal traffic crashes. Although a few safety studies have made some effort towards integrating safety and planning, several unresolved problems and a complete framework of TSP are still absent in the literature. This research aims at examining the suitability of the current traffic-related zoning planning process in a new suggested planning method which incorporates safety measures. In order to accomplish this broader research goal, the study defined its research objectives in the following directions towards establishing a framework of TSP: i) exploring the existing key determinants in traditional transportation planning (e.g., trip generation/distribution data, land use types, demographics, etc.) in order to develop an effective and efficient TSP framework, ii) investigation of the Modifiable Areal Unit Problem (MAUP) in the context of macro-level crash modeling to investigate the effect of the zone's size and boundary, iii) understanding neighborhood influence of the crashes at or near zonal boundaries, and iv) development of crash-specific safety measures in the four-step transportation planning process. This research was conducted using spatial data from the counties of West Central Florida. Analysis of different crash data per spatial unit was performed using nonparametric approaches (e.g., data mining and random forest), classical statistical methods (e.g., negative binomial models), and Bayesian statistical techniques. In addition, comprehensive Geographic Information System (GIS) based application tools were utilized for spatial data analysis and representation. Exploring the significant variables related to specific types of crashes is vital in the planning stages of a transportation network. This study identified and examined important variables associated with total crashes and severe crashes per traffic analysis zone (TAZ) by applying nonparametric statistical techniques using different trip related variables and road-traffic related factors. Since a macro-level analysis, by definition, will necessarily involve aggregating crashes per spatial unit, a spatial dependence or autocorrelation may arise if a particular variable of a geographic region is affected by the same variable of the neighboring regions. So far, few safety studies have examined crashes at TAZs, and none of them explicitly considered the spatial effect of crashes occurring in them. In order to obtain a clear picture of the spatial autocorrelation of crashes, this study investigated the effect of spatial autocorrelation in modeling pedestrian and bicycle crashes in TAZs. Additionally, this study examined pedestrian crashes at Environmental Justice (EJ) TAZs which were identified in compliance with the various ongoing practices undertaken by Metropolitan Planning Organizations (MPOs) and previous research. Minority population and the low-income group are two important criteria based on which EJ areas are identified. These unique areal characteristics have been of particular interest to traffic safety analysts in order to investigate the contributing factors of pedestrian crashes in these deprived areas. Pedestrian and bicycle crashes were estimated as a function of variables related to roadway characteristics, and various demographic and socio-economic factors.
It was found that significant differences are present between the predictor sets for pedestrian and bicycle crashes. In all cases the models with spatial correlation performed better than the models that did not account for spatial correlation among TAZs. This finding implied that spatial correlation should be considered while modeling pedestrian and bicycle crashes at the aggregate or macro-level. Also, the significance of spatial autocorrelation was later found in the total and severe crash analyses and accounted for in their respective modeling techniques. Since the study found affirmative evidence about the inclusion of spatial autocorrelation in the safety performance functions, this research considered identifying an appropriate spatial entity based on which the TSP framework would be developed. A wide array of spatial units has been explored in macro-level crash modeling in previous safety research. With the advancement of GIS, safety analysts are able to analyze crashes for various geographical units. However, a clear guideline on which geographic entity a modeler should choose has not been established so far. This preference of spatial unit can vary with the dependent variable of the model. Or, for a specific dependent variable, models may be invariant to multiple spatial units by producing similar goodness-of-fit. This problem is closely related to the Modifiable Areal Unit Problem, which is a common issue in spatial data analysis. The study investigated three different crash (total, severe, and pedestrian) models developed for TAZs, block groups (BGs) and census tracts (CTs) using various roadway characteristics and census variables (e.g., land use, socio-economic, etc.), and compared them based on multiple goodness-of-fit measures. Based on MAD and MSPE, it was evident that the total, severe and pedestrian crash models for TAZs and BGs had similar fits, and better than the ones developed for CTs. This indicated that the total, severe and pedestrian crash models are being affected by the size of the spatial units rather than their zoning configurations. So far, TAZs have been the base spatial units of analyses for developing travel demand models. Metropolitan planning organizations widely use TAZs in developing their long range transportation plans (LRTPs). Therefore, considering the practical application, it was concluded that as a geographical unit, TAZs had a relative ascendancy over block groups and census tracts. Once TAZs were selected as the base spatial unit of the TSP framework, careful inspections of the TAZ delineations were performed. Traffic analysis zones are often delineated by the existing street network. This may result in a considerable number of crashes on or near zonal boundaries. While the traditional macro-level crash modeling approach assigns zonal attributes to all crashes that occur within the zonal boundary, this research acknowledged the inaccuracy resulting from relating crashes on or near the boundary of the zone to merely the attributes of that zone. A novel approach was proposed to account for the spatial influence of the neighboring zones on crashes which specifically occur on or near the zonal boundaries. Predictive models for pedestrian crashes per zone were developed using a hierarchical Bayesian framework and utilized separate predictor sets for boundary and interior (non-boundary) crashes.
It was found that these models (that account for boundary and interior crashes separately) had better goodness-of-fit measures compared to the models which had no specific consideration for crashes located at/near the zone boundaries. Additionally, the models were able to capture some unique predictors associated explicitly with interior and boundary-related crashes. For example, the variables 'total roadway length with 35 mph posted speed limit' and 'long-term parking cost' were not statistically significantly different from zero in the interior crash model, but they were significantly different from zero at the 95% level in the boundary crash model. Although adjacent traffic analysis zones (a single layer) were defined for pedestrian crashes and boundary pedestrian crashes were modeled based on the characteristic factors of these adjacent zones, this was not considered reasonable for bicycle-related crashes, as the average roaming area of bicyclists is usually greater than that of pedestrians. For smaller TAZs, it is sometimes possible for a bicyclist to cross the entire TAZ. To account for this greater area of coverage, boundary bicycle crashes were modeled based on two layers of adjacent zones. As observed from the goodness-of-fit measures, the performance of the model considering single-layer variables and the model considering two-layer variables was superior to the models that did not consider layering at all, but the two layered models were comparable with each other. Motor vehicle crashes (total and severe crashes) were classified as 'on-system' and 'off-system' crashes, and two sub-models were fitted in order to calibrate the safety performance function for these crashes. On-system and off-system roads refer to two different roadway hierarchies. On-system or state-maintained roads typically have higher speed limits and carry traffic from distant TAZs. Off-system roads are, however, mostly local roads with relatively low speed limits. Due to these distinct characteristics, on-system crashes were modeled with only the population and total employment variables of a zone in addition to the roadway and traffic variables, and all other zonal variables were disregarded. For off-system crashes, on the contrary, all zonal variables were considered. It was evident by comparing this on- and off-system sub-model framework to the other candidate models that it provided superior goodness-of-fit for both total and severe crashes. Based on the safety performance functions developed for pedestrian, bicycle, total and severe crashes, the study proposed a novel and complete framework for assessing safety (of these crash types) simultaneously, in parallel with the four-step transportation planning process, with no additional data requirements on the practitioners' side.
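As a rough, hypothetical illustration of the kind of zone-level safety performance function and goodness-of-fit comparison described above, the sketch below fits a negative binomial GLM to simulated crash counts and computes MAD and MSPE. The covariates, coefficients, and dispersion value are invented placeholders, not the dissertation's calibrated models.

```python
# Hypothetical sketch: zone-level negative binomial safety performance function
# on simulated data, evaluated with MAD and MSPE. Variable names and values are
# illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_zones = 200
vmt = rng.uniform(1, 10, n_zones)            # vehicle-miles-traveled proxy per zone
pop_density = rng.uniform(0.5, 5, n_zones)   # population density proxy per zone

# Simulate overdispersed crash counts with a gamma-mixed Poisson process.
mu = np.exp(0.2 + 0.25 * vmt + 0.15 * pop_density)
crashes = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n_zones))

X = sm.add_constant(np.column_stack([vmt, pop_density]))
model = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
pred = model.predict(X)

mad = np.mean(np.abs(crashes - pred))      # mean absolute deviation
mspe = np.mean((crashes - pred) ** 2)      # mean squared prediction error
print(model.params)
print(f"MAD={mad:.2f}  MSPE={mspe:.2f}")
```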
- Date Issued
- 2012
- Identifier
- CFE0004191, ucf:49009
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004191
- Title
- Development of a Cognitive Work Analysis Framework Tutorial Using Systems Modeling Language.
- Creator
-
Wells, Wilfred, Karwowski, Waldemar, Williams, Kent, Sala-Diakanda, Serge, Elshennawy, Ahmad, Ahram, Tareq, University of Central Florida
- Abstract / Description
-
At the present time, most systems engineers do not have access to cognitive work analysis information or training in terms they can understand. This may lead to a disregard of the cognitive aspect of system design. The impact of this issue is system requirements that do not account for the cognitive strengths and limitations of users. Systems engineers cannot design effective decision support systems without defining cognitive work requirements. In order to improve system requirements, integration of cognitive work requirements into the systems engineering process has to be improved. One option to address this gap is the development of a Cognitive Work Analysis (CWA) framework using Systems Modeling Language (SysML). The study had two phases. The first involved aligning the CWA terminology with the SysML to produce a CWA framework using SysML. The second was the creation of an instruction using SysML to inform systems engineers of the process of integrating cognitive work requirements into the systems engineering process. This methodology provides a structured framework to define, manage, organize, and model cognitive work requirements. Additionally, it provides a tool for systems engineers to use in system design which supports a user's cognitive functions, such as situational awareness, problem solving, and decision making.
- Date Issued
- 2011
- Identifier
- CFE0004177, ucf:49079
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004177
- Title
- Critical Success Factors for Evolutionary Acquisition Implementation.
- Creator
-
Bjorn, Brig, Kotnour, Timothy, Karwowski, Waldemar, Mollaghasemi, Mansooreh, Farr, John, University of Central Florida
- Abstract / Description
-
Due to extensive challenges to the efficient development and fielding of operationally effective and affordable weapon systems, the U.S. employs a complex management framework to govern defense acquisition programs. The Department of Defense and Congress recently modified this process to improve the levels of knowledge available at key decision points in order to reduce lifecycle cost, schedule, and technical risk to programs. This exploratory research study employed multiple methods to examine the impact of systems engineering reviews, competitive prototyping, and the application of a Modular Open Systems Approach on knowledge and risk prior to funding system implementation and production. In-depth case studies of two recent Major Defense Acquisition Programs were conducted to verify the existence and relationships of the proposed constructs and identify potential barriers to program success introduced by the new process. The case studies included program documentation analysis as well as interviews with contractor personnel holding multiple roles on the program. A questionnaire-based survey of contractor personnel from a larger set of programs was executed to test the case study findings against a larger data set. The study results indicate that while some changes adversely affected program risk levels, the recent modifications to the acquisition process generally had a positive impact on levels of critical knowledge at the key Milestone B decision point. Based on the results of this study it is recommended that the Government improve its ability to communicate with contractors during competitive phases, particularly with regard to requirements management, and establish verifiable criteria for compliance with the Modular Open Systems Approach. Additionally, the Government should clarify the intent of competitive prototyping and develop a strategy to better manage the inevitable gaps between program phases. Contractors are recommended to present more requirements trade-offs and focus less on prototype development during the Technology Development phases of programs. The results of this study may be used by policy makers to shape future acquisition reforms; by Government personnel to improve the implementation of the current regulations; and by contractors to shape strategies and processes for more effective system development. This research may be used by the Government to improve the execution of acquisition programs under this new paradigm. The defense industrial base can use this research to better understand the impacts of the new process and improve strategic planning processes. The research methodology may be applied to new and different types of programs to assess improvement in the execution process over time.
- Date Issued
- 2012
- Identifier
- CFE0004358, ucf:49442
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004358
- Title
- The effects of altered traffic signs upon vehicular driving modes and consequent fuel conservation and environmental benefits, as measured by vehicular noise-imprints.
- Creator
-
Pfarrer, Mark Daniel, Engineering
- Abstract / Description
-
Florida Technological University College of Engineering Thesis; The hypothesis is that the recorded noise-imprints of a vehicle at an intersection can be used to identify and accurately time the driving modes of deceleration, idle, slow cruise, and acceleration. This is proven by analyzing and comparing noise-imprints of vehicles at an uncontrolled intersection marked first with a 'stop' sign, and then by a 'yield' and an experimental 'dead slow' sign. By relating the duration of each driving mode to known relations, the overall efficiency of an intersection can be characterized. A new technique for studying various types of traffic conditions at intersections is the result. Initial noise-imprint analysis and comparison shows that a 'yield' sign is to be preferred over a 'stop' sign to decrease travel time, air pollution emissions, gasoline consumption, and wear-and-tear on the car. The experimental 'dead slow' sign is used as a demonstration of the noise-imprint technique upon an unknown situation. The efficiency of a 'dead slow' sign proved to be less than that of a 'yield' sign, but still greater than that of a 'stop' sign.
- Date Issued
- 1976
- Identifier
- CFR0003525, ucf:52986
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0003525
- Title
- Model-Based Systems Engineering Approach to Distributed and Hybrid Simulation Systems.
- Creator
-
Pastrana, John, Rabelo, Luis, Lee, Gene, Elshennawy, Ahmad, Kincaid, John, University of Central Florida
- Abstract / Description
-
INCOSE defines Model-Based Systems Engineering (MBSE) as "the formalized application of modeling to support system requirements, design, analysis, verification, and validation activities beginning in the conceptual design phase and continuing throughout development and later life cycle phases." One very important development is the utilization of MBSE to develop distributed and hybrid (discrete-continuous) simulation modeling systems. MBSE can help to describe the systems to be modeled and help make the right decisions and partitions to tame complexity. The ability to embrace conceptual modeling and interoperability techniques during systems specification and design presents a great advantage in distributed and hybrid simulation systems development efforts. Our research is aimed at the definition of a methodological framework that uses MBSE languages, methods and tools for the development of these simulation systems. A model-based composition approach is defined at the initial steps to identify distributed systems interoperability requirements and hybrid simulation systems characteristics. Guidelines are developed to adopt simulation interoperability standards and conceptual modeling techniques using MBSE methods and tools. Domain specific system complexity and behavior can be captured with model-based approaches during the system architecture and functional design requirements definition. MBSE can allow simulation engineers to formally model different aspects of a problem ranging from architectures to corresponding behavioral analysis, to functional decompositions and user requirements (Jobe, 2008).
- Date Issued
- 2014
- Identifier
- CFE0005395, ucf:50464
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005395
- Title
- An Improved Biosolid Gasifier Model.
- Creator
-
McLean, Hannah, Cooper, David, Randall, Andrew, Lee, Woo Hyoung, University of Central Florida
- Abstract / Description
-
As populations increase and cities become denser, the production of waste, both sewage sludge and food biomass, increases exponentially while disposal options for these wastes are limited. Landfills have minimal space for biosolids; countries are now banning ocean disposal methods for fear of the negative environmental impacts. Agricultural application of biosolids cannot keep up with the production rates because of the accumulation of heavy metals in the soils. Gasification can convert biosolids into a renewable energy source that can reduce the amount of waste heading to the landfills and reduce our dependence on fossil fuels. A recently published chemical kinetic computer model for a fluidized-bed sewage sludge gasifier (Champion, Cooper, Mackie, & Cairney, 2014) was improved in this work based on limited experimental results obtained from a bubbling fluidized-bed sewage sludge gasifier at the MaxWest facility in Sanford, Florida and published information from the technical literature. The gasifier processed sewage sludge from the communities surrounding Sanford and was operated at various air equivalence ratios and biosolid feed rates. The temperature profile inside of the gasifier was recorded over the span of four months, and an average profile was used in the base case scenario. The improved model gave reasonable predictions of the axial bed temperature profile, syngas composition, heating value of the syngas, gas flow rate, and carbon conversion. The model was validated by comparing the simulation temperature profile data with the measured temperature profile data. An overall heat loss coefficient was calculated for the gasification unit to provide a more accurate energy balance. Once the model was equipped with a heat loss coefficient, the output syngas temperature closely matched the operational data from the MaxWest facility. The model was exercised at a constant equivalence ratio at varying temperatures, and again using a constant temperature with varying equivalence ratios. The resulting syngas compositions from these exercises were compared to various literature sources. It was decided that some of the reaction kinetics needed to be adjusted so that the change in syngas concentration versus change in bed temperature would more closely match the literature. The reaction kinetics for the Water-Gas Shift and Boudouard reactions were modified back to their original values previously obtained from the literature.
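To illustrate the overall heat-loss term mentioned above, here is a minimal sketch of the form Q_loss = U * A * (T_bed - T_ambient) and its approximate effect on the predicted syngas outlet temperature. The coefficient, area, flow, and temperature values are illustrative assumptions, not the values calibrated against the MaxWest facility data.

```python
# Hypothetical sketch of the overall heat-loss term in a gasifier energy balance.
# All numeric inputs below are illustrative placeholders.

def heat_loss_kw(u_w_per_m2k: float, area_m2: float,
                 t_bed_c: float, t_ambient_c: float) -> float:
    """Overall heat loss through the gasifier shell, in kW."""
    return u_w_per_m2k * area_m2 * (t_bed_c - t_ambient_c) / 1000.0

def syngas_outlet_temp_c(t_bed_c: float, q_loss_kw: float,
                         gas_flow_kg_per_s: float, cp_kj_per_kgk: float) -> float:
    """Approximate syngas temperature drop attributable to shell losses."""
    delta_t = q_loss_kw / (gas_flow_kg_per_s * cp_kj_per_kgk)
    return t_bed_c - delta_t

if __name__ == "__main__":
    q = heat_loss_kw(u_w_per_m2k=2.5, area_m2=12.0, t_bed_c=750.0, t_ambient_c=30.0)
    print(f"Q_loss = {q:.1f} kW")
    print(f"approx. outlet T = {syngas_outlet_temp_c(750.0, q, 0.8, 1.3):.0f} C")
```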
- Date Issued
- 2015
- Identifier
- CFE0005663, ucf:50199
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005663
- Title
- Optimization of Block Layout and Evaluation of Collection Mat Materials for Polyacrylamide Treatment Channels.
- Creator
-
McDougal, Alicia, Chopra, Manoj, Nam, Boo Hyun, Wang, Dingbao, University of Central Florida
- Abstract / Description
-
Construction sites are frequently cited as major sources of pollution that degrade the quality of surface water. The highly erodible topsoil is transported off site by stormwater runoff causing negative effects downstream. Research has shown that the small particles, which are the most susceptible to erosive forces, have more pollutants associated with them than larger soil particles. Currently, in the state of Florida, it is not permissible to discharge water to a receiving water body if the turbidity is more than 29 Nephelometric Turbidity Units (NTUs) above background or higher than background for an outstanding Florida water body. The removal of fine suspended sediment from water can be achieved by filtration, settling, and the use of chemical coagulants. Polyacrylamide (PAM), a coagulant, has been shown to be effective in removing fine suspended particles from water via coagulation and flocculation. The Stormwater Management Academy at the University of Central Florida has researched the use of PAM and collection mats in a treatment channel to meet state discharge requirements. In this study, turbid water using sediment from typical Florida soils was simulated and passed through a channel. The channel contained polymer blocks in a configuration previously determined to be the most effective. An important component of the treatment system is the floc collection. This research examined three types of collection mats, namely jute, coconut fiber and polypropylene mix, to collect the flocs. This thesis presents the results of this investigation. The results for the sandy soil tests showed an average removal efficiency prior to the collection mat starting at 71% and decreasing to 44% at the end of the tests. The 20-foot coconut mat maintained an average removal efficiency of 90%. The turbidity due to silty-sandy soil was decreased with an average removal efficiency prior to the collection mat ranging from 50% to 65%. The average removal efficiency for the 20-foot coconut mat started at 85% and decreased to 60% during the tests. The turbidity due to crushed limestone showed an average removal efficiency prior to the collection mat ranging from 81% down to 69% over time. The average results from the 20-foot coconut mat ranged from 65% to 80%. Turbidity was tested on the samples under two conditions: a 30-second settling time and completely mixed. Statistical results show a significant decrease (α=0.05) in turbidity between the mixed and settled samples. Statistical analyses were performed on the collected data, which concluded that the capability of the mat to reduce turbidity can be repeated with a 95% confidence interval. The 20-foot length coconut mat had the highest turbidity removal efficiency for every soil type examined. Further statistical analysis showed that the achieved turbidity reduction was significantly different (α=0.05) for the various materials. It was observed that, generally, each type of mat clogged during testing, indicating that longer collection mats should be used, possibly lining the entire channel. Recommendations from this study are to provide a settling area after the collection mats and line the entire length of the channel with the collection mat selected.
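As a small, hypothetical illustration of the calculations behind the removal efficiencies and the α = 0.05 comparison described above, the sketch below computes removal efficiency from influent/effluent turbidity readings and runs a paired t-test between settled and fully mixed samples. The NTU values are made up for illustration, not data from the channel tests.

```python
# Hypothetical sketch: turbidity removal efficiency and a paired t-test
# (alpha = 0.05) between settled and fully mixed samples. NTU values are invented.
import numpy as np
from scipy import stats

influent_ntu = np.array([820.0, 790.0, 905.0, 860.0])
effluent_ntu = np.array([110.0, 95.0, 150.0, 120.0])

removal_pct = 100.0 * (influent_ntu - effluent_ntu) / influent_ntu
print("removal efficiency (%):", np.round(removal_pct, 1))

# Paired comparison: the same effluent sample measured after 30 s of settling
# versus fully mixed.
settled_ntu = np.array([102.0, 88.0, 141.0, 115.0])
mixed_ntu = np.array([118.0, 101.0, 158.0, 129.0])
t_stat, p_value = stats.ttest_rel(settled_ntu, mixed_ntu)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}, significant at 0.05: {p_value < 0.05}")
```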
- Date Issued
- 2014
- Identifier
- CFE0005210, ucf:50628
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005210
- Title
- AN HEDONOMIC EVALUATION OF PLEASURABLE HUMAN-TECHNOLOGY EXPERIENCE: THE EFFECT OF EXPOSURE AND AESTHETICS ON THE EXPERIENCE OF FLOW.
- Creator
-
Murphy, Lauren, Hancock, Peter, University of Central Florida
- Abstract / Description
-
A framework was developed called the Extended Hedonomic Hierarchy (EHH) that provides a basis for evaluating pleasurable human-system experience. Results from a number of experiments within this framework that evaluated specific dimensions of the framework are reported. The 'Exposure' component of the EHH framework and hedonics of the system were investigated to see how changes would affect other dimensions, such as the occurrence of flow, the mode of interaction, and the needs of the user. Simulations and video games were used to investigate how repeated exposure affects flow, interaction mode, and the user needs. The Kansei Engineering method was used to measure user needs and investigate the effect of different hedonic properties of the system on user needs and flow. Findings reveal that: (a) pleasurable human-system experience increases linearly with repeated exposure to the technology of interest; (b) an habituation effect of flow mediated by day; (c) motivation to satisfy human need for technology is hierarchically structured and contributes to pleasurable human-system experience; (d) interactivity is hierarchically structured and seamless mode of interaction is a behavioral outcome of pleasurable human-system experience; (e) there are individual differences among users that affect the likelihood of experiencing pleasurable human-system interaction; (f) performance is positively correlated to flow and (g) the method of kansei engineering provides data from which informed decisions about design can be made and empirical research can be conducted. Suggestions for (a) making Hedonomics a reality in industry, the workplace, and in the field of Human Factors, (b) future research directions for Hedonomics, and (c) principles and guidelines for the practice of Hedonomics are discussed.
- Date Issued
- 2005
- Identifier
- CFE0000875, ucf:46650
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000875
- Title
- A STUDY OF CENTRAL FLORIDA NONROAD VOC AND NOX EMISSIONS AND POTENTIAL ACTIONS TO REDUCE EMISSIONS.
- Creator
-
Radford, Michael, Cooper, C. David, University of Central Florida
- Abstract / Description
-
Ground-level ozone is harmful to the human respiratory system, as well as the environment. The national EPA 8-hour standard for ground-level ozone was reduced from 85 parts per billion (ppb) to 75 ppb in 2008, and trends from previous years show that some of the counties in Central Florida could be in danger of violation. Violation means "nonattainment" status, in which the county is ordered by EPA to develop specific implementation plans to reduce its emissions. The objective of this study was to compile an emissions inventory of volatile organic compounds (VOCs) and nitrogen oxides (NOx) from nonroad equipment in Osceola, Seminole, and Orange Counties (OSO) in Central Florida, and to develop possible action steps to reduce those emissions. This is important because VOC and NOx emissions are precursors to ground-level ozone. Thus, compiling emissions inventories is important to identify high VOC and NOx emitters. Mobile and point sources have long been the highest emitters of VOC and NOx and have therefore been targeted and monitored since the Clean Air Act of 1970, but the nonroad sources (such as construction and lawn equipment) have only been regulated since the 1990s. Using the NONROAD and NMIM modeling programs, the highest nonroad emitters of VOC for Central Florida were found to be lawn/garden equipment and boating equipment, emitting a combined 77% of the total nonroad mobile source VOC. Construction equipment contributed 67% of the total nonroad mobile source emissions of NOx in Central Florida. The components of these categories were also analyzed to find the largest individual sources of VOC and NOx. Of the individual sources, lawn mowers and outboard boat engines were found to be the largest sources of VOCs. Of the NOx sources, all the construction equipment components had a relatively similar level of NOx emissions. Next, action steps were developed to reduce emissions, focusing on the high emitters, along with an estimated cost and feasibility for each measure. Of these steps, implementing a ban on leaf blowers and reducing use of lawn mowers, edgers, trimmers, etc., seemed to be the most effective for reducing VOCs. Although these are effective measures, the cost and feasibility of both pose challenges. The best action step for reducing NOx emissions from construction equipment seemed to be simply reducing idling of equipment on job sites. This also poses challenges in feasibility and enforcement by management. Further, constant on/off cycles could result in decreasing the useful life of the older construction equipment. Finally, a survey was conducted with various construction managers and companies to find out the typical equipment and quantity needed for land clearing/grubbing, as well as the typical use, idling time, and total project time for each piece of equipment on a 10-acre site, under various conditions. The purpose of the study was to develop a rough estimate of the average amount of VOC and NOx emissions that will be produced per acre of land clearing activities, and to estimate the emissions reductions and cost savings if idling of the equipment was reduced.
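As a rough, hypothetical illustration of how a nonroad inventory of this kind is typically computed, the sketch below applies the NONROAD-style relation emissions = population x average power x load factor x annual activity x emission factor. The equipment counts and factors are invented placeholders, not values from the Central Florida inventory.

```python
# Hypothetical NONROAD-style emissions estimate.
# Equipment populations, power ratings, load factors, activity hours, and
# emission factors below are illustrative placeholders only.

GRAMS_PER_TON = 907_184.74  # grams per short ton

def annual_tons(population: int, avg_hp: float, load_factor: float,
                hours_per_year: float, ef_g_per_hp_hr: float) -> float:
    """Annual emissions (short tons) for one equipment category."""
    grams = population * avg_hp * load_factor * hours_per_year * ef_g_per_hp_hr
    return grams / GRAMS_PER_TON

if __name__ == "__main__":
    voc_lawn_mowers = annual_tons(population=50_000, avg_hp=5.0, load_factor=0.33,
                                  hours_per_year=40.0, ef_g_per_hp_hr=10.0)
    nox_excavators = annual_tons(population=1_200, avg_hp=160.0, load_factor=0.59,
                                 hours_per_year=900.0, ef_g_per_hp_hr=4.0)
    print(f"lawn mower VOC ~= {voc_lawn_mowers:.0f} tons/yr")
    print(f"excavator NOx ~= {nox_excavators:.0f} tons/yr")
```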
Show less - Date Issued
- 2009
- Identifier
- CFE0002850, ucf:48064
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002850
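The per-acre emissions estimate described in the abstract above can be illustrated with a short calculation in the general style of the NONROAD methodology, where emissions scale roughly with operating hours, engine power, a load factor, and an emission factor. The sketch below is a minimal, hypothetical illustration: the equipment list, hour counts, load factors, and emission factors are made-up placeholders, not values from the dissertation or from EPA's NONROAD model.

```python
# Hypothetical sketch of a NONROAD-style per-acre emissions estimate for a
# 10-acre land-clearing job. All parameters and emission factors below are
# illustrative placeholders, not figures from the dissertation or from EPA.

from dataclasses import dataclass

@dataclass
class Equipment:
    name: str
    power_hp: float          # rated engine power
    load_factor: float       # fraction of rated power actually used
    hours_on_site: float     # operating hours for the clearing job
    idle_hours: float        # of those hours, time spent idling
    voc_g_per_hp_hr: float   # hypothetical VOC emission factor
    nox_g_per_hp_hr: float   # hypothetical NOx emission factor

def emissions_kg(eq: Equipment, idle_reduction: float = 0.0) -> tuple[float, float]:
    """Return (VOC, NOx) in kg, optionally removing a fraction of idle time.
    Idling is treated at the same average load, a deliberate simplification."""
    effective_hours = eq.hours_on_site - idle_reduction * eq.idle_hours
    hp_hours = eq.power_hp * eq.load_factor * effective_hours
    return (hp_hours * eq.voc_g_per_hp_hr / 1000.0,
            hp_hours * eq.nox_g_per_hp_hr / 1000.0)

fleet = [
    Equipment("dozer",     power_hp=200, load_factor=0.60, hours_on_site=40,
              idle_hours=10, voc_g_per_hp_hr=0.5, nox_g_per_hp_hr=6.0),
    Equipment("excavator", power_hp=150, load_factor=0.55, hours_on_site=35,
              idle_hours=12, voc_g_per_hp_hr=0.5, nox_g_per_hp_hr=5.5),
]

acres = 10.0
for label, cut in [("baseline", 0.0), ("idle reduced 50%", 0.5)]:
    voc = sum(emissions_kg(eq, cut)[0] for eq in fleet)
    nox = sum(emissions_kg(eq, cut)[1] for eq in fleet)
    print(f"{label}: {voc / acres:.2f} kg VOC/acre, {nox / acres:.2f} kg NOx/acre")
```

Comparing the two printed lines gives a rough sense of how an idle-reduction measure would show up as an emissions (and fuel-cost) saving per acre, which mirrors the comparison the survey in the abstract was designed to support.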
- Title
- Video categorization using semantics and semiotics.
- Creator
-
Rasheed, Zeeshan, Shah, Mubarak, Engineering and Computer Science
- Abstract / Description
-
University of Central Florida College of Engineering Thesis; There is a great need to automatically segment, categorize, and annotate video data, and to develop efficient tools for browsing and searching. We believe that the categorization of videos can be achieved by exploring the concepts and meanings of the videos. This task requires bridging the gap between low-level content and high-level concepts (or semantics). Once a relationship is established between the low-level computable features of the video and its semantics, the user would be able to navigate videos by concepts and ideas (for example, extracting only those scenes in an action film that actually contain fights) rather than sequentially browsing the whole video. However, this relationship must follow the norms of human perception and abide by the rules most often followed by the creators (directors) of these videos. These rules are called film grammar in the video production literature. Like any natural language, this grammar has several dialects, but it has been acknowledged to be universal. Therefore, knowledge of film grammar can be exploited effectively for the understanding of films. To interpret an idea using the grammar, we need first to understand the symbols, as in natural languages, and second to understand the rules for combining these symbols to represent concepts. In order to develop algorithms that exploit this film grammar, it is necessary to relate the symbols of the grammar to computable video features. (A minimal sketch of mapping low-level features to a category follows this record.)
- Date Issued
- 2003
- Identifier
- CFR0001717, ucf:52920
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0001717
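One way to read the abstract above is that simple computable features (for example, average shot length or motion content) act as the "symbols" of film grammar and are combined by rules into high-level categories. The sketch below is a hypothetical illustration of that idea only: the feature names, thresholds, and rules are assumptions for illustration, not the features or method used in the dissertation.

```python
# Hypothetical illustration of mapping low-level, computable video features
# ("symbols" of film grammar) to a high-level scene category.
# Feature names, thresholds, and rules are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Shot:
    length_s: float      # shot duration in seconds
    motion: float        # average motion content in [0, 1]
    audio_energy: float  # average loudness in [0, 1]

def categorize_scene(shots: list[Shot]) -> str:
    """Label a scene from simple aggregate statistics of its shots."""
    if not shots:
        return "unknown"
    avg_len = sum(s.length_s for s in shots) / len(shots)
    avg_motion = sum(s.motion for s in shots) / len(shots)
    avg_audio = sum(s.audio_energy for s in shots) / len(shots)
    # Rapid cutting, high motion, and loud audio are a common action signature.
    if avg_len < 3.0 and avg_motion > 0.6 and avg_audio > 0.5:
        return "action"
    # Long, relatively static shots often indicate dialogue.
    if avg_len > 6.0 and avg_motion < 0.3:
        return "dialogue"
    return "other"

scene = [Shot(1.8, 0.8, 0.7), Shot(2.1, 0.7, 0.6), Shot(1.5, 0.9, 0.8)]
print(categorize_scene(scene))  # -> "action"
```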
- Title
- Larger-first partial parsing.
- Creator
-
Van Delden, Sebastian Alexander, Gomez, Fernando, Engineering and Computer Science
- Abstract / Description
-
University of Central Florida College of Engineering Thesis; Larger-first partial parsing is a primarily top-down approach to partial parsing that is the opposite of current easy-first, or primarily bottom-up, strategies. A rich partial tree structure is captured by an algorithm that assigns a hierarchy of structural tags to each of the input tokens in a sentence. Part-of-speech tags are first assigned to the words in a sentence by a part-of-speech tagger. A cascade of Deterministic Finite State Automata then uses this part-of-speech information to identify syntactic relations, primarily in descending order of their size. The cascade is divided into four specialized sections: (1) a Comma Network, which identifies syntactic relations associated with commas; (2) a Conjunction Network, which partially disambiguates phrasal conjunctions and fully disambiguates clausal conjunctions; (3) a Clause Network, which identifies non-comma-delimited clauses; and (4) a Phrase Network, which identifies the remaining base phrases in the sentence. Each automaton is capable of adding one or more levels of structural tags to the tokens in a sentence. The larger-first approach is compared against a well-known easy-first approach. The results indicate that this larger-first approach is capable of (1) producing a more detailed partial parse than an easy-first approach; (2) providing better containment of attachment ambiguity; (3) handling overlapping syntactic relations; and (4) achieving a higher accuracy than the easy-first approach. The automata of each network were developed by an empirical analysis of several sources and are presented here in detail. (A minimal sketch of a larger-first cascade over part-of-speech tags follows this record.)
- Date Issued
- 2003
- Identifier
- CFR0000760, ucf:52932
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0000760
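The core mechanism described above, a cascade of finite-state automata that layers structural tags on top of part-of-speech tags in roughly descending order of constituent size, can be sketched with regular expressions standing in for the automata. The patterns, tag names, and the tiny three-stage cascade below are simplified illustrative assumptions; they are not the Comma, Conjunction, Clause, or Phrase Networks from the dissertation.

```python
# Simplified sketch of a larger-first cascade: each stage is a finite-state
# pattern (here a regular expression over part-of-speech tags) that adds a
# new layer of structural tags to the tokens it covers. Patterns and tag
# names are illustrative assumptions, not the dissertation's networks.

import re

# One token per element: (word, part-of-speech tag).
tokens = [("the", "DT"), ("old", "JJ"), ("dog", "NN"),
          ("barked", "VBD"), ("at", "IN"), ("the", "DT"), ("cat", "NN")]

pos_string = " ".join(tag for _, tag in tokens)

# Larger (clause-level) patterns are applied before smaller (phrase-level) ones.
cascade = [
    ("CLAUSE", re.compile(r"DT( JJ)* NN VBD( IN DT( JJ)* NN)?")),
    ("NP",     re.compile(r"DT( JJ)* NN")),
    ("PP",     re.compile(r"IN DT( JJ)* NN")),
]

# layers[i] collects the structural tags assigned to token i, outermost first.
layers = [[] for _ in tokens]
for label, pattern in cascade:
    for match in pattern.finditer(pos_string):
        # Map character offsets back to token indices (matches are assumed to
        # start at tag boundaries, which holds for this small tag set).
        start_tok = pos_string[:match.start()].count(" ")
        end_tok = start_tok + match.group(0).count(" ")
        for i in range(start_tok, end_tok + 1):
            layers[i].append(label)

for (word, tag), tags in zip(tokens, layers):
    print(f"{word:8s} {tag:4s} {' '.join(tags)}")
```

Running the sketch prints each token with its part-of-speech tag and a small hierarchy of structural tags (for example, "cat" ends up inside CLAUSE, NP, and PP), which mirrors the idea of one or more tag levels being added per automaton.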
- Title
- Crash quality- an approach for evaluating spending on quality improvement initiatives.
- Creator
-
Ferreira, Labiche, Hosni, Yasser A., Engineering
- Abstract / Description
-
University of Central Florida College of Engineering Thesis; The quality movement has become popular among corporations big and small for one reason: empirical evidence suggests that quality and productivity (and hence profitability) are linked. Unfortunately, while many firms accept that quality and productivity go together, few actually track the gains associated with their quality improvement programs. Companies also tend to spend on quality improvement with no estimate of the impact of the funding on the targeted process. It would be of great value to know: (1) the impact of spending to enhance the product/process quality level, and (2) the point at which expenditures for quality improvement are no longer economical. This research involves modeling the quality level of a product composed of integrated components/processes and the costs associated with quality improvement. Presented in this research is a methodology for determining the point at which the target quality level is reached. This point signifies when future spending should be redirected; the research defines it as the "Crash Quality Point (CQP)." Cases of a single-level process and a double-level, three-stage process are modeled to conceptualize the CQP. The findings from the output analysis reveal that the quality level approaches the target level at varying points in time. Any spending beyond this point does not have an impact on the quality level comparable to that of the period prior to the Crash Quality Point; spending past this point is futile, and these funds could be redirected to other quality improvement projects. The special case modeled also illustrates the use of this tool in selecting processes for improvement based on their quality levels, an added advantage in scenarios where funds are limited and management must improve process quality with those limited funds. Using a real-world example validates the proposed CQP methodology. The results of the validation indicate that the model developed can assist managers in forecasting the budget requirements for quality spending based on quality improvement goals. The tool also enables managers to estimate the point in time at which allocations of funds may be directed toward process reengineering. The CQP method will enable quality improvement professionals to determine the economic viability of, and the limits on, expenditures on quality improvement, and to evaluate spending alternatives and approximate when the point of diminishing returns is reached. (A minimal numerical sketch of locating such a point follows this record.)
- Date Issued
- 2000
- Identifier
- CFR0011594, ucf:53046
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0011594
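The Crash Quality Point can be illustrated numerically as the cumulative spending level beyond which the marginal quality gain per additional dollar drops below a small threshold, i.e., the point of diminishing returns the abstract describes. The diminishing-returns curve, parameter values, and threshold in the sketch below are hypothetical assumptions, not the single-level or three-stage models from the dissertation.

```python
# Hypothetical illustration of a "Crash Quality Point": the cumulative
# spending level at which additional quality spending stops producing a
# meaningful gain toward the target quality level. The response curve,
# parameters, and threshold are illustrative assumptions only.

import math

TARGET_QUALITY = 0.99    # target quality level (e.g., process yield)
BASE_QUALITY = 0.90      # quality level before any improvement spending
RESPONSIVENESS = 0.004   # per-$1k responsiveness of quality to spending

def quality_level(spend_k: float) -> float:
    """Diminishing-returns response of quality to cumulative spending ($k)."""
    gap = TARGET_QUALITY - BASE_QUALITY
    return TARGET_QUALITY - gap * math.exp(-RESPONSIVENESS * spend_k)

def crash_quality_point(step_k: float = 10.0,
                        min_gain_per_step: float = 1e-4) -> float:
    """Return the spending level where the marginal gain per spending step
    falls below min_gain_per_step (the point of diminishing returns)."""
    spend = 0.0
    while True:
        gain = quality_level(spend + step_k) - quality_level(spend)
        if gain < min_gain_per_step:
            return spend
        spend += step_k

cqp = crash_quality_point()
print(f"Crash Quality Point ~= ${cqp:,.0f}k at quality {quality_level(cqp):.4f}")
```

With these made-up numbers the loop stops near the point where further $10k increments move quality by less than 0.0001, which is the kind of cutoff a manager could use to redirect remaining funds to other improvement projects or to process reengineering.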
- Title
- A Hybrid Simulation Framework of Consumer-to-Consumer Ecommerce Space.
- Creator
-
Joledo, Oloruntomi, Rabelo, Luis, Lee, Gene, Elshennawy, Ahmad, Ajayi, Richard, University of Central Florida
- Abstract / Description
-
In the past decade, ecommerce transformed the business models of many organizations. Information Technology leveled the playing field for new participants, who were capable of causing disruptive changes in every industry. "Web 2.0", or the "Social Web", further redefined the ways users enlist for services. It is now easy to be influenced to make choices of services based on recommendations of friends and popularity amongst peers. This research proposes a simulation framework to investigate how actions of stakeholders at this level of complexity affect system performance, as well as the dynamics that exist between different models, using concepts from the fields of operations engineering, engineering management, and multi-model simulation. Viewing this complex model from a systems perspective calls for the integration of different levels of behaviors. Complex interactions exist among stakeholders, the environment, and available technology. The presence of continuous and discrete behaviors, coupled with stochastic and deterministic behaviors, presents challenges for using standalone simulation tools to simulate the business model. We propose a framework that takes into account dynamic system complexity and risk from a hybrid paradigm. The SCOR model is employed to map the business processes, and it is implemented using agent-based simulation and system dynamics. By combining system dynamics at the strategy level with agent-based models of consumer behaviors, an accurate yet efficient representation of the business model, one that provides a sound basis for decision making, can be achieved to maximize stakeholders' utility. (A minimal sketch of coupling a system-dynamics stock with agent-based consumers follows this record.)
- Date Issued
- 2016
- Identifier
- CFE0006122, ucf:51171
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006122
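The hybrid pairing described above, system dynamics at the strategy level exchanging state with agent-based consumer behavior each time step, can be sketched as a simple simulation loop: a stock-and-flow update is driven by aggregate agent decisions, and each agent's decision depends in turn on the current stock and on peer behavior. The stock name, adoption rule, and parameter values below are hypothetical illustrations, not the SCOR-based framework or models from the dissertation.

```python
# Hypothetical sketch of one hybrid simulation loop: a system-dynamics stock
# ("platform reputation") is updated from aggregate agent behavior, while each
# consumer agent's purchase decision depends on that stock plus peer influence.
# Stock names, rules, and parameters are illustrative assumptions only.

import random

random.seed(42)

class Consumer:
    def __init__(self) -> None:
        self.satisfaction = random.uniform(0.3, 0.9)
        self.purchased_last_step = False

    def decide(self, reputation: float, peer_share: float) -> bool:
        """Buy if platform reputation plus peer influence exceeds a threshold."""
        score = 0.5 * reputation + 0.3 * peer_share + 0.2 * self.satisfaction
        self.purchased_last_step = score > 0.40
        return self.purchased_last_step

# System-dynamics state (strategy level): a single reputation stock in [0, 1].
reputation = 0.5
agents = [Consumer() for _ in range(1000)]

for step in range(10):
    peer_share = sum(a.purchased_last_step for a in agents) / len(agents)
    buyers = sum(a.decide(reputation, peer_share) for a in agents)
    demand = buyers / len(agents)
    # Stock-and-flow update: reputation grows with demand and decays over time.
    inflow, outflow = 0.2 * demand, 0.1 * reputation
    reputation = min(1.0, max(0.0, reputation + inflow - outflow))
    print(f"step {step}: demand={demand:.2f}, reputation={reputation:.2f}")
```

Even in this toy form, the loop shows the feedback the abstract points at: agent-level adoption raises the strategy-level stock, which in turn makes adoption more likely at the next step, a dynamic that is awkward to capture with a single-paradigm simulation tool.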