Current Search: Sepulveda, Jose
- Title
- A HYBRID SIMULATION METHODOLOGY TO EVALUATE NETWORK CENTRIC DECISION MAKING UNDER EXTREME EVENTS.
- Creator
- Quijada, Sergio; Sepulveda, Jose; University of Central Florida
- Abstract / Description
- Network centric operations and network centric warfare have generated a new area of research focused on determining how hierarchical organizations composed of human beings and machines make decisions in collaborative environments. One of the most stressful scenarios for these kinds of organizations is the so-called extreme event. This dissertation provides a hybrid simulation methodology, based on classical simulation paradigms combined with social network analysis, for evaluating and improving organizational structures and procedures, mainly the incident command systems and plans for facing such extreme events. Accordingly, we provide a methodology for generating hypotheses and then testing organizational procedures either in real training systems or in simulation models with validated data. Because the organization changes its dyadic relationships dynamically over time, we propose capturing the longitudinal digraph and analyzing it by means of its adjacency matrix. Thus, using an object-oriented approach, three domains are proposed for better understanding the performance and the surrounding environment of an emergency management organization. System dynamics is used for modeling the critical infrastructure linked to the warning alerts of a given organization at the federal, state, and local levels. Discrete simulation based on the defined concept of "community of state" enables us to control the complete model. Discrete event simulation allows us to create entities that represent the data and resource flows within the organization. We propose that cognitive models might be well suited to our methodology. For instance, we show how team performance decays over time, according to the Yerkes-Dodson curve, affecting the measures of performance of the whole organizational system. Accordingly, we suggest that the hybrid model could be applied to other types of organizations, such as military peacekeeping operations and joint task forces. Along with providing insight about organizations, the methodology supports the analysis of the "after action review" (AAR), based on data collected from command and control systems or from training scenarios. Furthermore, a rich set of mathematical measures arises from the hybrid models, such as triad census, dyad census, eigenvalues, utilization, and feedback loops, which provides a strong foundation for studying an emergency management organization. Future research will be necessary for analyzing real data and validating the proposed methodology.
- Date Issued
- 2006
- Identifier
- CFE0001243, ucf:46926
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001243
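The longitudinal digraph analysis this abstract describes (adjacency matrix, dyad and triad census, eigenvalues) can be sketched briefly; the snippet below is a minimal illustration using networkx and numpy, with an invented incident-command snapshot sequence rather than the dissertation's data.

```python
import networkx as nx
import numpy as np

# Hypothetical snapshots of an incident command organization's dyadic
# communication links at successive times (who sends/reports to whom).
snapshots = {
    0: [("IC", "Ops"), ("IC", "Plans"), ("Ops", "Field1")],
    1: [("IC", "Ops"), ("Ops", "Plans"), ("Plans", "IC"), ("Ops", "Field1")],
}

for t, edges in snapshots.items():
    G = nx.DiGraph(edges)
    A = nx.to_numpy_array(G)            # adjacency matrix of the digraph at time t
    eigenvalues = np.linalg.eigvals(A)  # spectral summary of the structure
    triads = nx.triadic_census(G)       # counts of the 16 directed triad types
    mutual = sum(1 for u, v in G.edges if G.has_edge(v, u)) // 2  # mutual dyads
    print(f"t={t}: spectral radius={max(abs(eigenvalues)):.3f}, "
          f"mutual dyads={mutual}, transitive triads (030T)={triads['030T']}")
```

Tracking these measures across snapshots is one way to quantify how the organization's structure drifts during an extreme event.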
- Title
- A FRAMEWORK TO MODEL COMPLEX SYSTEMS VIA DISTRIBUTED SIMULATION: A CASE STUDY OF THE VIRTUAL TEST BED SIMULATION SYSTEM USING THE HIGH LEVEL ARCHITECTURE.
- Creator
- Park, Jaebok; Sepulveda, Jose; University of Central Florida
- Abstract / Description
- As the size, complexity, and functionality of the systems we need to model and simulate continue to increase, benefits such as interoperability and reusability enabled by distributed discrete-event simulation are becoming extremely important in many disciplines, not only military but also engineering disciplines such as distributed manufacturing, supply chain management, and enterprise engineering. In this dissertation we propose a distributed simulation framework for the modeling and simulation of complex systems. The framework is based on the interoperability of a simulation system enabled by distributed simulation and on gateways that enable Commercial Off-the-Shelf (COTS) simulation packages to interconnect to the distributed simulation engine. In the case study of modeling the Virtual Test Bed (VTB), the framework has been designed as a distributed simulation to facilitate the integrated execution of different simulations (shuttle process model, Monte Carlo model, delay and scrub model), each of which addresses different mission components, as well as other non-simulation applications (Weather Expert System and Virtual Range). Although these models were developed independently and at various times, their original purposes have been seamlessly integrated, and they interact with each other through the Run-time Infrastructure (RTI) to simulate shuttle launch related processes. This study found that with the framework the defining properties of complex systems, interaction and emergence, are realized, and that software life cycle models (including the spiral model and prototyping) can be used as metaphors to manage the complexity of modeling and simulation of the system. The system of systems (a complex system is intrinsically a "system of systems") continuously evolves to accomplish its goals; during the evolution, subsystems coordinate with one another and adapt to environmental factors such as policies, requirements, and objectives. In the case study we first demonstrate how legacy models developed in COTS simulation languages/packages and non-simulation tools can be integrated to address a complicated system of systems. We then describe the techniques that can be used to display the state of remote federates in a local federate in a High Level Architecture (HLA) based distributed simulation using COTS simulation packages.
- Date Issued
- 2005
- Identifier
- CFE0000534, ucf:46416
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000534
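The interoperability mechanism described above, independently developed federates coordinated through a Run-time Infrastructure, rests on time management. The toy below mimics conservative time advance in plain Python; it is a conceptual stand-in, not the HLA RTI API, and the federate names and events are invented.

```python
class Federate:
    """A toy federate holding its own time-ordered event list."""
    def __init__(self, name, events):
        self.name = name
        self.queue = sorted(events)  # (time, description) pairs

    def next_time(self):
        return self.queue[0][0] if self.queue else float("inf")

    def advance_to(self, grant_time):
        # Process only events at or before the time the coordinator granted.
        while self.queue and self.queue[0][0] <= grant_time:
            t, event = self.queue.pop(0)
            print(f"{t:5.1f}  {self.name}: {event}")

# Independently developed models, loosely echoing the VTB case study.
federates = [
    Federate("ShuttleProcess", [(1.0, "tanking starts"), (6.0, "launch window opens")]),
    Federate("WeatherExpert", [(2.5, "weather advisory issued")]),
    Federate("DelayScrub", [(4.0, "scrub probability updated")]),
]

# Conservative coordinator: every federate is granted the minimum of all
# requested times, so no federate ever receives an event from its past.
while any(f.queue for f in federates):
    grant = min(f.next_time() for f in federates)
    for f in federates:
        f.advance_to(grant)
```

A real RTI adds publish/subscribe object exchange and lookahead on top of this granting loop, but the ordering guarantee is the same.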
- Title
- ANALYSIS AND INTEGRATION OF A DEBRIS MODEL IN THE VIRTUAL RANGE PROJECT.
- Creator
- Robledo, Luis; Sepulveda, Jose; University of Central Florida
- Abstract / Description
- After the STS-107 Columbia Space Shuttle accident, great concern has been focused on the risk posed to the population on the ground. Before this accident, re-entry routes and the associated risk calculations were not of public concern. Two issues raised by this lamentable accident relate to spacecraft security and to public safety. The integration of a debris model has been part of the original conceptual architecture of the Virtual Range Project. Its integration has been treated as a separate research effort due to the complexity of the models and the difficulty of obtaining them, since commercial off-the-shelf software for this purpose is not readily accessible. This research provides solid information on what debris fragmentation models are, their fundamentals, and their weaknesses and strengths. It describes the main debris models currently used by NASA, which are directly related to the space programs being conducted. This study also addresses the integration of a debris model into the Virtual Range Project. We created a provisional model based on the distribution of the Columbia debris fragments over Texas and part of Louisiana, in order to develop an analytical methodology as well. This analysis shows a way of integrating the debris model with a Geographic Information System, as well as integrating several raster and vector data sets that provide the source data for the calculations. This research uses population data sets that allow the determination of the number of people at risk on the ground. The graphical and numerical analyses can lead to the determination of new and more secure re-entry trajectories, and they address further population-related safety issues concerning this type of flight.
- Date Issued
- 2004
- Identifier
- CFE0000193, ucf:46175
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000193
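The overlay computation at the heart of this integration, intersecting debris impact points with gridded population data to estimate people at risk, reduces to a raster operation. The sketch below uses synthetic fragment scatter and population values, not the Columbia data sets; the lethal-area fraction is likewise an invented placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic population raster (people per cell) over a 50 x 50 cell footprint.
population = rng.integers(0, 200, size=(50, 50)).astype(float)

# Synthetic debris impact points scattered along a re-entry ground track.
n_fragments = 2000
x = np.clip(np.linspace(5, 45, n_fragments) + rng.normal(0, 3, n_fragments), 0, 49.99)
y = np.clip(25 + rng.normal(0, 4, n_fragments), 0, 49.99)

# Bin fragments into the raster: fragments per cell (the GIS overlay step).
hits, _, _ = np.histogram2d(x, y, bins=(50, 50), range=((0, 50), (0, 50)))

# Simple risk surrogate: expected persons exposed per cell, given a
# per-fragment lethal area expressed as a fraction of a cell.
lethal_fraction = 0.001
expected_exposed = hits * lethal_fraction * population

print(f"Fragments simulated: {n_fragments}")
print(f"Cells with debris: {(hits > 0).sum()}")
print(f"Expected persons at risk (sum over grid): {expected_exposed.sum():.1f}")
```

Re-running the same overlay for alternative ground tracks is how trajectory comparisons of the kind the abstract mentions can be made quantitative.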
- Title
- STUDY FOR DEVELOPMENT OF A BLAST LAYER FOR THE VIRTUAL RANGE PROJECT.
- Creator
- Rosales, Sergio; Sepulveda, Jose; University of Central Florida
- Abstract / Description
- In this work we develop an integrated blast-propellant-facility analysis study, which evaluates, using two different approaches, the blast-related impact of an explosive accident of the Space Shuttle during the first ten seconds after launch at Kennedy Space Center. The blast-related risk associated with an explosion at this stage is high because of the quantity of energy involved in multiple, complex processes. Our first approach employed BlastFX®, a software system that estimates the level of damage to people and buildings, starting from an explosive device and rendering results through a complete report that illustrates and facilitates the evaluation of consequences. Our other approach employed the Hopkinson-Cranz scaling law to estimate similar effects at greater distances and for larger TNT-equivalent amounts; specifically, we considered more than 500 m and 45,400 kg, respectively, which are the range and TNT content limits that our version of BlastFX® can cover. Much research has been done on explosion phenomena for both solid and liquid propellants and on the laws that govern the blast waves of an explosion. Our methodology is therefore based on the foundation provided by an extensive literature review and on the actual capabilities of an application like BlastFX®. By integrating the lessons from the literature with the capabilities of the software, we have obtained useful information for evaluating different scenarios that rely on the well-studied assumption that blast wave behavior is governed by distance. All of this has been focused on the Space Shuttle system, in which the propellant mass is the source of our analysis and the core of this work. Estimating the risks involved and providing results for different scenarios augments the collective knowledge of the risks associated with space exploration.
- Date Issued
- 2004
- Identifier
- CFE0000190, ucf:46171
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000190
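The Hopkinson-Cranz law used above states that two charges of the same explosive produce similar blast waves at the same scaled distance Z = R / W^(1/3), where R is the standoff distance and W the TNT-equivalent mass. A short sketch of how this extrapolates beyond a tool's range and charge limits (the 500 m and 45,400 kg figures are the BlastFX® limits quoted in the abstract; the 100,000 kg charge is a hypothetical example):

```python
def scaled_distance(standoff_m: float, tnt_kg: float) -> float:
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3), in m/kg^(1/3)."""
    return standoff_m / tnt_kg ** (1 / 3)

def equivalent_standoff(z: float, tnt_kg: float) -> float:
    """Distance at which a charge of tnt_kg reproduces scaled distance z."""
    return z * tnt_kg ** (1 / 3)

# Suppose a 45,400 kg charge (the BlastFX upper limit) produces some damage
# level at 500 m (the tool's range limit). The same scaled distance, hence
# a similar overpressure environment, for a larger hypothetical charge:
z = scaled_distance(500.0, 45_400.0)
print(f"Scaled distance Z = {z:.2f} m/kg^(1/3)")
print(f"Equivalent standoff for 100,000 kg: {equivalent_standoff(z, 100_000.0):.0f} m")
```

Because overpressure curves are tabulated against Z, one tool run at the limits anchors estimates for any charge mass via this cube-root scaling.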
- Title
- FRAMEWORK FOR COST MODELING A SUPPLY CHAIN.
- Creator
- Yousef, Nabeel; Sepulveda, Jose; University of Central Florida
- Abstract / Description
- Researchers are interested in value chain analysis to identify different opportunities for cost savings. The literature has been narrow in scope and addressed specific problems; none has addressed the need for a general framework that can be used as a standard template in supply chain cost management and optimization, even though Dekker and Goor (2000) stated that the goal was to develop a model that would allow direct comparison of specific activities between firms, such as warehousing costs. There was no indication in the literature of a cost model that can identify all costs and cost drivers through the supply chain. Some firms have built models to analyze the effect of changes in activities, but only for limited activities such as logistics. The purpose of this research is to create a general framework that can express the cost data for the partners of the supply chain in similar terms. The framework will lay out the common activities identified within the firm and the relationships of these activities among the partners of the supply chain, and it will identify the effect of changes in activities on other partners within the chain. Cost information will help in making decisions about pricing, outsourcing, capital expenditures, and operational efficiency. The framework will be able to track cost through the chain, which will improve the flexibility of the supply chain to respond to rapidly changing technology. It will also help in developing product strategy paradigms that encompass the dynamics of the market, in particular with respect to the technology adoption lifecycle.
- Date Issued
- 2006
- Identifier
- CFE0001038, ucf:46821
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001038
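The framework's central idea, expressing each partner's costs through common activities and cost drivers so that changes propagate visibly across the chain, can be sketched as a small activity-based costing roll-up; the partners, activities, rates, and volumes below are hypothetical.

```python
# Each partner's activities: (cost per driver unit, driver volume).
chain = {
    "supplier":     {"order handling": (12.0, 400), "warehousing": (2.5, 9000)},
    "manufacturer": {"warehousing": (3.0, 7000), "assembly": (18.0, 1500)},
    "retailer":     {"order handling": (9.0, 1200), "delivery": (6.5, 2200)},
}

def activity_costs(chain):
    """Roll up cost per partner and per common activity across the chain."""
    by_partner, by_activity = {}, {}
    for partner, activities in chain.items():
        total = 0.0
        for activity, (rate, volume) in activities.items():
            cost = rate * volume
            total += cost
            by_activity[activity] = by_activity.get(activity, 0.0) + cost
        by_partner[partner] = total
    return by_partner, by_activity

by_partner, by_activity = activity_costs(chain)
print("Cost by partner: ", by_partner)
print("Cost by activity:", by_activity)  # comparable across firms

# Propagating a change: a 10% increase in the retailer's delivery volume.
rate, volume = chain["retailer"]["delivery"]
chain["retailer"]["delivery"] = (rate, volume * 1.10)
print("Retailer after change:", activity_costs(chain)[0]["retailer"])
```

Keeping activities and drivers in common terms is what makes the per-activity totals directly comparable between firms, in the spirit of the Dekker and Goor (2000) goal the abstract cites.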
- Title
- INTERCHANGING DISCRETE EVENT SIMULATION PROCESS INTERACTION MODELS USING THE WEB ONTOLOGY LANGUAGE - OWL.
- Creator
- Lacy, Lee; Sepulveda, Jose; University of Central Florida
- Abstract / Description
- Discrete event simulation development requires significant investments in time and resources. Descriptions of discrete event simulation models are associated with world views, including the process interaction orientation. Historically, these models have been encoded using high-level programming languages or special purpose, typically vendor-specific, simulation languages. These approaches complicate simulation model reuse and interchange. The current document-centric World Wide Web is evolving into a Semantic Web that communicates information using ontologies. The Web Ontology Language (OWL) was used to encode a Process Interaction Modeling Ontology for Discrete Event Simulations (PIMODES). The PIMODES ontology was developed using ontology engineering processes. Software was developed to demonstrate the feasibility of interchanging models from commercial simulation packages using PIMODES as an intermediate representation. The purpose of PIMODES is to provide a vendor-neutral, open representation to support model interchange. Model interchange enables reuse and provides an opportunity to improve simulation quality, reduce development costs, and reduce development times.
- Date Issued
- 2006
- Identifier
- CFE0001353, ucf:46977
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001353
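The interchange idea, encoding a process-interaction model as vendor-neutral ontology statements instead of vendor-specific code, can be illustrated with rdflib; the pim namespace and the block/property names below are illustrative guesses, not the actual PIMODES vocabulary.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# Hypothetical namespace standing in for the PIMODES ontology vocabulary.
PIM = Namespace("http://example.org/pimodes#")

g = Graph()
g.bind("pim", PIM)

# A minimal process-interaction model: entities arrive, seize a server, delay.
g.add((PIM.Arrive, RDF.type, PIM.CreateBlock))
g.add((PIM.Arrive, PIM.interarrivalTime, Literal("exponential(5.0)")))
g.add((PIM.Teller, RDF.type, PIM.Resource))
g.add((PIM.Teller, PIM.capacity, Literal(2)))
g.add((PIM.Serve, RDF.type, PIM.SeizeDelayReleaseBlock))
g.add((PIM.Serve, PIM.usesResource, PIM.Teller))
g.add((PIM.Serve, PIM.delayTime, Literal("normal(3.0, 0.5)")))
g.add((PIM.Arrive, PIM.nextBlock, PIM.Serve))

# Serialize to a neutral format; a gateway for a COTS package would parse
# these triples and emit the package's native model file.
print(g.serialize(format="turtle"))
```

The round trip (package A exports triples, package B imports them) is what decouples the model's meaning from any one vendor's file format.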
- Title
- AUTOMATIC GENERATION OF SUPPLY CHAIN SIMULATION MODELS FROM SCOR BASED ONTOLOGIES.
- Creator
- Cope, Dayana; Sepulveda, Jose; University of Central Florida
- Abstract / Description
- In today's economy of global markets, supply chain networks, supplier/customer relationship management, and intense competition, decision makers are faced with the need to make decisions using tools that do not accommodate the nature of the changing market. This research focuses on developing a methodology that addresses this need. The developed methodology provides supply chain decision makers with a tool to perform efficient decision making in stochastic, dynamic, and distributed supply chain environments. The integrated methodology allows for informed decision making in a fast, sharable, and easy-to-use format. The methodology was implemented as a stand-alone tool that allows users to define a supply chain simulation model using SCOR-based ontologies. The ontology includes the supply chain knowledge and the knowledge required to build a simulation model of the supply chain system. A simulation model is generated automatically from the ontology, providing the flexibility to model at various levels of detail and to change the model structure on the fly. The implementation is demonstrated and evaluated through a retail-oriented case study. When comparing the implementation using the developed methodology against a "traditional" simulation methodology, a significant reduction in definition and execution time was observed.
- Date Issued
- 2008
- Identifier
- CFE0002009, ucf:47625
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002009
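The generation step, walking an ontology-derived description of supply chain nodes and emitting an executable model, can be sketched with a dictionary standing in for the SCOR-based ontology and simpy as the target engine; the node schema and parameters are invented for illustration.

```python
import random
import simpy

# Stand-in for knowledge extracted from a SCOR-based ontology: each node has
# a SCOR process type, a capacity, and a mean processing time.
ontology = [
    {"name": "Source",  "scor": "S1", "capacity": 2, "mean_time": 1.0},
    {"name": "Make",    "scor": "M1", "capacity": 1, "mean_time": 2.0},
    {"name": "Deliver", "scor": "D1", "capacity": 2, "mean_time": 1.5},
]

def build_model(env, ontology):
    """Generate resources and a flow process directly from the description."""
    stages = [(n["name"], simpy.Resource(env, capacity=n["capacity"]), n["mean_time"])
              for n in ontology]

    def order(env, order_id):
        for name, resource, mean_time in stages:   # visit stages in SCOR order
            with resource.request() as req:
                yield req
                yield env.timeout(random.expovariate(1.0 / mean_time))
        print(f"order {order_id} completed at t={env.now:.2f}")

    def arrivals(env):
        for i in range(5):
            env.process(order(env, i))
            yield env.timeout(random.expovariate(1.0))  # inter-arrival times

    env.process(arrivals(env))

random.seed(42)
env = simpy.Environment()
build_model(env, ontology)
env.run(until=50)
```

Because the simulation is built from the description at run time, adding or re-parameterizing a node means editing the ontology entry, not the model code, which is the "change the structure on the fly" flexibility the abstract describes.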
- Title
- NEW HEURISTICS FOR THE 0-1 MULTI-DIMENSIONAL KNAPSACK PROBLEMS.
- Creator
- Akin, Haluk; Sepulveda, Jose; University of Central Florida
- Abstract / Description
- This dissertation introduces new heuristic methods for the 0-1 multi-dimensional knapsack problem (0-1 MKP). The 0-1 MKP can be informally stated as the problem of packing items into a knapsack while staying within the limits of several constraints (dimensions), where each item has an assigned profit. The constraints can be, for instance, the maximum weight that can be carried, the maximum available volume, or the maximum amount that can be afforded for the items. One main assumption is that we have only one item of each type; hence the problem is binary (0-1). The single-dimensional version of the 0-1 MKP is the classical knapsack problem, which can be solved in pseudo-polynomial time; the 0-1 MKP, however, is a strongly NP-hard problem. Reduced cost values are rarely used in 0-1 MKP heuristics; using reduced cost information, we introduce several new heuristics as well as improvements to past heuristics. We introduce two new ordering strategies, decision variable importance (DVI) and reduced cost based ordering (RCBO). We also introduce a new greedy heuristic concept, which we call the "sliding concept," and a sub-branch of it, which we call "sliding enumeration"; we again use reduced cost values within the sliding enumeration heuristic. RCBO is a new ordering strategy that proved useful in several methods, such as improving Pirkul's MKHEUR, a triangular-distribution-based probabilistic approach, and our own sliding enumeration. We show how Pirkul's shadow-price-based ordering strategy fails to order the partial variables, and we present a possible fix, since there tends to be a high number of partial variables in hard problems; this insight should help future researchers solve hard problems with more success. Even though sliding enumeration is a simple method, it found optima within a few seconds for most of our problems. We present different levels of sliding enumeration and discuss potential improvements to the method. Finally, we show that in meta-heuristic approaches such as Drexl's simulated annealing, where random numbers are used abundantly, it is better to use carefully designed probability distributions instead of plain random numbers.
- Date Issued
- 2009
- Identifier
- CFE0002633, ucf:48195
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002633
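A baseline for heuristics in this family is a greedy pass over items ordered by a surrogate efficiency ratio (profit per weighted constraint consumption). The sketch below uses that generic ordering as a stand-in; the dissertation's DVI and RCBO strategies refine the ordering with LP reduced cost information.

```python
def greedy_mkp(profits, weights, capacities):
    """Greedy heuristic for the 0-1 multi-dimensional knapsack problem.

    profits[j]    : profit of item j
    weights[i][j] : consumption of constraint i by item j
    capacities[i] : limit of constraint i
    """
    m, n = len(capacities), len(profits)
    # Surrogate efficiency: profit divided by total relative consumption.
    ratio = [profits[j] / sum(weights[i][j] / capacities[i] for i in range(m))
             for j in range(n)]
    order = sorted(range(n), key=lambda j: ratio[j], reverse=True)

    remaining = list(capacities)
    chosen, total = [], 0
    for j in order:
        if all(weights[i][j] <= remaining[i] for i in range(m)):
            for i in range(m):
                remaining[i] -= weights[i][j]
            chosen.append(j)
            total += profits[j]
    return total, sorted(chosen)

profits = [10, 13, 7, 8, 4]
weights = [[3, 4, 2, 3, 1],   # constraint 1 (e.g., weight)
           [2, 3, 3, 1, 2]]   # constraint 2 (e.g., volume)
capacities = [7, 6]
print(greedy_mkp(profits, weights, capacities))  # (total profit, items chosen)
```

Swapping the ratio for an LP-derived ordering is exactly where a reduced-cost-based strategy like RCBO plugs in, while the greedy filling loop stays the same.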
- Title
- Systems Geometry: A Methodology for Analyzing Emergent System of Systems Behaviors.
- Creator
- Bouwens, Christina; Sepulveda, Jose; Karwowski, Waldemar; Xanthopoulos, Petros; Kapucu, Naim; University of Central Florida
- Abstract / Description
- Recent advancements in technology have led to the increased use of integrated 'systems of systems' (SoS), which link together independently developed and usable capabilities into an integrated system that exhibits new, emergent capabilities. However, the resulting SoS is often not well understood: secondary and tertiary effects of tying systems together are often unpredictable and can have severe consequences. The complexities of the composed system stem not only from system integration but from a broad range of areas, such as the competing objectives of different constituent system stakeholders, mismatched requirements from multiple process models, and architectures and interface approaches that are incompatible on multiple levels. While successful SoS development has proven to be a valuable tool for a wide range of applications, significant problems remain that need to be addressed during the early stages of engineering development in such environments. The purpose of this research is to define and demonstrate a methodology called Systems Geometry (SG) for analyzing SoS in the early stages of development to identify areas of potential unintended emergent behaviors as candidates for the employment of risk management strategies. SG focuses on three dimensions of interest when planning the development of an SoS: operational, functional, and technical. For Department of Defense (DoD) SoS, the operational dimension addresses the warfighter environment and includes characteristics such as mission threads and the related command and control or simulation activities required to support the mission. The functional dimension highlights the different roles associated with the development and use of the SoS, which could include a participant warfighter using the system, an analyst collecting data for system evaluation, or an infrastructure engineer working to keep the SoS infrastructure operational to support the users. Each dimension can be analyzed to understand roles, interfaces, and activities. Cross-dimensional effects are of particular interest, since such effects are less detectable and generally not addressed with conventional systems engineering (SE) methods. The literature review and the results of this study identified key characteristics, or dimensions, that should be examined during SoS analysis and design. Although many methods exist for exploring system dimensions, there is a gap in techniques to explore cross-dimensional interactions and their effect on emergent SoS behaviors. The study resulted in a methodology for capturing dimensional information, with recommended analytical methods for intra-dimensional as well as cross-dimensional analysis. A problem-based approach to the system analysis is recommended, combined with the application of matrix methods, network analysis, and modeling techniques to provide intra- and cross-dimensional insight. The results of this research are applicable to a variety of socio-technical SoS analyses, with applications in analysis, experimentation, test and evaluation, and training.
- Date Issued
- 2013
- Identifier
- CFE0005135, ucf:50696
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005135
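The matrix methods recommended for cross-dimensional analysis can be illustrated with domain mapping matrices: binary incidence matrices relate elements of two dimensions, and their product exposes indirect couplings neither matrix shows alone. The element names below are invented for the example.

```python
import numpy as np

operational = ["mission thread A", "mission thread B"]
functional  = ["warfighter", "analyst", "infrastructure engineer"]
technical   = ["sim engine", "data logger", "network service"]

# Operational x Functional: which roles participate in which mission threads.
OF = np.array([[1, 1, 0],
               [1, 0, 1]])

# Functional x Technical: which technical components each role depends on.
FT = np.array([[1, 0, 1],
               [0, 1, 0],
               [1, 0, 1]])

# Cross-dimensional coupling, Operational x Technical, obtained by
# propagating dependencies through the functional dimension.
OT = OF @ FT
for i, thread in enumerate(operational):
    for k, component in enumerate(technical):
        if OT[i, k]:
            print(f"{thread} depends on {component} via {OT[i, k]} role(s)")
```

Couplings that appear only in the product matrix are candidates for the unintended emergent behaviors SG aims to flag early.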
- Title
- A Dynamic Enrollment Simulation Model for Planning and Decision-Making in a University.
- Creator
- Robledo, Luis; Sepulveda, Jose; Kincaid, John; Armacost, Robert; Archer, Sandra; University of Central Florida
- Abstract / Description
- Decision support systems for university management have seen limited incorporation of new cutting-edge techniques. For the last few decades, most decision-makers have based their decisions on traditional forecasting methods in order to maintain financially affordable programs and keep universities competitive. Strategic planning for universities has always been related to enrollment revenues and operational expenses. Enrollment models in use today forecast based on historical data, considering the usual variables such as student headcount and student credit hours, among others. No consideration is given to students' preferences. Retention models, associated with enrollment, deal with average retention times and leave out preferences as well. Preferences play a major role at institutions where students are not required to declare their intentions (major) immediately. Even if they do, they may change majors if they find another, more attractive one, or they may decide to leave college for external reasons. Enrollment models have been identified with three main purposes: prediction of income from tuition (in-state, out-of-state), planning of future courses and curriculum, and allocation of resources to academic departments. This general perspective does not provide useful information to faculty and departments for detailed planning and allocation of resources for the next term or year. New metrics are needed to help faculty and departments plan this allocation of resources at a detailed and useful level. The dynamics of the rate of growth, the preferences students have for certain majors at a specific point in time, and economic hardship make a difference when decisions have to be made about budget requests, hiring of faculty, classroom assignment, parking, transportation, or even building new facilities. Existing models do not differentiate among these variables. This simulation model is a hybrid that combines system dynamics, discrete-event, and agent-based simulation, which allows representation of the general enrollment process at the university level (strategic decisions) and of enrollment, retention, and major selection at the college (tactical decisions) and department (operational decisions) levels. This approach allows lower levels to more accurately predict the number of students retained for the next term or year, while allowing upper levels to decide on new students to admit (first time in college and transfers), and it results in recommendations on faculty hiring, class or lab assignment, and resource allocation. The model merges both high and low levels of student enrollment modeling into one application, allowing not only representation of the current overall enrollment but also prediction at the college and department level. This provides information on optimal classroom assignments and faculty and student resource allocation.
- Date Issued
- 2013
- Identifier
- CFE0005055, ucf:49970
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005055
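The preference dynamics this model adds over classical forecasting can be sketched as a term-to-term transition matrix over majors plus a leave state, projected forward over a cohort; the majors and transition probabilities below are illustrative, not fitted values.

```python
import numpy as np

states = ["Engineering", "Business", "Undeclared", "Left"]

# Row-stochastic term-to-term transitions capturing students' preferences:
# switching majors, declaring a major, or leaving for external reasons.
P = np.array([
    [0.88, 0.04, 0.00, 0.08],   # Engineering
    [0.03, 0.90, 0.00, 0.07],   # Business
    [0.25, 0.30, 0.35, 0.10],   # Undeclared students declare over time
    [0.00, 0.00, 0.00, 1.00],   # Left (absorbing state)
])

cohort = np.array([400.0, 500.0, 300.0, 0.0])  # headcount at term 0
for term in range(1, 4):
    cohort = cohort @ P
    detail = ", ".join(f"{s}: {c:7.1f}" for s, c in zip(states, cohort))
    print(f"term {term}: {detail}")
```

Department-level counts of this kind are what drive the faculty-hiring and classroom-allocation recommendations described above; the full hybrid model layers system dynamics and agents on top of this flow.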
- Title
- Multi-Vehicle Dispatching and Routing with Time Window Constraints and Limited Dock Capacity.
- Creator
- El-Nashar, Ahmed; Nazzal, Dima; Sepulveda, Jose; Geiger, Christopher; Hosni, Yasser; University of Central Florida
- Abstract / Description
- The Vehicle Routing Problem with Time Windows (VRPTW) is an important and computationally hard optimization problem frequently encountered in scheduling and logistics. The Vehicle Routing Problem (VRP) can be described as the problem of designing the most efficient and economical routes from one depot to a set of customers using a limited number of vehicles. This research addresses the VRPTW under the following additional complicating features that are often encountered in practical problems:
1. Customers have strict time windows for receiving a vehicle, i.e., vehicles are not allowed to arrive at the customer's location earlier than the lower limit of the specified time window, a constraint that is relaxed in previous research work.
2. There is a limited number of loading/unloading docks for dispatching/receiving the vehicles at the depot.
The main goal of this research is to propose a framework for solving the VRPTW with the constraints stated above by generating near-optimal routes for the vehicles so as to minimize the total traveling distance. First, the proposed framework clusters customers into groups based on their proximity to each other. Second, a Probabilistic Route Generation (PRG) algorithm is applied to each cluster to find the best route for visiting customers by each vehicle; multiple routes per vehicle are generated, and each route is associated with a set of feasible dispatching times from the depot. Third, an assignment problem formulation determines the best dispatching time and route for each vehicle that minimize the total traveling distance. The proposed algorithm is tested on a set of benchmark problems originally developed by Marius M. Solomon, and the results indicate that the algorithm works well, with about 1.14% average deviation from the best-known solutions. The benchmark problems are then modified by adjusting some of the customer time window limits and adding the staggered vehicle dispatching constraint. For demonstration purposes, the proposed clustering and PRG algorithms are then applied to the modified benchmark problems.
- Date Issued
- 2012
- Identifier
- CFE0004532, ucf:49233
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004532
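At miniature scale, the framework's three steps (visit order, dispatch-time choice, feasibility under strict windows) can be brute-forced to make the strict lower-limit constraint concrete; the sketch below enumerates permutations rather than using the PRG algorithm, and the customer data is invented.

```python
import itertools

# Customer: (x, y, earliest, latest). Depot at (0, 0); travel time = distance.
customers = {
    "c1": (2.0, 1.0, 4.0, 9.0),
    "c2": (4.0, 2.0, 6.0, 12.0),
    "c3": (1.0, 5.0, 8.0, 15.0),
}

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def route_length(route, dispatch):
    """Strict time windows: arriving before the lower limit is infeasible
    (no waiting at the customer), as is arriving after the upper limit."""
    t, pos, length = dispatch, (0.0, 0.0), 0.0
    for c in route:
        x, y, earliest, latest = customers[c]
        leg = dist(pos, (x, y))
        t += leg
        length += leg
        if not (earliest <= t <= latest):
            return None
        pos = (x, y)
    return length + dist(pos, (0.0, 0.0))  # return to depot

# Enumerate visit orders and candidate dispatch times; keep the best pair.
best = None
for route in itertools.permutations(customers):
    for dispatch in [d / 2 for d in range(0, 21)]:   # 0.0, 0.5, ..., 10.0
        length = route_length(route, dispatch)
        if length is not None and (best is None or length < best[0]):
            best = (length, route, dispatch)

print(best)  # (total distance, visit order, dispatch time)
```

Because early arrival is forbidden, the dispatch time itself becomes a decision variable, which is why the framework attaches a set of feasible dispatching times to each generated route before the assignment step.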
- Title
- An Index to Measure Efficiency of Hospital Networks for Mass Casualty Disasters.
- Creator
- Bull Torres, Maria; Sepulveda, Jose; Sala-Diakanda, Serge; Geiger, Christopher; Kapucu, Naim; University of Central Florida
- Abstract / Description
- Disaster events have emphasized the importance of healthcare response activities because of the large number of victims. For instance, Hurricane Katrina in New Orleans in 2005 and the terrorist attacks in New York City and Washington, D.C., on September 11, 2001, left thousands of wounded people. In those disasters, although hospitals had had disaster plans established for more than a decade, their plans were not efficient enough to handle the chaos produced by the hurricane and terrorist attacks. Thus, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) suggested collaborative planning among hospitals that provide services to a contiguous geographic area during mass casualty disasters. However, the JCAHO does not specify a methodology to determine which hospitals should be included in these cooperative plans. As a result, the problem of selecting the right hospitals to include in exercises and drills at the county level is a common topic in current preparedness efforts. This study proposes an efficiency index to determine the efficient response of cooperative hospital networks before the occurrence of a mass casualty disaster. The index combines operations research techniques, and its prediction uses statistical analysis. The consecutive application of three techniques (network optimization, data envelopment analysis (DEA), and regression analysis) yields a regression equation to predict the efficiency of predefined hospital networks for mass casualty disasters. To apply the proposed methodology, we selected the Orlando area and defined three disaster sizes. We then designed networks from two perspectives, hub-hospital and hub-disaster networks. In both network optimization models, the objective function sought to reduce the travel distance and the emergency department (ED) waiting time in hospitals, increase the number of services offered by hospitals in the network, and offer specialized assistance to children. The hospital network optimization generated information for 75 hospital networks in Orlando. The DEA analyzed these 75 hospital networks, or decision making units (DMUs), to estimate their comparative efficiency. Two DEAs were performed in this study. As the output variable for each DMU, DEA-1 considered the number of survivors allocated within a 40-mile range. As input variables, DEA-1 included (i) the number of beds available in the network, (ii) the number of hospitals available in the network, and (iii) the number of services offered by hospitals in the network. DEA-1 assigned an efficiency value to each of the 75 hospital networks. As output variables for each DMU, DEA-2 considered the number of survivors allocated within a 40-mile range and an index for ED waiting time in the network; its input variables were the same three used in DEA-1. DEA-2 likewise assigned an efficiency value to each of the 75 hospital networks. This efficiency index should allow emergency planners and hospital managers to assess which hospitals should be associated in a cooperative network in order to transfer survivors. Furthermore, the JCAHO could use this index to evaluate cooperating hospitals' emergency plans. However, DEA is a complex methodology that requires significant data gathering and handling, so we studied whether a simpler regression analysis would yield substantially the same results. DEA-1 can be predicted using two regression analyses, which concluded that the average distance between hospitals and the disaster locations and the size of the disaster explain the efficiency of the hospital network. DEA-2 can be predicted using three regressions, which included the size of the disaster, the number of hospitals, the average distance, and the average ED waiting time as predictors of hospital network efficiency. The models generated for DEA-1 and DEA-2 had a mean absolute percent error (MAPE) of around 10%. Thus, the indexes developed through regression analysis simplify the estimation of efficiency in predefined hospital networks, providing suitable predictors of the efficiency determined by the DEA analysis. In conclusion, network optimization, DEA, and regression analyses can be combined to create an efficiency index that measures the performance of predefined hospital networks in a mass casualty disaster, validating the hypothesis of this research. Although the methodology can be applied to any county or city, the regressions proposed for predicting the efficiency of hospital networks estimated by DEA apply only if the city studied has the same characteristics as the Orlando area. These conditions include the following: (i) networks must have a rate of services larger than 0.76; (ii) the number of survivors must be less than 47% of the ED bed capacity of the area studied; (iii) all hospitals in the network must have an ED and must be located within a 48-mile range of the disaster sites; and (iv) EDs should not have more than 60 minutes of waiting time. The proposed methodology, in particular the efficiency index, supports the operational objectives of the 2012 ESF #8 for the State of Florida to handle risk and response capabilities by conducting and participating in training and exercises to test and improve plans and procedures in the health response.
- Date Issued
- 2012
- Identifier
- CFE0004524, ucf:49290
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004524
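Each DEA efficiency score above is the solution of a small linear program. A minimal input-oriented CCR formulation in scipy, with toy data loosely shaped like DEA-1 (two inputs, one output) rather than the actual Orlando networks:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 hospital networks (DMUs); inputs ~ (beds, hospitals),
# output ~ survivors allocated within a 40-mile range.
X = np.array([[900.0, 5], [600.0, 4], [1200.0, 7], [800.0, 6], [700.0, 4]]).T
Y = np.array([[450.0], [400.0], [500.0], [300.0], [420.0]]).T
m, n = X.shape   # inputs x DMUs
s, _ = Y.shape   # outputs x DMUs

def ccr_efficiency(o):
    """Input-oriented CCR model for DMU o:
    min theta  s.t.  X @ lam <= theta * X[:, o],  Y @ lam >= Y[:, o],  lam >= 0.
    Decision vector: [theta, lam_1 .. lam_n]."""
    c = np.zeros(1 + n)
    c[0] = 1.0
    A_inputs = np.hstack([-X[:, [o]], X])          # X lam - theta x_o <= 0
    A_outputs = np.hstack([np.zeros((s, 1)), -Y])  # -Y lam <= -y_o
    A_ub = np.vstack([A_inputs, A_outputs])
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.x[0]

for o in range(n):
    print(f"network {o}: efficiency = {ccr_efficiency(o):.3f}")
```

A score of 1.0 means no convex combination of the other networks dominates that network's input use; scores below 1.0 quantify the proportional input reduction an efficient peer composite would achieve. The study's regression shortcut then predicts these scores from variables such as average distance and disaster size.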
- Title
- Defining a Stakeholder-Relative Model to Measure Academic Department Efficiency at Achieving Quality in Higher Education.
- Creator
- Robinson, Federica; Sepulveda, Jose; Reilly, Charles; Nazzal, Dima; Armacost, Robert; Feldheim, Mary; University of Central Florida
- Abstract / Description
- In a time of strained resources and dynamic environments, the importance of effective and efficient systems is critical. This dissertation was developed to address the need to use feedback from multiple stakeholder groups to define quality and to assess an entity's efficiency at achieving such quality. A decision support model with applicability to diverse domains is introduced to outline the approach. Three phases, (1) quality model development, (2) input-output selection, and (3) relative efficiency assessment, capture the essence of the process, which also delineates the approach per tool applied. This decision support model was adapted in higher education to assess academic departmental efficiency at achieving stakeholder-relative quality. Phase 1 was accomplished through a three-round, Delphi-like study involving user group refinement. The results were compared to the criteria of an engineering accreditation body (ABET) to support the model's validity for capturing quality in the College of Engineering & Computer Science, its departments, and its programs. In Phase 2, the Analytic Hierarchy Process (AHP) was applied to the validated model to quantify the perspectives of students, administrators, faculty, and employers (SAFE). Using the composite preferences of the collective group (n=74), the model was limited to the top 7 attributes, which accounted for about 55% of total preferences. Data corresponding to the resulting variables, referred to as key performance indicators, were collected from various information sources and fed into the data envelopment analysis (DEA) methodology (Phase 3). This process revealed both efficient and inefficient departments while offering transparency about opportunities to maximize quality outputs. The findings validate the potential of the Delphi-like, analytic hierarchical, data envelopment analysis approach for administrative decision-making in higher education. However, more meaningful metrics and data are required to adapt the model for decision-making purposes. Several recommendations are included to improve the usability of the decision support model, and future research opportunities are identified to extend the inherent analyses and apply the model to alternative areas.
- Date Issued
- 2013
- Identifier
- CFE0004921, ucf:49636
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004921
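Phase 2's AHP step derives attribute weights as the principal eigenvector of a stakeholder's pairwise comparison matrix, with a consistency check; the matrix below is a hypothetical single-judge example on three attributes, not the n=74 study data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix over three quality attributes,
# as one SAFE stakeholder might provide it (Saaty's 1-9 scale, reciprocal
# by construction: A[j, i] = 1 / A[i, j]).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priorities: principal eigenvector of A, normalized to sum to 1.
eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)
weights = eigenvectors[:, k].real
weights = weights / weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigenvalues.real[k]
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58   # Saaty's random index RI for n = 3
print("attribute weights:", np.round(weights, 3))
print(f"consistency ratio CR = {cr:.3f} (commonly accepted if < 0.10)")
```

Composite weights of this kind are what ranked the attributes so the model could be trimmed to the top 7 before the DEA phase.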