Current Search: multi-objective
- Title
- A MULTI-OBJECTIVE NO-REGRET DECISION MAKING MODEL WITH BAYESIAN LEARNING FOR AUTONOMOUS UNMANNED SYSTEMS.
- Creator
-
Howard, Matthew, Qu, Zhihua, University of Central Florida
- Abstract / Description
-
The development of a multi-objective decision making and learning model for use in unmanned systems is the focus of this project. Starting from traditional game theory and psychological learning theories, a new model for machine learning is developed. This model incorporates a no-regret decision making model with a Bayesian learning process that can adapt to errors in the preconceived costs associated with each objective. This learning ability is what sets this model apart from many others. By building on previously developed human learning models, hundreds of years of experience in these fields can be applied to the recently developing field of machine learning. This also allows operators to adapt more comfortably to the machine's learning process and to better understand how to take advantage of its features. One of the main purposes of this system is to incorporate multiple objectives into a decision making process. This feature allows users to clearly define and prioritize objectives, letting the system calculate the best approach for completing the mission. For instance, if an operator is given objectives such as obstacle avoidance, safety, and limiting resource usage, the operator would traditionally be required to decide how to meet all of these objectives. A multi-objective decision making process such as the one designed in this project allows the operator to input the objectives and their priorities and receive the calculated optimal compromise as output.
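The abstract gives no equations, but the combination it describes, no-regret action selection plus Bayesian correction of misjudged objective costs, can be sketched roughly. Everything below (action names, cost numbers, and the multiplicative-weights rule standing in for the unspecified no-regret model) is an invented illustration, not the dissertation's actual model:

```python
import math
import random

# Hypothetical sketch: multiplicative weights (a classic no-regret rule) over
# a small action set, with a running posterior-mean update of each action's
# initially misjudged cost. All names and numbers are invented.
ACTIONS = ["avoid_obstacle", "conserve_fuel", "take_shortcut"]
PRIOR_COST = {"avoid_obstacle": 0.2, "conserve_fuel": 0.5, "take_shortcut": 0.1}
TRUE_COST = {"avoid_obstacle": 0.3, "conserve_fuel": 0.4, "take_shortcut": 0.9}

def run(rounds=2000, eta=0.1, seed=0):
    rng = random.Random(seed)
    weights = {a: 1.0 for a in ACTIONS}
    belief = dict(PRIOR_COST)        # current cost estimate per action
    count = {a: 1 for a in ACTIONS}  # pseudo-count representing the prior
    for _ in range(rounds):
        total = sum(weights.values())
        probs = [weights[a] / total for a in ACTIONS]
        a = rng.choices(ACTIONS, probs)[0]
        observed = TRUE_COST[a]      # noiseless observation, for simplicity
        # conjugate-style posterior mean: blend prior belief with observations
        count[a] += 1
        belief[a] += (observed - belief[a]) / count[a]
        # no-regret step: exponentially down-weight by believed cost
        for act in ACTIONS:
            weights[act] *= math.exp(-eta * belief[act])
    return belief, max(ACTIONS, key=lambda act: weights[act])

belief, best = run()
```

Run long enough, the cost beliefs drift from the flawed priors toward the observed costs, and the weight mass settles on the genuinely cheapest action even though the priors initially favored another.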
- Date Issued
- 2008
- Identifier
- CFE0002453, ucf:47711
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002453
- Title
- MULTIOBJECTIVE COORDINATION MODELS FOR MAINTENANCE AND SERVICE PARTS INVENTORY PLANNING AND CONTROL.
- Creator
-
Martinez, Oscar, Geiger, Christopher, University of Central Florida
- Abstract / Description
-
In many equipment-intensive organizations in the manufacturing, service, and particularly the defense sectors, service parts inventories constitute a significant source of tactical and operational costs and consume a significant portion of capital investment. For instance, the Defense Logistics Agency manages about 4 million consumable service parts and provides about 93% of all consumable service parts used by the military services. These items required about US$1.9 billion over the fiscal years 1999-2002. During the same time, the US Government Accountability Office discovered that, in the United States Navy, there were about 3.7 billion ship and submarine parts that were not needed. The Federal Aviation Administration says that 26 million aircraft parts are changed each year. In 2002, the holding cost of service parts for the aviation industry was estimated to be US$50 billion. The US Army Institute of Land Warfare reports that, at the beginning of the 2003 fiscal year, prior to Operation Iraqi Freedom, the aviation service parts alone were in excess of US$1 billion. This situation makes the management of these items a critical tactical and strategic issue that is worthy of further study. The key challenge is to maintain high equipment availability with low service cost (e.g., holding, warehousing, transportation, technicians, overhead, etc.). For instance, despite reporting US$10.5 billion in appropriations spent on purchasing service parts in 2000, the United States Air Force (USAF) continues to report shortages of service parts. The USAF estimates that, if the investment in service parts decreased to about US$5.3 billion, weapons systems availability would range from 73 to 100 percent. Thus, better management of service parts inventories should create opportunities for cost savings through the efficient management of these inventories. Unfortunately, service parts belong to a class of inventory that is continually difficult to manage.
Moreover, it can be said that the general function of service parts inventories is to support maintenance actions; therefore, service parts inventory policies are highly related to the resident maintenance policies. However, the interrelationship between service parts inventory management and maintenance policies is often overlooked, both in practice and in the academic literature, when it comes to optimizing maintenance and service parts inventory policies. Hence, there exists a great divide between maintenance and service parts inventory theory and practice. This research investigation specifically considers the aspect of joint maintenance and service part inventory optimization. We decompose the joint maintenance and service part inventory optimization problem into the supplier's problem and the customer's problem. Long-run expected cost functions for each problem that include the most common maintenance cost parameters and service parts inventory cost parameters are presented. Computational experiments are conducted for a single-supplier two-echelon service parts supply chain configuration varying the number of customers in the network. Lateral transshipments (LTs) of service parts between customers are not allowed. For this configuration, we optimize the cost functions using a traditional, or decoupled, approach, where each supply chain entity optimizes its cost individually, and a joint approach, where the cost objectives of both the supplier and customers are optimized simultaneously. We show that the multiple objective optimization approach outperforms the traditional decoupled optimization approach by generating lower system-wide supply chain network costs. The model formulations are extended by relaxing the assumption of no LTs between customers in the supply chain network. 
Similar to those for the no-LTs configuration, the results for the LTs configuration show that the multiobjective optimization outperforms the decoupled optimization in terms of system-wide cost. Hence, it is economically beneficial to jointly consider all parties within the supply network. Further, we compare the LTs and no-LTs model configurations and show that using LTs improves the overall savings of the system. It is observed that the improvement is mostly derived from reduced shortage costs, since equipment downtime is reduced due to the proximity of the supply. The models and results of this research have significant practical implications, as they can be used to assist decision-makers in determining when and where to pre-position parts inventories to maximize equipment availability. Furthermore, these models can assist in the preparation of the terms of long-term service agreements and maintenance contracts between original equipment manufacturers and their customers (i.e., equipment owners and/or operators), including determining the equitable allocation of all system-wide cost savings under the agreement.
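As a toy illustration of why joint optimization can beat the decoupled approach, consider one supplier and one customer with invented quadratic cost functions (nothing below comes from the dissertation's model):

```python
# Invented illustration: a supplier and a customer each choose a stock level.
# Decoupled optimization lets each entity minimize its own cost sequentially;
# joint optimization minimizes the system-wide sum, which can only do better.

def customer_cost(c):
    return (c - 4.0) ** 2 + 2.0    # e.g. holding vs. downtime trade-off

def supplier_cost(s, c):
    return (s - c) ** 2 + 0.5 * s  # mismatch penalty plus holding cost

levels = [x / 4.0 for x in range(0, 41)]  # candidate stock levels 0.0 .. 10.0

# Decoupled: the customer optimizes first, then the supplier reacts.
c_dec = min(levels, key=customer_cost)
s_dec = min(levels, key=lambda s: supplier_cost(s, c_dec))
decoupled_total = customer_cost(c_dec) + supplier_cost(s_dec, c_dec)

# Joint: minimize the system-wide cost over both decisions at once.
s_joint, c_joint = min(
    ((s, c) for s in levels for c in levels),
    key=lambda sc: customer_cost(sc[1]) + supplier_cost(sc[0], sc[1]),
)
joint_total = customer_cost(c_joint) + supplier_cost(s_joint, c_joint)

assert joint_total <= decoupled_total  # joint coordination never costs more
```

In this toy, the decoupled solution totals 3.9375 versus 3.875 for the joint one: the customer holds slightly more stock than is system-optimal because it ignores the supplier's mismatch cost, which is exactly the kind of system-wide saving the coordination models quantify.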
- Date Issued
- 2008
- Identifier
- CFE0002459, ucf:47723
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002459
- Title
- MESHLESS HEMODYNAMICS MODELING AND EVOLUTIONARY SHAPE OPTIMIZATION OF BYPASS GRAFTS ANASTOMOSES.
- Creator
-
El Zahab, Zaher, Kassab, Alain, University of Central Florida
- Abstract / Description
-
Objectives: The main objective of this dissertation is to establish a formal shape optimization procedure for a given bypass graft end-to-side distal anastomosis (ETSDA). The motivation is that most previous ETSDA shape optimization research cited in the literature relied on direct optimization approaches that do not guarantee accurate optimization results. Three different ETSDA models are considered herein: the conventional, the Miller cuff, and the hood models. Materials and Methods: The ETSDA shape optimization is driven by three computational objects: a localized collocation meshless method (LCMM) solver, an automated geometry pre-processor, and a genetic-algorithm-based optimizer. The LCMM solver makes it convenient to set up an autonomous optimization mechanism for the ETSDA models. The task of the automated pre-processor is to randomly distribute solution points in the ETSDA geometries. The task of the optimizer is to adjust the ETSDA geometries so as to mitigate abnormal hemodynamics parameters. Results: The results reported in this dissertation entail the stabilization and validation of the LCMM solver, in addition to the shape optimization of the considered ETSDA models. The LCMM stabilization results consist of validating a custom-designed upwinding scheme on different one-dimensional and two-dimensional test cases. The LCMM validation is done for incompressible steady and unsteady flow applications in the ETSDA models. The ETSDA shape optimization includes single-objective optimization results in steady flow situations and bi-objective optimization results in pulsatile flow situations. Conclusions: The LCMM solver provides verifiably accurate resolution of hemodynamics and is demonstrated to be third-order accurate in a comparison to a benchmark analytical solution of the Navier-Stokes equations.
The genetic-algorithm-based shape optimization approach proved very effective for the conventional and Miller cuff ETSDA models. The shape optimization results for those two models strongly suggest that the graft caliber should be maximized, whereas the anastomotic angle and the cuff height (in the Miller cuff model) should be chosen as a compromise between the spatial and temporal gradients of the wall shear stress. The shape optimization of the hood ETSDA model did not prove advantageous; however, it could become meaningful with the inclusion of the suture line cut length as an optimization parameter.
- Date Issued
- 2008
- Identifier
- CFE0002165, ucf:47927
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002165
- Title
- Multi-Objective Optimization for Construction Equipment Fleet Selection and Management In Highway Construction Projects Based on Time, Cost, and Quality Objectives.
- Creator
-
Shehadeh, Ali, Tatari, Omer, Al-Deek, Haitham, Abou-Senna, Hatem, Flitsiyan, Elena, University of Central Florida
- Abstract / Description
-
The highway construction sector accounts for approximately 11% of the total construction industry in the US. Construction equipment can be considered one of the primary reasons this industry has reached such a significant level, as it is an essential part of the highway construction process. This research presents a multi-objective optimization mathematical model that quantifies and optimizes the key parameters of excavator, truck, and motor-grader equipment to minimize time and cost objective functions, while maintaining the required level of quality for the targeted construction activity. The mathematical functions for the primary objectives were formulated, and a genetic-algorithm-based multi-objective optimization was then performed in MATLAB to generate the time-cost Pareto trade-offs for all possible equipment combinations. The model's capability to generate optimal time-cost trade-offs, based on optimized equipment number, capacity, and speed, so as to adapt to the complex and dynamic nature of highway construction projects, is demonstrated using a highway construction case study. The developed model is a decision support tool during the construction process for adapting to any necessary changes in time or cost requirements while taking environmental, safety, and quality aspects into consideration. The flexibility and comprehensiveness of the proposed model, along with its programmable nature, make it a powerful tool for managing construction equipment, helping to save time and money within the optimal quality margins. This environmentally friendly decision-support tool also provides optimal solutions that help reduce CO2 emissions, mitigating the ripple effects of the targeted highway construction activities on global warming. The generated optimal solutions offered considerable time and cost savings.
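The time-cost Pareto generation described above can be sketched without MATLAB or a genetic algorithm: for a small fleet catalog, exhaustively scoring every combination and filtering out dominated ones yields the same kind of trade-off curve. All equipment data below are invented:

```python
from itertools import product

def dominates(p, q):
    """p dominates q if it is no worse in both objectives and better in one."""
    return p[0] <= q[0] and p[1] <= q[1] and p != q

# Invented fleet options: (label, production rate in m3/h, hourly cost in $).
excavators = [("exc_A", 80.0, 120.0), ("exc_B", 120.0, 200.0)]
trucks = [("3_trucks", 90.0, 150.0), ("5_trucks", 150.0, 250.0)]

VOLUME = 10_000.0  # m3 of earthwork to move (assumed)

candidates = {}
for (en, er, ec), (tn, tr, tc) in product(excavators, trucks):
    rate = min(er, tr)          # the system moves at the bottleneck rate
    hours = VOLUME / rate       # objective 1: time
    cost = hours * (ec + tc)    # objective 2: cost
    candidates[(en, tn)] = (hours, cost)

# Keep only non-dominated (Pareto-optimal) combinations.
pareto = {
    k: v for k, v in candidates.items()
    if not any(dominates(w, v) for w in candidates.values())
}
```

The surviving combinations are the time-cost trade-offs a planner would choose among; a GA-based solver does the same filtering implicitly while scaling to search spaces too large to enumerate.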
- Date Issued
- 2019
- Identifier
- CFE0007863, ucf:52800
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007863
- Title
- Model Selection via Racing.
- Creator
-
Zhang, Tiantian, Georgiopoulos, Michael, Anagnostopoulos, Georgios, Wu, Annie, Hu, Haiyan, Nickerson, David, University of Central Florida
- Abstract / Description
-
Model Selection (MS) is an important aspect of machine learning, as necessitated by the No Free Lunch theorem. Briefly speaking, the task of MS is to identify a subset of models that are optimal in terms of pre-selected optimization criteria. There are many practical applications of MS, such as model parameter tuning, personalized recommendations, and A/B testing. Lately, some MS research has focused on trading off the exactness of the optimization against the computational burden it entails. Recent attempts along this line include metaheuristic optimization, local search-based approaches, sequential model-based methods, portfolio algorithm approaches, and multi-armed bandits. Racing Algorithms (RAs) are an active research area in MS; they trade off some computational cost for a reduced, but acceptable, likelihood that the models returned are indeed optimal among the given ensemble of models. All existing RAs in the literature are designed as Single-Objective Racing Algorithms (SORAs) for Single-Objective Model Selection (SOMS), where a single optimization criterion is considered for measuring the goodness of models. Moreover, they are offline algorithms in which MS occurs before model deployment, and the selected models are optimal in terms of their overall average performance on a validation set of problem instances. This work investigates racing approaches along two distinct directions: Extreme Model Selection (EMS) and Multi-Objective Model Selection (MOMS). In EMS, given a problem instance and a limited computational budget shared among all the candidate models, one is interested in maximizing the final solution quality. In such a setting, MS occurs during model comparison in terms of maximum performance and involves no model validation. EMS is a natural framework for many applications; however, EMS problems remain unaddressed by current racing approaches.
In this work, the first RA for EMS, named Max-Race, is developed; it optimizes the extreme solution quality by automatically allocating computational resources among an ensemble of problem solvers for a given problem instance. In Max-Race, a significant difference between the extreme performances of any pair of models is statistically inferred via a parametric hypothesis test under the Generalized Pareto Distribution (GPD) assumption. Experimental results confirm that Max-Race is capable of identifying the best extreme model with high accuracy and low computational cost. Furthermore, in machine learning, as well as in many real-world applications, a variety of MS problems are multi-objective in nature. MS that simultaneously considers multiple optimization criteria is referred to as MOMS. Under this scheme, a set of Pareto optimal models is sought that reflects a variety of compromises between the optimization objectives. So far, MOMS problems have received little attention in the relevant literature. Therefore, this work also develops the first Multi-Objective Racing Algorithm (MORA) for a fixed-budget setting, namely S-Race. S-Race addresses MOMS in the proper sense of Pareto optimality. Its key decision mechanism is the non-parametric sign test, which is employed for inferring pairwise dominance relationships. Moreover, S-Race strictly controls the overall probability of falsely eliminating any non-dominated models at a user-specified significance level. Additionally, SPRINT-Race, the first MORA for a fixed-confidence setting, is also developed. In SPRINT-Race, pairwise dominance and non-dominance relationships are established via the Sequential Probability Ratio Test with an Indifference zone. Moreover, the overall probability of falsely eliminating any non-dominated models or mistakenly retaining any dominated models is controlled at a prescribed significance level.
Extensive experimental analysis has demonstrated the efficiency and advantages of both S-Race and SPRINT-Race in MOMS.
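S-Race's decision mechanism, the non-parametric sign test, is simple to state: count per-instance wins between two models and compute an exact binomial tail probability. Below is a minimal single-objective sketch with invented win counts; the real algorithm applies this pairwise, per objective, with control of the overall error probability:

```python
import math

# Sketch of a two-sided binomial sign test, the kind of pairwise decision
# rule S-Race reportedly uses. Ties are assumed to be discarded beforehand.

def sign_test_p(wins_a, wins_b):
    """Exact two-sided sign-test p-value for paired comparisons."""
    n = wins_a + wins_b
    k = max(wins_a, wins_b)
    # P[X >= k] for X ~ Binomial(n, 0.5), doubled for the two-sided test.
    tail = sum(math.comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2.0 * tail)

# Invented example: model A beats model B on 14 of 16 problem instances.
p = sign_test_p(14, 2)
significant = p < 0.05  # reject "no dominance" at the 5% level
```

With 14 wins out of 16 the p-value is about 0.004, so a racing algorithm could eliminate the losing model from the race at the 5% level.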
- Date Issued
- 2016
- Identifier
- CFE0006203, ucf:51094
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006203
- Title
- Global Data Association for Multiple Pedestrian Tracking.
- Creator
-
Dehghan, Afshin, Shah, Mubarak, Qi, GuoJun, Bagci, Ulas, Zhang, Shaojie, Zheng, Qipeng, University of Central Florida
- Abstract / Description
-
Multi-object tracking is one of the fundamental problems in computer vision. Almost all multi-object tracking systems consist of two main components: detection and data association. In the detection step, object hypotheses are generated in each frame of a sequence. Detections that belong to the same target are then linked together to form final trajectories; this latter step is called data association. Several challenges render this problem difficult, such as occlusion, background clutter, and pose changes. This dissertation aims to address these challenges by tackling the data association component of tracking, and it contributes three novel methods for solving data association. Firstly, this dissertation presents a new framework for multi-target tracking that uses a novel data association technique based on the Generalized Maximum Clique Problem (GMCP) formulation. The majority of current methods, such as bipartite matching, incorporate only a limited temporal locality of the sequence into the data association problem. This makes them inherently prone to ID switches and to difficulties caused by long-term occlusions, a cluttered background, and crowded scenes. Our approach, on the other hand, incorporates both motion and appearance in a global manner. Unlike limited-temporal-locality methods, which incorporate only a few frames into the data association problem, this method incorporates the whole temporal span and solves the data association problem for one object at a time. The GMCP formulation is used to solve the optimization problem of our data association method. The proposed method is supported by superior results on several benchmark sequences. GMCP leads us to a more accurate approach to multi-object tracking by considering all the pairwise relationships in a batch of frames; however, it has some limitations. Firstly, it finds target trajectories one by one, missing joint optimization.
Secondly, for optimization we use a greedy solver based on local neighborhood search, making our optimization prone to local minima. Finally, the GMCP tracker is slow, which is a burden when dealing with time-sensitive applications. In order to address these problems, we propose a new graph-theoretic problem, called the Generalized Maximum Multi Clique Problem (GMMCP). The GMMCP tracker has all the advantages of the GMCP tracker while addressing its limitations. A solution is presented to GMMCP in which no simplification is assumed in the problem formulation or optimization. GMMCP is NP-hard, but it can be formulated as a Binary Integer Program whose solution for small- and medium-sized tracking problems can be found efficiently. To improve speed, Aggregated Dummy Nodes are used for modeling occlusions and missed detections. This also reduces the size of the input graph without using any heuristics. We show that, using the speed-up method, our tracker lends itself to a real-time implementation, increasing its potential usefulness in many applications. In tests against several tracking datasets, we show that the proposed method outperforms competitive methods. Thus far we have assumed that the number of people does not exceed a few dozen. However, this is not always the case. In many scenarios, such as marathons, political rallies, or religious rites, the number of people in a frame may reach a few hundred or even a few thousand. Tracking in high-density crowd sequences is a challenging problem for several reasons. Human detection methods often fail to localize objects correctly in extremely crowded scenes, which limits the use of data-association-based tracking methods. Additionally, it is hard to extend existing multi-target trackers to targets in highly crowded scenes, because the large number of targets increases the computational complexity.
Furthermore, the small apparent target size makes it challenging to extract features that discriminate targets from their surroundings. Finally, we present a tracker that addresses the above-mentioned problems. We formulate online crowd tracking as a Binary Quadratic Program in which the detection and data association problems are solved together. Our formulation employs each target's individual information, in the form of appearance and motion, as well as contextual cues in the form of neighborhood motion, spatial proximity, and grouping constraints. Due to the large number of targets, state-of-the-art commercial quadratic programming solvers fail to find the solution to the proposed optimization efficiently. To overcome the computational complexity of available solvers, we propose to use the most recent version of the Modified Frank-Wolfe algorithm with SWAP steps. The proposed tracker can track hundreds of targets efficiently and improves state-of-the-art results by a significant margin on high-density crowd sequences.
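For contrast with the global GMCP/GMMCP formulations above, the bipartite-matching baseline the text criticizes links detections frame to frame by a minimum-cost one-to-one assignment. A brute-force sketch with invented coordinates (a real system would use the Hungarian algorithm and appearance features, not just positional distance):

```python
from itertools import permutations

# Minimal frame-to-frame data association: assign each existing track to the
# detection in the next frame that minimizes total Euclidean distance.
# Brute force over permutations is fine for tiny frames like this toy one.

def associate(prev, curr):
    """Return (track, detection) index pairs of the min-cost assignment."""
    n = len(prev)
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(curr)), n):
        cost = sum(
            ((prev[i][0] - curr[j][0]) ** 2 + (prev[i][1] - curr[j][1]) ** 2) ** 0.5
            for i, j in enumerate(perm)
        )
        if cost < best_cost:
            best, best_cost = perm, cost
    return list(enumerate(best)), best_cost

prev_frame = [(0.0, 0.0), (5.0, 5.0)]  # track heads at time t (invented)
curr_frame = [(5.2, 4.9), (0.3, 0.1)]  # detections at time t+1 (invented)
matches, cost = associate(prev_frame, curr_frame)
```

Because each step sees only two consecutive frames, a single bad frame can swap identities permanently, which is exactly the ID-switch weakness the global batch formulations are designed to avoid.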
- Date Issued
- 2016
- Identifier
- CFE0006095, ucf:51201
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006095
- Title
- Streamflow prediction in ungauged basins located within data-scarce regions.
- Creator
-
Alipour, Mohammadhossein, Kibler, Kelly, Wang, Dingbao, Mayo, Talea, Emrich, Christopher, University of Central Florida
- Abstract / Description
-
Preservation and/or restoration of riverine ecosystems requires quantification of the alterations inflicted by water resources development projects. Long records of streamflow data are the first piece of information required to enable this analysis, and ungauged catchments located within data-scarce regions lack them. In this dissertation, a multi-objective framework named Streamflow Prediction under Extreme Data-scarcity (SPED) is proposed for streamflow prediction in ungauged catchments located within large-scale regions of minimal hydrometeorologic observation. The multi-objective nature of SPED allows runoff efficiency to be balanced against the selection of parameter values that resemble catchment physical characteristics. Uncertain and low-resolution information is incorporated in SPED as soft data, along with sparse observations. SPED's application to two catchments in southwestern China indicates high runoff efficiency for predictions and good estimation of soil moisture capacity in the catchments. SPED is then slightly modified and tested more comprehensively by application to six catchments with diverse hydroclimatic conditions. SPED's performance proves satisfactory where traditional flow prediction approaches fail, and it proves comparable or even superior to data-intensive approaches. The utility of SPED versus a simpler catchment-similarity model for the study of flow regime alteration is then pursued via streamflow prediction in 32 rivers in southwestern China. The results indicate that diversion adversely alters the flow regime of the rivers, while the direction and pattern of change remain the same regardless of the flow prediction method of choice. However, the results based on SPED consistently indicate more substantial alterations to the flow regime of the rivers after diversion.
Finally, the value added by a limited number of streamflow observations to the improvement of predictions in an ungauged catchment located within a data-scarce region is studied. The large number of test scenarios indicates that there may be very few near-universal schemes for improving flow predictions in such catchments.
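The abstract does not state which runoff-efficiency metric SPED balances; the Nash-Sutcliffe efficiency (NSE) is the usual choice in rainfall-runoff calibration, so a generic sketch with invented flows:

```python
# Generic runoff-efficiency sketch: Nash-Sutcliffe efficiency (NSE), assumed
# here as a stand-in for the dissertation's unspecified efficiency objective.

def nse(observed, simulated):
    """NSE: 1.0 is a perfect fit; 0.0 is no better than the mean predictor."""
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var

obs = [10.0, 12.0, 30.0, 22.0, 15.0]  # observed daily flows (invented)
sim = [11.0, 13.0, 27.0, 20.0, 16.0]  # model output (invented)
score = nse(obs, sim)
```

A multi-objective calibration of the kind described would trade a score like this off against a second objective measuring how physically plausible the chosen parameter values are.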
- Date Issued
- 2019
- Identifier
- CFE0007426, ucf:52713
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007426
- Title
- Hybrid Multi-Objective Optimization of Left Ventricular Assist Device Outflow Graft Anastomosis Orientation to Minimize Stroke Rate.
- Creator
-
Lozinski, Blake, Kassab, Alain, Mansy, Hansen, DeCampli, William, University of Central Florida
- Abstract / Description
-
A Left Ventricular Assist Device (LVAD) is a mechanical pump that is utilized as a bridge to transplantation for patients with a heart failure (HF) condition. More recently, LVADs have also been used as destination therapy and have increased the quality of life for patients with HF. However, despite improvements in VAD design and anticoagulation treatment, significant problems remain with VAD therapy, namely drive line infection and thromboembolic events leading to stroke. This thesis focuses on a surgical maneuver to address the second of these issues, guided by previous steady flow hemodynamic studies that have shown the potential of tailoring the VAD outflow graft (VAD-OG) implantation to provide up to a 50% reduction in embolization rates. In the current study, the multi-scale pulsatile hemodynamics of the VAD bed is modeled and integrated in a fully automated multi-objective shape optimization scheme in which the VAD-OG anastomosis along the ascending aorta (AA) is optimized to minimize objective functions that include thromboembolic events to the cerebral vessels and wall shear stress (WSS). The model is driven by time-dependent pressure and flow boundary conditions imposed at the boundaries of the 3D domain through a 50-degree-of-freedom 0D lumped parameter model (LPM). The model includes a time-dependent multi-scale Computational Fluid Dynamics (CFD) analysis of a patient-specific geometry. Blood rheology is modeled using the non-Newtonian Carreau-Yasuda model, while the hemodynamics are those of a laminar, constant-density fluid. The pulsatile hemodynamics are resolved using the commercial CFD solver StarCCM+, while a Lagrangian particle tracking scheme is used to track constant-density particles modeling thrombi released from the cannula, in order to determine thrombus embolization rates.
The results show that cannula anastomosis orientation plays a large role in minimizing the objective function for the patient-derived aortic bed geometry used in this study. The scheme determined that the optimal location of the cannula is 5.5 cm from the aortic root, with a cannula angle of 90 degrees and a coronal angle of 8 degrees along the AA, yielding a peak surface-averaged WSS of 55.97 dyn/cm2 and a stroke percentile of 12.51%. A Pareto front was generated showing a stroke range of 9.7% to 44.08% and a WSS range of 55.97 to 81.47 dyn/cm2 over the 22 implantation configurations of the specific case studied. These results will further assist clinicians in treatment planning when implanting an LVAD.
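The Carreau-Yasuda model named above gives the effective viscosity as mu(g) = mu_inf + (mu_0 - mu_inf) * [1 + (lambda*g)^a]^((n-1)/a), where g is the shear rate. The parameter values below are commonly used Cho-Kensey-style blood constants, assumed here rather than taken from the thesis:

```python
# Carreau-Yasuda blood viscosity sketch. Parameter values are commonly cited
# constants for blood (assumed, not from the thesis), in SI units.
MU_0 = 0.056      # zero-shear viscosity, Pa*s
MU_INF = 0.00345  # infinite-shear viscosity, Pa*s
LAM = 3.313       # relaxation time, s
N = 0.3568        # power-law index
A = 2.0           # Yasuda exponent

def carreau_yasuda(shear_rate):
    """Effective viscosity (Pa*s) as a function of shear rate (1/s)."""
    return MU_INF + (MU_0 - MU_INF) * (
        1.0 + (LAM * shear_rate) ** A
    ) ** ((N - 1.0) / A)

low = carreau_yasuda(0.1)     # near the zero-shear plateau
high = carreau_yasuda(1000.0) # approaches the infinite-shear plateau
```

The shear-thinning behavior this captures matters in an anastomosis because viscosity, and hence wall shear stress, varies strongly across the low-shear recirculation zones the optimization tries to mitigate.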
- Date Issued
- 2019
- Identifier
- CFE0007833, ucf:52827
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007833
- Title
- Life Cycle Sustainability Assessment Framework for the U.S. Built Environment.
- Creator
-
Kucukvar, Murat, Tatari, Mehmet, Oloufa, Amr, Behzadan, Amir, Al-Deek, Haitham, Pazour, Jennifer, University of Central Florida
- Abstract / Description
-
The overall goals of this dissertation are to investigate the sustainability of the built environment, holistically, by assessing its Triple Bottom Line (TBL): environmental, economic, and social impacts, as well as propose cost-effective, socially acceptable, and environmentally benign policies using several decision support models. This research is anticipated to transform life cycle assessment (LCA) of the built environment by using a TBL framework, integrated with economic input-output...
The overall goals of this dissertation are to investigate the sustainability of the built environment holistically, by assessing its Triple Bottom Line (TBL): environmental, economic, and social impacts, and to propose cost-effective, socially acceptable, and environmentally benign policies using several decision support models. This research is anticipated to transform life cycle assessment (LCA) of the built environment by using a TBL framework integrated with economic input-output analysis, simulation, and multi-criteria optimization tools. The major objectives of the outlined research are to (1) build a system-based TBL sustainability assessment framework for the sustainable built environment by (a) advancing a national TBL-LCA model, which is not yet available for the United States, and (b) extending the integrated sustainability framework through environmental, economic, and social sustainability indicators; and (2) develop a system-based analysis toolbox for sustainable decisions, including Monte Carlo simulation and multi-criteria compromise programming. When the total sustainability impacts of each U.S. construction sector are analyzed, "Residential Permanent Single and Multi-Family Structures" and "Other Non-residential Structures" are found to have the highest environmental, economic, and social impacts compared to other construction sectors. The analysis results also show that indirect suppliers of construction sectors have far larger sustainability impacts than on-site activities. For example, for all U.S. construction sectors, on-site construction processes are found to be responsible for less than 5% of total water consumption, whereas about 95% of total water use can be attributed to indirect suppliers. In addition, Scope 3 emissions are responsible for the highest carbon emissions compared to Scopes 1 and 2.
Therefore, narrowly defined system boundaries that ignore supply chain-related impacts can result in underestimation of the TBL sustainability impacts of the U.S. construction industry. Residential buildings have higher shares in most of the sustainability impact categories compared to other construction sectors. The analysis revealed that the construction phase, electricity use, and commuting play an important role in many of the sustainability impact categories. Natural gas and electricity consumption accounted for 72% and 78%, respectively, of the total energy consumed in U.S. residential buildings. Electricity use was also the most dominant component of the environmental impacts, with more than 50% of the greenhouse gases emitted and energy used across all life cycle stages. Furthermore, electricity generation was responsible for 60% of the total water withdrawal of residential buildings, which was even greater than the direct water consumption in residential buildings. In addition, the construction phase had the largest share in the income category, with 60% of the total income generated over a residential building's life cycle. The residential construction sector and its supply chain were responsible for 36% of imports, 40% of the gross operating surplus, and 50% of the gross domestic product. The most sensitive parameters were construction activities and their multipliers in most of the sustainability impact categories. In addition, several emerging pavement types were analyzed using a hybrid TBL-LCA framework. Warm-mix Asphalts (WMAs) did not perform better in terms of environmental impacts compared to Hot-mix Asphalt (HMA). Asphamin® WMA was found to have the highest environmental and socio-economic impacts compared to the other pavement types. The material extraction and processing phase had the highest contribution to all environmental impact indicators, which shows the importance of cleaner production strategies for pavement materials.
Based on stochastic compromise programming results, in a balanced weighting situation Sasobit® WMA had the highest percentage of allocation (61%), while when only socio-economic aspects matter, Asphamin® WMA had the largest share (57%) among the WMA and HMA mixtures. The optimization results also supported the significance of increased WMA use in the United States for sustainable pavement construction. Consequently, the outcomes of this dissertation will advance the state of the art in built environment sustainability research by investigating novel, efficient methodologies capable of offering optimized policy recommendations that take the TBL impacts of the supply chain into account. The results of this research are expected to facilitate better sustainability decisions through the adoption of system-based TBL thinking in the construction field.
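The multi-criteria compromise programming mentioned in this abstract can be sketched as ranking alternatives by their weighted distance to an ideal point. The impact scores and the min-max normalization below are assumptions for illustration only; they are not values or the exact formulation from the dissertation:

```python
def compromise_rank(alternatives, weights, p=2):
    """Rank alternatives by weighted, normalized L_p distance to the ideal point.

    alternatives: dict of name -> list of objective values (all to be minimized).
    weights: one weight per objective (e.g. [0.5, 0.5] for a balanced scenario).
    """
    n = len(weights)
    ideal = [min(v[k] for v in alternatives.values()) for k in range(n)]
    nadir = [max(v[k] for v in alternatives.values()) for k in range(n)]

    def distance(vals):
        # Normalize each objective to [0, 1] using the ideal/nadir range,
        # then take the weighted L_p norm of the deviations.
        return sum(
            (weights[k] * (vals[k] - ideal[k]) / (nadir[k] - ideal[k])) ** p
            for k in range(n)
        ) ** (1 / p)

    return sorted(alternatives, key=lambda name: distance(alternatives[name]))

# Purely illustrative (environmental, socio-economic) impact scores, lower is
# better; the mixture names come from the abstract, the numbers do not.
mixes = {
    "Sasobit WMA": [0.4, 0.5],
    "Asphamin WMA": [0.9, 0.2],
    "HMA": [0.6, 0.6],
}
balanced = compromise_rank(mixes, [0.5, 0.5])
socio_only = compromise_rank(mixes, [0.0, 1.0])
```

With these invented scores, the balanced weighting favors the mixture closest to the ideal on both axes, while the socio-economic-only weighting flips the ranking, mirroring the kind of weight sensitivity the abstract reports.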
- Date Issued
- 2013
- Identifier
- CFE0005018, ucf:50007
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005018
- Title
- IMPROVING AIRLINE SCHEDULE RELIABILITY USING A STRATEGIC MULTI-OBJECTIVE RUNWAY SLOT ASSIGNMENT SEARCH HEURISTIC.
- Creator
-
Hafner, Florian, Sepulveda, Alejandro, University of Central Florida
- Abstract / Description
-
Improving the predictability of airline schedules in the National Airspace System (NAS) has been a constant endeavor, particularly as system delays grow with ever-increasing demand. Airline schedules need to be resistant to perturbations in the system, including Ground Delay Programs (GDPs) and inclement weather. The strategic search heuristic proposed in this dissertation significantly improves airline schedule reliability by assigning airport departure and arrival slots to each flight in the schedule across a network of airports. This is performed using a multi-objective optimization approach that is primarily based on historical flight and taxi times but also includes certain airline, airport, and FAA priorities. The intent of this algorithm is to produce a more reliable, robust schedule that operates in today's environment as well as tomorrow's 4-Dimensional Trajectory Controlled system as described in the FAA's Next Generation ATM system (NextGen). This novel airline schedule optimization approach is implemented using a multi-objective evolutionary algorithm capable of incorporating limited airport capacities. The core of the fitness function is an extensive database of historical operating times for flight and ground operations collected over a two-year period from ASDI and BTS data. Empirical distributions based on these data reflect the probability that flights encounter various flight and taxi times. The fitness function also adds the ability to define priorities for certain flights based on aircraft size, flight time, and airline usage. The algorithm is applied to airline schedules for two primary US airports: Chicago O'Hare and Atlanta Hartsfield-Jackson. The effects of this multi-objective schedule optimization are evaluated in a variety of scenarios, including periods of high, medium, and low demand. The schedules generated by the optimization algorithm were evaluated using a simple queuing simulation model implemented in AnyLogic.
The scenarios were simulated in AnyLogic using two basic setups: (1) using the modes of the flight and taxi time distributions, reflecting highly predictable 4-Dimensional Trajectory Control operations, and (2) using the full distributions of flight and taxi times, reflecting current-day operations. The simulation analysis showed significant improvements in reliability as measured by the mean square difference (MSD) of filed versus simulated flight arrival and departure times. Arrivals showed the most consistent improvements, of up to 80% in on-time performance (OTP). Departures showed smaller overall improvements, particularly when the optimization was performed without consideration of airport capacity. The 4-Dimensional Trajectory Control environment more than doubled the on-time performance of departures over the more chaotic current-day scenarios. This research shows that airline schedule reliability can be significantly improved over a network of airports using historical flight and taxi time data. It also provides a mechanism to prioritize flights based on various airline, airport, and ATC goals. The algorithm is shown to work in today's environment as well as tomorrow's NextGen 4-Dimensional Trajectory Control setup.
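The reliability metric named in this abstract, the mean square difference between filed and simulated event times, is simple to sketch. The flight times below are invented for illustration; the function name and data are assumptions, not the dissertation's implementation:

```python
def mean_square_difference(filed, simulated):
    """Mean square difference (minutes^2) between filed and simulated times.

    Lower values indicate a schedule that better matches what was filed,
    i.e. a more reliable schedule.
    """
    if len(filed) != len(simulated):
        raise ValueError("schedules must align flight-for-flight")
    return sum((f - s) ** 2 for f, s in zip(filed, simulated)) / len(filed)

# Hypothetical arrival times (minutes past the hour) for five flights.
filed = [0, 10, 20, 30, 40]
simulated = [2, 9, 25, 30, 44]
print(mean_square_difference(filed, simulated))  # → 9.2
```

A schedule optimized for reliability would drive this value down across the simulated scenarios.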
- Date Issued
- 2008
- Identifier
- CFE0002067, ucf:47572
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002067