Current Search: Armacost, Robert (x)
- Title
- DECISION-MAKER TRADE-OFFS IN MULTIPLE RESPONSE SURFACE OPTIMIZATION.
- Creator
-
Hawkins, Alicia, Armacost, Robert, University of Central Florida
- Abstract / Description
-
The focus of this dissertation is on improving decision-maker trade-offs and the development of a new constrained methodology for multiple response surface optimization. There are three key components of the research: development of the necessary conditions and assumptions associated with constrained multiple response surface optimization methodologies; development of a new constrained multiple response surface methodology; and demonstration of the new method. The necessary conditions for and assumptions associated with constrained multiple response surface optimization methods were identified and found to be less restrictive than requirements previously described in the literature. The conditions and assumptions required for a constrained method to find the most preferred non-dominated solution are to generate non-dominated solutions and to generate solutions consistent with decision-maker preferences among the response objectives. Additionally, if a Lagrangian constrained method is used, the preservation of convexity is required in order to be able to generate all non-dominated solutions. The conditions required for constrained methods are significantly fewer than those required for combined methods. Most of the existing constrained methodologies do not incorporate any provision for a decision-maker to explicitly determine the relative importance of the multiple objectives. Research into the larger area of multi-criteria decision-making identified the interactive surrogate worth trade-off (ISWT) algorithm as a potential methodology that would provide that capability in multiple response surface optimization problems. The ISWT algorithm uses an ε-constraint formulation to guarantee a non-dominated solution, and then interacts with the decision-maker after each iteration to determine the decision-maker's preference in trading off the value of the primary response for an increase in the value of a secondary response.
The current research modified the ISWT algorithm to develop a new constrained multiple response surface methodology that explicitly accounts for decision-maker preferences. The new Modified ISWT (MISWT) method maintains the essence of the original method while taking advantage of the specific properties of multiple response surface problems to simplify the application of the method. The MISWT is an accessible computer-based implementation of the ISWT. Five test problems from the multiple response surface optimization literature were used to demonstrate the new methodology. It was shown that this methodology can handle a variety of types and numbers of responses and independent variables. Furthermore, it was demonstrated that the methodology can be successful using a priori information from the decision-maker about bounds or targets, or can use the extreme values obtained from the region of operability. In all cases, the methodology explicitly considered decision-maker preferences and provided non-dominated solutions. The contribution of this method is that it removes implicit assumptions and includes the decision-maker in explicit trade-offs among multiple objectives or responses.
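The ε-constraint step described above can be illustrated on a toy two-response example: maximize a primary response while holding a secondary response above a floor ε, which yields a non-dominated point by construction. The quadratic response surfaces, bounds, and grid-search solver below are invented for illustration only; they are not the formulations or test problems used in the dissertation.

```python
import numpy as np

def y_primary(x):
    """Primary response, to be maximized (hypothetical quadratic)."""
    return -(x - 2.0) ** 2 + 10.0

def y_secondary(x):
    """Secondary response, held above a floor epsilon."""
    return -(x - 4.0) ** 2 + 10.0

def solve_eps_constraint(eps):
    """Maximize y_primary subject to y_secondary >= eps (grid search)."""
    grid = np.linspace(0.0, 6.0, 60001)          # region of operability
    feasible = grid[y_secondary(grid) >= eps]    # epsilon-constraint filter
    x_star = feasible[np.argmax(y_primary(feasible))]
    return x_star, y_primary(x_star), y_secondary(x_star)

# Raising eps trades primary-response value for secondary-response value,
# which is exactly the trade-off the decision-maker is asked to judge.
x_star, y1, y2 = solve_eps_constraint(eps=8.0)
```

In the full ISWT procedure this solve is repeated, with the decision-maker's surrogate worth assessment steering how ε moves between iterations.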
- Date Issued
- 2007
- Identifier
- CFE0001765, ucf:47272
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001765
- Title
- STOCHASTIC RESOURCE CONSTRAINED PROJECT SCHEDULING WITH STOCHASTIC TASK INSERTION PROBLEMS.
- Creator
-
Archer, Sandra, Armacost, Robert, University of Central Florida
- Abstract / Description
-
The area of focus for this research is the Stochastic Resource Constrained Project Scheduling Problem (SRCPSP) with Stochastic Task Insertion (STI). The STI problem is a specific form of the SRCPSP, which may be considered a cross between two types of problems in the general form: the Stochastic Project Scheduling Problem and the Resource Constrained Project Scheduling Problem. The stochastic nature of this problem lies in the occurrence/non-occurrence of tasks with deterministic duration. Researchers Selim (2002) and Grey (2007) laid the groundwork for research on this problem. Selim (2002) developed a set of robustness metrics and used these to evaluate two initial baseline (predictive) scheduling techniques, optimistic (0% buffer) and pessimistic (100% buffer), where none or all of the stochastic tasks were scheduled, respectively. Grey (2007) expanded the research by developing a new partial buffering strategy for the initial baseline predictive schedule for this problem and found the partial buffering strategy to be superior to Selim's "extreme" buffering approach. The current research continues this work by focusing on resource aspects of the problem, new buffering approaches, and a new rescheduling method. If resource usage is important to project managers, then a set of metrics describing changes in resource flow between the initial baseline predictive schedule and the final "as-run" schedule would be important to measure. Two new sets of resource metrics were constructed regarding resource utilization and resource flow. Using these new metrics, as well as the Selim/Grey metrics, a new buffering approach was developed that used resource information to size the buffers. The resource-sized buffers did not show significant improvement over Grey's 50% buffer, which was used as a benchmark. The new resource metrics were used to validate that the 50% buffering strategy is superior to Selim's 0% and 100% buffering.
Recognizing that partial buffers appear to be the most promising initial baseline development approach for STI problems, and understanding that experienced project managers may be able to predict stochastic probabilities based on prior projects, the next phase of the research developed a new set of buffering strategies in which the inserted buffers are proportional to the probability of occurrence. The results of this proportional buffering strategy were very positive: the majority of the metrics (both robustness and resource), except for the stability metrics, improved when the proportional buffer was used. Finally, it was recognized that all research thus far on the SRCPSP with STI had focused solely on the development of predictive schedules. Therefore, the final phase of this research developed a new reactive strategy that tested three different rescheduling points during schedule eventuation at which a complete rescheduling of the latter portion of the schedule would occur. The results of this new reactive technique indicate that rescheduling improves schedule performance in only a few metrics and under very specific network characteristics (those networks with the least restrictive parameters). This research was conducted with extensive use of Base SAS v9.2 combined with SAS/OR procedures to solve project networks, solve resource flow problems, and implement reactive scheduling heuristics. Additionally, Base SAS code was paired with Visual Basic for Applications in Excel 2003 to implement an automated Gantt chart generator that provided visual inspection for validation of the repair heuristics. The results of this research, combined with the results of Selim and Grey, provide strong guidance for project managers on how to develop baseline predictive schedules and how to reschedule the project as stochastic tasks (e.g., unplanned work) do or do not occur.
Specifically, the results and recommendations are provided in a summary tabular format that describes the recommended initial baseline development approach when a project manager has a good idea of the level and location of the stochasticity in the network, highlights two cases where rescheduling during schedule eventuation may be beneficial, and shows the cases where buffering proportional to the probability of occurrence is recommended, not recommended, or inconclusive.
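The proportional buffering strategy above can be made concrete in a few lines, assuming the simplest possible rule: each identified stochastic task receives buffer time equal to its probability of occurrence times its deterministic duration. The task list and probabilities below are invented for illustration, not data from the research.

```python
def proportional_buffer(duration, p_occur):
    """Buffer time reserved for one stochastic task: p_occur * duration."""
    return p_occur * duration

# (task name, deterministic duration, probability of insertion) - all invented
tasks = [
    ("inspect", 4.0, 0.25),
    ("rework",  8.0, 0.50),
    ("retest",  2.0, 1.00),
]
buffers = {name: proportional_buffer(d, p) for name, d, p in tasks}

# Total reserved time falls between the optimistic (0% -> 0.0) and
# pessimistic (100% -> 14.0) extremes evaluated by Selim.
total_buffer = sum(buffers.values())
```

A task known to occur (probability 1.0) is buffered in full, while unlikely tasks reserve only a fraction of their duration, which is the intuition behind the improved robustness and resource metrics reported above.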
- Date Issued
- 2008
- Identifier
- CFE0002491, ucf:47673
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002491
- Title
- BUFFER TECHNIQUES FOR STOCHASTIC RESOURCE CONSTRAINED PROJECT SCHEDULING WITH STOCHASTIC TASK INSERTIONS PROBLEMS.
- Creator
-
Grey, Jennifer, Armacost, Robert, University of Central Florida
- Abstract / Description
-
Project managers are faced with the challenging task of managing an environment filled with uncertainties that may lead to multiple disruptions during project execution. In particular, they are frequently confronted with planning for routine and non-routine unplanned work: known, identified tasks that may or may not occur depending upon various, often unpredictable, factors. This problem is known as the stochastic task insertion problem, where tasks of deterministic duration occur stochastically. Traditionally, project managers may include an extra margin within deterministic task times, or an extra time buffer may be allotted at the end of the project schedule to protect the final project completion milestone. Little scientific guidance is available on how to better integrate buffers strategically into the project schedule. Motivated by the Critical Chain and Buffer Management approach of Goldratt, this research identifies, defines, and demonstrates new buffer sizing techniques to improve the project duration and stability metrics associated with the stochastic resource constrained project scheduling problem with stochastic task insertions. Specifically, this research defines and compares partial buffer sizing strategies for projects with varying levels of resource and network complexity factors as well as varying levels and locations of the stochastically occurring tasks. Several project metrics may be impacted by the stochastic occurrence or non-occurrence of a task, such as the project makespan and the project stability. New duration and stability metrics are developed in this research and are used to evaluate the effectiveness of the proposed buffer sizing techniques. These "robustness measures" are computed by comparing the characteristics of the initial schedule (termed the infeasible base schedule), a modified base schedule (or as-run schedule), and an optimized version of the base schedule (or perfect knowledge schedule).
Seven new buffer sizing techniques are introduced in this research. Three are based on a fixed percentage of task duration and the remaining four provide variable buffer sizes based upon the location of the stochastic task in the schedule and knowledge of the task stochasticity characteristic. Experimental analysis shows that partial buffering produces improvements in the project stability and duration metrics when compared to other baseline scheduling approaches. Three of the new partial buffering techniques produced improvements in project metrics. One of these partial buffers was based on a fixed percentage of task duration and the other two used a variable buffer size based on knowledge of the location of the task in the project network. This research provides project schedulers with new partial buffering techniques and recommendations for the type of partial buffering technique that should be utilized when project duration and stability performance improvements are desired. When a project scheduler can identify potential unplanned work and where it might occur, the use of these partial buffer techniques will yield a better estimated makespan. Furthermore, it will result in less disruption to the planned schedule and minimize the amount of time that specific tasks will have to move to accommodate the unplanned tasks.
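The fixed-percentage family of buffers described above can be sketched on the simplest case, a serial chain of tasks, where the baseline makespan is the planned work plus a percentage of each identified stochastic task's duration. The durations and the 0%/50%/100% comparison below are illustrative stand-ins, not data or results from the study.

```python
def baseline_makespan(planned, stochastic, buffer_pct):
    """Serial-chain baseline makespan: planned durations plus buffer_pct
    of each identified stochastic task's duration inserted as buffer."""
    return sum(planned) + buffer_pct * sum(stochastic)

planned = [5.0, 3.0, 7.0]    # tasks that always occur (invented durations)
stochastic = [4.0, 2.0]      # identified tasks that may be inserted

optimistic  = baseline_makespan(planned, stochastic, 0.0)   # no buffer
partial     = baseline_makespan(planned, stochastic, 0.5)   # partial buffer
pessimistic = baseline_makespan(planned, stochastic, 1.0)   # full buffer
```

The partial baseline lands between the two extremes; the variable techniques in the research refine this by sizing each buffer from the task's location in the network and its stochasticity characteristic rather than a single fixed percentage.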
- Date Issued
- 2007
- Identifier
- CFE0001584, ucf:52850
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001584
- Title
- A Dynamic Enrollment Simulation Model for Planning and Decision-Making in a University.
- Creator
-
Robledo, Luis, Sepulveda, Jose, Kincaid, John, Armacost, Robert, Archer, Sandra, University of Central Florida
- Abstract / Description
-
Decision support systems for university management have seen limited improvement in the incorporation of new cutting-edge techniques. For the last few decades, most decision-makers have based their decisions on traditional forecasting methods in order to maintain financially affordable programs and keep universities competitive. Strategic planning for universities has always been tied to enrollment revenues and operational expenses. Enrollment models in use today represent forecasts based on historical data, considering the usual variables such as student headcount and student credit, among others. No consideration is given to students' preferences. Retention models, associated with enrollment, deal with average retention times and likewise leave out preferences. Preferences play a major role at institutions where students are not required to declare their intentions (major) immediately. Even if they do, they may change it if they find another, more attractive major, or they may even decide to leave college for external reasons. Enrollment models have been identified as serving three main purposes: prediction of income from tuition (in-state, out-of-state), planning of future courses and curriculum, and allocation of resources to academic departments. This general perspective does not provide useful information to faculty and Departments for detailed planning and allocation of resources for the next term or year. New metrics are needed to help faculty and Departments reach a detailed and useful level of planning in order to allocate these resources effectively. The dynamics in the rate of growth, the preferences students have for certain majors at a specific point in time, or economic hardship make a difference when decisions have to be made about budget requests, hiring of faculty, classroom assignment, parking, transportation, or even building new facilities.
Existing models do not differentiate among these variables. This simulation model is a hybrid that combines System Dynamics, discrete-event, and agent-based simulation, which allows representation of the general enrollment process at the University level (strategic decisions) and of enrollment, retention, and major selection at the College (tactical decisions) and Department (operational decisions) levels. This approach allows lower levels to more accurately predict the number of students retained for the next term or year, while allowing upper levels to decide on new students to admit (first time in college and transfers), and it results in recommendations on faculty hiring, class or lab assignment, and resource allocation. This model merges both high- and low-level student enrollment models into one application, allowing not only representation of the current overall enrollment but also prediction at the College and Department levels. This provides information on optimal classroom assignments and on faculty and student resource allocation.
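One way to picture the preference-aware retention step described above is a term-by-term transition matrix: students move between majors, remain undeclared, or depart each term. The states, transition rates, and headcounts below are entirely invented for illustration; the dissertation's hybrid model is far richer than this single matrix step.

```python
import numpy as np

# States: [major A, major B, undeclared, departed]; each row sums to 1.
P = np.array([
    [0.85, 0.05, 0.00, 0.10],   # major A students mostly persist
    [0.04, 0.86, 0.00, 0.10],   # major B students mostly persist
    [0.30, 0.25, 0.30, 0.15],   # undeclared students pick a major or leave
    [0.00, 0.00, 0.00, 1.00],   # departed is absorbing
])

enrolled = np.array([400.0, 300.0, 200.0, 0.0])   # current-term headcounts
next_term = enrolled @ P                          # projected next-term mix
```

Projections like this, disaggregated by major, are what let a Department anticipate its own next-term headcount instead of relying only on a university-wide forecast.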
- Date Issued
- 2013
- Identifier
- CFE0005055, ucf:49970
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005055
- Title
- Design of the layout of a manufacturing facility with a closed loop conveyor with shortcuts using queueing theory and genetic algorithms.
- Creator
-
Lasrado, Vernet, Nazzal, Dima, Mollaghasemi, Mansooreh, Reilly, Charles, Garibay, Ivan, Sivo, Stephen, Armacost, Robert, University of Central Florida
- Abstract / Description
-
With the ongoing technology battles and price wars in today's competitive economy, every company is looking for an advantage over its peers. A particular choice of facility layout can have a significant impact on the ability of a company to maintain lower operational expenses under uncertain economic conditions. It is known that systems with less congestion have lower operational costs. Traditionally, manufacturing facility layout methods aim at minimizing the total distance traveled, the material handling cost, or the time in the system (based on distance traveled at a specific speed). The proposed methodology solves the looped layout design problem for a manufacturing facility served by a looped conveyor material handling system with shortcuts, using a system performance metric, the work in process (WIP) on the conveyor and at the input stations to the conveyor, as a factor in the minimizing function for the facility layout optimization problem, which is solved heuristically using a permutation genetic algorithm. The proposed methodology also presents the case for determining the shortcut locations across the conveyor simultaneously (while determining the layout of the stations around the loop), versus the traditional method, which determines the shortcuts sequentially (after the layout of the stations has been determined). The proposed methodology also presents an analytical estimate for the work in process at the input stations to the closed-loop conveyor. It is contended that the proposed methodology (using the WIP as a factor in the minimizing function for the facility layout while simultaneously solving for the shortcuts) will yield a facility layout that is less congested than a facility layout generated by the traditional methods (using the total distance traveled as a factor in the minimizing function while sequentially solving for the shortcuts).
The proposed methodology is tested on a virtual 300mm Semiconductor Wafer Fabrication Facility with a looped conveyor material handling system with shortcuts. The results show that the facility layouts generated by the proposed methodology have significantly less congestion than facility layouts generated by traditional methods. The validation of the developed analytical estimate of the work in process at the input stations reveals that the proposed methodology works extremely well for systems with Markovian Arrival Processes.
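The permutation encoding behind the layout search can be illustrated on a tiny instance: a candidate layout is an ordering of stations around the loop, and a one-way conveyor makes travel distance direction-dependent, captured by the (j - i) mod n term. The 4-station flow matrix, the distance-based cost, and the brute-force enumeration below are invented stand-ins for the dissertation's WIP-based objective and genetic algorithm.

```python
import itertools

n = 4
# flow[a][b]: loads moved from station a to station b (invented values)
flow = [[0, 8, 2, 1],
        [1, 0, 6, 2],
        [2, 1, 0, 5],
        [6, 1, 1, 0]]

def cost(order):
    """Total load-distance on a one-way loop for a station ordering."""
    pos = {s: i for i, s in enumerate(order)}       # station -> loop slot
    return sum(flow[a][b] * ((pos[b] - pos[a]) % n) # clockwise slot count
               for a in range(n) for b in range(n) if a != b)

# 4 stations are small enough to enumerate; a GA searches this same
# permutation space when n makes enumeration infeasible.
best = min(itertools.permutations(range(n)), key=cost)
```

Swapping the distance-based cost for a WIP estimate, and extending the chromosome to encode shortcut locations as well, gives the flavor of the simultaneous approach argued for above.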
- Date Issued
- 2011
- Identifier
- CFE0004125, ucf:49088
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004125
- Title
- Defining a Stakeholder-Relative Model to Measure Academic Department Efficiency at Achieving Quality in Higher Education.
- Creator
-
Robinson, Federica, Sepulveda, Jose, Reilly, Charles, Nazzal, Dima, Armacost, Robert, Feldheim, Mary, University of Central Florida
- Abstract / Description
-
In a time of strained resources and dynamic environments, the importance of effective and efficient systems is critical. This dissertation was developed to address the need to use feedback from multiple stakeholder groups to define quality and assess an entity's efficiency at achieving such quality. A decision support model with applicability to diverse domains was introduced to outline the approach. Three phases, (1) quality model development, (2) input-output selection, and (3) relative efficiency assessment, capture the essence of the process and delineate the approach per tool applied. This decision support model was adapted in higher education to assess academic departmental efficiency at achieving stakeholder-relative quality. Phase 1 was accomplished through a three-round, Delphi-like study that involved user-group refinement. Those results were compared to the criteria of an engineering accreditation body (ABET) to support the model's validity for capturing quality in the College of Engineering & Computer Science, its departments, and its programs. In Phase 2, the Analytic Hierarchy Process (AHP) was applied to the validated model to quantify the perspectives of students, administrators, faculty, and employers (SAFE). Using the composite preferences of the collective group (n=74), the model was limited to the top 7 attributes, which accounted for about 55% of total preferences. Data corresponding to the resulting variables, referred to as key performance indicators, were collected from various information sources and infused into the data envelopment analysis (DEA) methodology (Phase 3). This process revealed both efficient and inefficient departments while offering transparency about opportunities to maximize quality outputs. Findings validate the potential of the Delphi-like, analytic hierarchy, data envelopment analysis approach for administrative decision-making in higher education.
However, more meaningful metrics and data are required to adapt the model for decision-making purposes. Several recommendations were included to improve the usability of the decision support model, and future research opportunities were identified to extend the inherent analyses and to apply the model to other areas.
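The AHP weighting step of Phase 2 can be sketched with the standard geometric-mean approximation to the principal eigenvector of a pairwise comparison matrix. The 3x3 matrix below is invented for illustration; it is not survey data from the study, which ranked many more attributes across the SAFE stakeholder groups.

```python
import math

# Pairwise comparisons on the Saaty 1-9 scale (invented):
# A[i][j] = how much more important attribute i is than attribute j.
A = [[1.0,     3.0,     5.0],
     [1 / 3.0, 1.0,     3.0],
     [1 / 5.0, 1 / 3.0, 1.0]]

def ahp_weights(A):
    """Geometric-mean approximation of the AHP priority vector."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in A]  # row geo-means
    total = sum(gm)
    return [g / total for g in gm]                          # normalize to 1

w = ahp_weights(A)   # priority weights, one per attribute
```

In the dissertation's pipeline, weights like these pick out the key performance indicators whose data then feed the DEA efficiency assessment in Phase 3.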
- Date Issued
- 2013
- Identifier
- CFE0004921, ucf:49636
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004921