Current Search: Mollaghasemi, Mansooreh
- Title
- Integration of artificial neural networks and simulation modeling in a decision support system.
- Creator
- LeCroy, Kenney, Mollaghasemi, Mansooreh, Engineering
- Abstract / Description
- University of Central Florida College of Engineering Thesis; A simulation-based decision support system is developed for AT&T Microelectronics in Orlando. This system uses simulation modeling to capture the complex nature of semiconductor test operations. Simulation, however, is not an optimization tool by itself: numerous executions of the simulation model must generally be performed to narrow in on a set of proper decision parameters. As a means of alleviating this shortcoming, artificial neural networks are used in conjunction with simulation modeling to aid management in the decision-making process. The integration of simulation and neural networks in a comprehensive decision support system, in effect, learns the reverse of the simulation process. That is, given a set of goals defined for performance measures, the decision support system suggests proper values for decision parameters to achieve those goals.
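The "reverse of the simulation process" idea above can be sketched as follows: run the simulation many times to collect (parameters, performance) pairs, then train a small network on the *inverted* pairs so it maps performance goals back to parameter suggestions. This is a minimal illustration, not the thesis's actual model; the toy `simulate` function, network size, and learning rate are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(params):
    """Toy stand-in for the simulation model: maps decision
    parameters to performance measures (e.g. throughput, cycle time)."""
    x1, x2 = params[..., 0], params[..., 1]
    return np.stack([2.0 * x1 + 0.5 * x2, 1.0 / (0.1 + x2)], axis=-1)

# Generate training pairs by running the simulation many times,
# then learn the REVERSE mapping: performance -> parameters.
P = rng.uniform(0.5, 2.0, size=(500, 2))   # decision parameters
Y = simulate(P)                             # performance measures

# One-hidden-layer network trained with plain gradient descent.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)

def forward(y):
    h = np.tanh(y @ W1 + b1)
    return h, h @ W2 + b2

losses, lr = [], 0.01
for _ in range(2000):
    h, out = forward(Y)
    err = out - P
    losses.append(float((err ** 2).mean()))
    # Backpropagate the squared-error loss.
    dW2 = h.T @ err / len(Y); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = Y.T @ dh / len(Y); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# Given performance goals, the trained network suggests parameters.
goal = np.array([[3.0, 1.0]])
suggested = forward(goal)[1]
```

In the thesis's setting the training data would come from replications of the semiconductor-test simulation rather than a closed-form function.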
- Date Issued
- 1994
- Identifier
- CFR0011935, ucf:53114
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0011935
- Title
- AN APPROACH TO AUTOMATING DATA COLLECTION FOR SIMULATION.
- Creator
- Portnaya, Irin, Mollaghasemi, Mansooreh, University of Central Florida
- Abstract / Description
- In past years many industries have utilized simulation as a means for decision making. That wave has introduced simulation as a powerful optimization and development tool in the manufacturing industry. Input data collection is a significant and complex event in the process of simulation. Simulation professionals have grown to accept it as a strenuous but necessary task. Due to the nature of this task, data collection problems are numerous and vary depending on the situation. These problems may involve time consumption, lack of data, lack of structure, etc. This study concentrates on the challenges of input data collection for Discrete Event Simulation in the manufacturing industry and focuses particularly on speed, efficiency, data completeness, and data accuracy. It has been observed that many companies have recently utilized commercial databases to store production data. This study proposes that the key to faster and more efficient input data collection is to extract data directly from these sources in a flexible and efficient way. An approach is introduced here to creating a custom software tool for a manufacturing setting that allows input data to be collected and formatted quickly and accurately. The methodology for the development of such a custom tool and its implementation, Part Data Collection, are laid out in this research. The Part Data Collection application was developed to assist in the simulation endeavors of Lockheed Martin Missiles and Fire Control in Orlando, Florida. It was implemented and tested as an aid in a large simulation project, which included modeling a new factory. This implementation resulted in a 93% reduction in labor time associated with data collection and significantly improved data accuracy.
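The core idea of pulling simulation inputs straight from a production database can be sketched in a few lines: query the raw records, group them, and reduce them to the summary statistics a simulation model consumes. The schema, table, and column names below are hypothetical stand-ins, not the Part Data Collection tool's actual design.

```python
import sqlite3
import statistics

# Hypothetical schema standing in for a production database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE op_log (part_id TEXT, station TEXT, cycle_min REAL)")
con.executemany(
    "INSERT INTO op_log VALUES (?, ?, ?)",
    [("P1", "mill", 4.2), ("P2", "mill", 3.9), ("P3", "mill", 4.5),
     ("P1", "drill", 1.1), ("P2", "drill", 1.3)],
)

def collect_inputs(con):
    """Pull per-station cycle times directly from the database and
    reduce them to (mean, stdev, n) summaries suitable as input
    distributions for a discrete event simulation model."""
    rows = con.execute(
        "SELECT station, cycle_min FROM op_log ORDER BY station"
    ).fetchall()
    by_station = {}
    for station, t in rows:
        by_station.setdefault(station, []).append(t)
    return {
        s: {"mean": statistics.mean(ts),
            "stdev": statistics.stdev(ts) if len(ts) > 1 else 0.0,
            "n": len(ts)}
        for s, ts in by_station.items()
    }

inputs = collect_inputs(con)
```

Automating this extraction is what removes the manual, strenuous step the abstract describes; the real tool would target the plant's commercial database rather than an in-memory SQLite instance.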
- Date Issued
- 2004
- Identifier
- CFE0000025, ucf:46096
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000025
- Title
- A MULTI-VIEW FRAMEWORK FOR DEFINING THE SERVICES SUPPLY CHAIN USING OBJECT ORIENTED METHODOLOGY.
- Creator
- Barnard, James, Mollaghasemi, Mansooreh, University of Central Florida
- Abstract / Description
- Supply-chain management is a practice combining theory from logistics, operations management, production management, and inventory control. Therefore, it is often associated exclusively with manufacturing or materials management industries. Application of supply-chain management to other industries often results in implementations that do not satisfy the needs of the involved enterprises. To improve the implementation of supply-chain solutions outside of the materials management and manufacturing industries, there is a need for industry-specific standards. One industry sector in need of a standard is the services industry. The current problem facing the services sector is the inability to adapt current frameworks to the provisioning of a service. Provisioning a service translates into the supply chain for the services industry since it influences the services supply and demand. A solution to the problem is the development of a supply-chain standard specific to the provisioning of a service. Objectives of the research are to define comprehensively a new services supply-chain model that is applicable to the United States government classification of a service and to ensure the scalability and integration capability of the model. To satisfy these objectives, it is necessary to understand the characteristics describing the services supply-chain process. The characteristics are the input into deriving the processes and terminology of the generalized services supply chain. Terminology and processes are then used to create a supply-chain framework using input from the Supply-Chain Council's Supply-Chain Operations Reference (SCOR) model. SCOR provides a foundation for describing the processes and defining the terminology in an already accepted format. A final verification of the model by industry experts ensures conceptually that the framework is applicable to the current problem. This research developed a three-level framework similar in structure to the SCOR framework. Presentation of the framework is a specification that defines and sequences the processes for implementation. A detailed case study applies the model using the framework and the definition of a comprehensive supply chain.
- Date Issued
- 2006
- Identifier
- CFE0001485, ucf:47097
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001485
- Title
- IMPROVING PROJECT MANAGEMENT WITH SIMULATION AND COMPLETION DISTRIBUTION FUNCTIONS.
- Creator
- Cates, Grant, Mollaghasemi, Mansooreh, University of Central Florida
- Abstract / Description
- Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. Uncertainty has been identified as a contributing factor in late projects. This uncertainty resides in activity duration estimates, unplanned upsetting events, and the potential unavailability of critical resources. This research developed a comprehensive simulation-based methodology for conducting quantitative project completion-time risk assessments. The methodology enables project stakeholders to visualize uncertainty or risk, i.e., the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used to determine a project's completion distribution function. The project simulation is populated with both deterministic and stochastic elements. Deterministic inputs include planned activities and resource requirements. Stochastic inputs include activity duration growth distributions, probabilities for unplanned upsetting events, and other dynamic constraints upon project activities. Stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Multiple replications of the simulation are run to create the completion distribution function. The methodology was demonstrated to be effective for the ongoing project to assemble the International Space Station. Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. Project stakeholders participated in determining and managing completion distribution functions. The first result was improved project completion risk awareness. Secondly, mitigation options were analyzed to improve project completion performance and reduce total project cost.
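The replication mechanism described above can be illustrated compactly: sample stochastic activity durations, propagate finish times through the precedence network, and repeat many times so the completion times form an empirical distribution function. The activity network and triangular durations below are invented for illustration and are far simpler than the dissertation's Space Station model.

```python
import random

random.seed(1)

# Hypothetical project: activity -> (predecessors, (min, mode, max) duration)
ACTIVITIES = {
    "design":    ([],                       (4, 6, 10)),
    "fabricate": (["design"],               (8, 10, 16)),
    "software":  (["design"],               (6, 9, 15)),
    "integrate": (["fabricate", "software"], (3, 4, 7)),
}

def one_replication():
    """Sample stochastic durations and propagate finish times
    through the precedence network (a single forward pass).
    ACTIVITIES is listed in topological order, so a plain loop works."""
    finish = {}
    for act, (preds, (lo, mode, hi)) in ACTIVITIES.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[act] = start + random.triangular(lo, hi, mode)
    return max(finish.values())

# Many replications build the completion distribution function.
completions = sorted(one_replication() for _ in range(5000))

def prob_late(deadline):
    """Fraction of replications that finish after the deadline."""
    return sum(t > deadline for t in completions) / len(completions)
```

Reading `prob_late` at a stakeholder's target date gives exactly the "likelihood of completing late" the abstract describes, and the sorted `completions` list is the distribution function itself.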
- Date Issued
- 2004
- Identifier
- CFE0000209, ucf:46243
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000209
- Title
- AN AUTOMATED METHODOLOGY FOR A COMPREHENSIVE DEFINITION OF THE SUPPLY CHAIN USING GENERIC ONTOLOGICAL COMPONENTS.
- Creator
- Fayez, Mohamed, Mollaghasemi, Mansooreh, University of Central Florida
- Abstract / Description
- Today, worldwide business communities are in the era of Supply Chains. A Supply Chain is a collection of several independent enterprises that partner together to achieve specific goals. These enterprises may plan, source, produce, deliver, or transport materials to satisfy an immediate or projected market demand, and may provide after-sales support, warranty services, and returns. Each enterprise in the Supply Chain has roles and elements. The roles include supplier, customer, or carrier, and the elements include functional units, processes, information, information resources, materials, objects, decisions, practices, and performance measures. Each enterprise, individually, manages these elements in addition to their flows, their interdependencies, and their complex interactions. Since a Supply Chain brings several enterprises together to complement each other to achieve a unified goal, the elements in each enterprise have to complement each other and have to be managed together as one unit to achieve the unified goal efficiently. Moreover, since there are a large number of elements to be defined and managed in a single enterprise, the number of elements to be defined and managed when considering the whole Supply Chain is massive. The supply chain community is using the Supply Chain Operations Reference model (SCOR model) to define their supply chains. However, the SCOR model methodology is limited in defining the supply chain. The SCOR model defines the supply chain in terms of processes, performance metrics, and best practices. In fact, the supply chain community, SCOR users in particular, exerts massive effort to render an adequate supply chain definition that includes the other elements besides those covered in the SCOR model. Also, the SCOR model is delivered to the user in a document, which puts a tremendous burden on the user to use the model and makes it difficult to share the definition within the enterprise or across the supply chain. This research is directed towards overcoming the limitations and shortcomings of the current supply chain definition methodology. This research proposes a methodology and a tool that enable an automated and comprehensive definition of the Supply Chain at any level of detail. The proposed comprehensive definition methodology captures all the constituent parts of the Supply Chain at four different levels: the supply chain level, the enterprise level, the elements level, and the interaction level. At the supply chain level, the various enterprises that constitute the supply chain are defined. At the enterprise level, the enterprise elements are identified. At the enterprises' elements level, each element in the enterprise is explicitly defined. At the interaction level, the flows, interdependence, and interactions that exist between and within the other three levels are identified and defined. The methodology utilized several modeling techniques to generate generic explicit views and models that represent the four levels. The developed views and models were transformed into a series of questions and answers, where the questions correspond to what a view provides and the answers are the knowledge captured and generated from the view. The questions and answers were integrated to render a generic multi-view of the supply chain. The methodology and the multi-view were implemented in an ontology-based tool. The ontology includes sets of generic supply chain ontological components that represent the supply chain elements and a set of automated procedures that can be utilized to define a specific supply chain. A specific supply chain can be defined by reusing the generic components and customizing them to the supply chain specifics. The ontology-based tool was developed to function in the supply chain's dynamic, information-intensive, geographically dispersed, and heterogeneous environment. To that end, the tool was developed to be generic, sharable, automated, customizable, extensible, and scalable.
- Date Issued
- 2005
- Identifier
- CFE0000399, ucf:46324
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000399
- Title
- A Production and Cost Modeling Methodology of 2nd Generation Biofuel in the United States.
- Creator
- Poole, David, Kincaid, John, Mollaghasemi, Mansooreh, Geiger, Christopher, University of Central Florida
- Abstract / Description
- The use of biofuels in the United States has increased dramatically in the last few years. The largest source of feedstock for ethanol to date has been corn. However, corn is also a vitally important food crop and is used commonly as feed for cattle and other livestock. To prevent further diversion of an important food crop to production of ethanol, there is great interest in developing commercial-scale technologies to make ethanol from non-food crops, or other suitable plant material. This is commonly referred to as biomass. A review is made of lignocellulosic sources being considered as feedstocks to produce ethanol. Current technologies for pretreatment and hydrolysis of the biomass material are examined and discussed. Production data and cost estimates are culled from the literature and used to assist in development of mathematical models for evaluation of production ramp-up profiles and cost estimation. These mathematical models are useful as a planning tool, and provide a methodology to estimate monthly production output and costs for labor, capital, operations and maintenance, feedstock, raw materials, and total cost. Existing credits for ethanol production are also considered and modeled. The production output in liters is modeled as a negative exponential growth curve, with a rate coefficient providing the ability to evaluate slower, or faster, growth in production output and its corresponding effect on monthly cost. The capital and labor costs per unit of product are determined by dividing the monthly debt service and labor costs by that month's production value. The remaining cost components change at a constant rate in the simulation case studies. This methodology is used to calculate production levels and costs as a function of time for a 25 million gallon per year capacity cellulosic ethanol plant. The parameters of interest are calculated in MATLAB with a deterministic, continuous system simulation model. Simulation results for high, medium, and low cost case studies are included. Assumptions for the model and for each case study are included and some comparisons are made to cost estimates in the literature. While the cost per unit of product decreases and production output increases over time, some reasonable cost values are obtained by the end of the second year for both the low and medium cost case studies. By the end of Year 2, total costs for those case studies are $0.48 per liter and $0.88 per liter, respectively. These cost estimates are well within the reported range of values from the reviewed literature sources. Differing assumptions for calculations made by different sources make a direct cost comparison with the outputs of this modeling methodology extremely difficult. Proposals for reducing costs are introduced. Limitations and shortcomings of the research activity are discussed, along with recommendations for potential future work in improving the simulation model and model verification activities. In summary, the author was not able to find evidence, within the public domain, of any similar modeling and simulation methodology that uses a deterministic, continuous simulation model to evaluate production and costs as a function of time. This methodology is also unique in highlighting the important effect of production ramp-up on monthly costs for capital (debt service) and labor. The resultant simulation model can be used for planning purposes and provides an independent, unbiased estimate of cost as a function of time.
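The ramp-up logic the abstract describes, a negative exponential growth curve with fixed monthly costs spread over each month's output, reduces to a few lines. The dissertation used MATLAB; the Python sketch below uses the same structure, and every parameter value (ramp coefficient, debt service, labor, variable cost) is an invented placeholder, not a figure from the study.

```python
import math

# Assumed planning parameters (illustrative only, not from the study).
CAPACITY_L_PER_MONTH = 25e6 * 3.785 / 12   # 25M gal/yr plant, liters/month
RAMP_RATE_K = 0.15                          # per-month ramp coefficient
DEBT_SERVICE = 2.0e6                        # $/month
LABOR = 0.5e6                               # $/month
VARIABLE_COST_PER_L = 0.30                  # feedstock + O&M, $/liter

def monthly_output(month):
    """Negative exponential growth toward nameplate capacity."""
    return CAPACITY_L_PER_MONTH * (1.0 - math.exp(-RAMP_RATE_K * month))

def unit_cost(month):
    """Fixed monthly costs (debt service, labor) are divided by that
    month's output; variable costs are constant per liter."""
    q = monthly_output(month)
    return (DEBT_SERVICE + LABOR) / q + VARIABLE_COST_PER_L

costs = [unit_cost(m) for m in range(1, 25)]   # first two years
```

The falling `costs` series reproduces the qualitative behavior reported in the abstract: per-liter cost is dominated by fixed costs early in the ramp and declines toward the variable-cost floor as output grows.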
- Date Issued
- 2012
- Identifier
- CFE0004424, ucf:49321
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004424
- Title
- Critical Success Factors for Evolutionary Acquisition Implementation.
- Creator
- Bjorn, Brig, Kotnour, Timothy, Karwowski, Waldemar, Mollaghasemi, Mansooreh, Farr, John, University of Central Florida
- Abstract / Description
- Due to extensive challenges to the efficient development and fielding of operationally effective and affordable weapon systems, the U.S. employs a complex management framework to govern defense acquisition programs. The Department of Defense and Congress recently modified this process to improve the levels of knowledge available at key decision points in order to reduce lifecycle cost, schedule, and technical risk to programs. This exploratory research study employed multiple methods to examine the impact of systems engineering reviews, competitive prototyping, and the application of a Modular Open Systems Approach on knowledge and risk prior to funding system implementation and production. In-depth case studies of two recent Major Defense Acquisition Programs were conducted to verify the existence and relationships of the proposed constructs and identify potential barriers to program success introduced by the new process. The case studies included program documentation analysis as well as interviews with contractor personnel holding multiple roles on the program. A questionnaire-based survey of contractor personnel from a larger set of programs was executed to test the case study findings against a larger data set. The study results indicate that while some changes adversely affected program risk levels, the recent modifications to the acquisition process generally had a positive impact on levels of critical knowledge at the key Milestone B decision point. Based on the results of this study, it is recommended that the Government improve its ability to communicate with contractors during competitive phases, particularly with regard to requirements management, and establish verifiable criteria for compliance with the Modular Open Systems Approach. Additionally, the Government should clarify the intent of competitive prototyping and develop a strategy to better manage the inevitable gaps between program phases. Contractors are recommended to present more requirements trade-offs and focus less on prototype development during the Technology Development phases of programs. The results of this study may be used by policy makers to shape future acquisition reforms; by Government personnel to improve the implementation of the current regulations; and by contractors to shape strategies and processes for more effective system development. This research may be used by the Government to improve the execution of acquisition programs under this new paradigm. The defense industrial base can use this research to better understand the impacts of the new process and improve strategic planning processes. The research methodology may be applied to new and different types of programs to assess improvement in the execution process over time.
- Date Issued
- 2012
- Identifier
- CFE0004358, ucf:49442
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004358
- Title
- A Posteriori and Interactive Approaches for Decision-Making with Multiple Stochastic Objectives.
- Creator
- Bakhsh, Ahmed, Geiger, Christopher, Mollaghasemi, Mansooreh, Xanthopoulos, Petros, Wiegand, Rudolf, University of Central Florida
- Abstract / Description
- Computer simulation is a popular method that is often used as a decision support tool in industry to estimate the performance of systems too complex for analytical solutions. It is a tool that assists decision-makers to improve organizational performance and achieve performance objectives, in which simulated conditions can be randomly varied so that critical situations can be investigated without real-world risk. Due to the stochastic nature of many of the input process variables in simulation models, the outputs from the simulation model experiments are random. Thus, experimental runs of computer simulations yield only estimates of the values of performance objectives, where these estimates are themselves random variables. Most real-world decisions involve the simultaneous optimization of multiple, and often conflicting, objectives. Researchers and practitioners use various approaches to solve these multiobjective problems. Many approaches that integrate simulation models with stochastic multiobjective optimization algorithms have been proposed, many of which use Pareto-based approaches that generate a finite set of compromise, or tradeoff, solutions. Nevertheless, identification of the most preferred solution can be a daunting task for the decision-maker and is an order of magnitude harder in the presence of stochastic objectives. However, to the best of this researcher's knowledge, there have been no focused efforts or existing work attempting to reduce the number of tradeoff solutions while considering the stochastic nature of a set of objective functions. In this research, two approaches that consider multiple stochastic objectives when reducing the set of tradeoff solutions are designed and proposed. The first proposed approach is an a posteriori approach, which uses a given set of Pareto optima as input. The second approach is an interactive-based approach that articulates decision-maker preferences during the optimization process. A detailed description of both approaches is given, and computational studies are conducted to evaluate the efficacy of the two approaches. The computational results show the promise of the proposed approaches, in that each approach effectively reduces the set of compromise solutions to a reasonably manageable size for the decision-maker. This is a significant step beyond current applications of the decision-making process in the presence of multiple stochastic objectives and should serve as an effective approach to support decision-making under uncertainty.
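The setting above, stochastic objectives estimated from replications, with a tradeoff set to be pruned, can be made concrete with a basic mean-based Pareto filter: summarize each candidate's replication samples by their means, then discard any candidate dominated by another. This is only the standard dominance step the dissertation builds beyond, not its proposed reduction approaches, and the candidate data are invented.

```python
import statistics

# Hypothetical candidate solutions: replication samples for two
# objectives to be minimized (e.g. cost and cycle time).
candidates = {
    "A": ([10.1, 9.8, 10.3], [5.2, 5.0, 5.1]),
    "B": ([12.0, 11.7, 12.2], [4.0, 4.1, 3.9]),
    "C": ([12.5, 12.8, 12.4], [5.5, 5.6, 5.4]),
}

def mean_point(obj_samples):
    """Collapse each objective's replication samples to its mean."""
    return tuple(statistics.mean(s) for s in obj_samples)

def dominates(p, q):
    """p dominates q (minimization): no worse in every objective
    and strictly better in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

means = {k: mean_point(v) for k, v in candidates.items()}
pareto = [k for k in means
          if not any(dominates(means[j], means[k]) for j in means if j != k)]
```

Because the means are themselves random estimates, a statistically careful reduction (the dissertation's subject) would also account for sampling error, e.g. via confidence intervals, before declaring one solution dominated.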
- Date Issued
- 2013
- Identifier
- CFE0004973, ucf:49574
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004973
- Title
- Shop Scheduling in the Presence of Batching, Sequence-Dependent Setups and Incompatible Job Families Minimizing Earliness and Tardiness Penalties.
- Creator
- Buchanan, Patricia, Geiger, Christopher, Mollaghasemi, Mansooreh, Pazour, Jennifer, Nazzal, Dima, University of Central Florida
- Abstract / Description
- The motivation of this research investigation stems from a particular job shop production environment at a large international communications and information technology company in which electro-mechanical assemblies (EMAs) are produced. The production environment of the EMAs includes the continuous arrivals of the EMAs (generally called jobs), with distinct due dates, degrees of importance, and routing sequences through the production workstations, to the job shop. Jobs are processed in batches at the workstations, and there are incompatible families of jobs, where jobs from different product families cannot be processed together in the same batch. In addition, there are sequence-dependent setups between batches at the workstations. Most importantly, it is imperative that all product deliveries arrive on time to their customers (internal and external) within their respective delivery time windows. Delivery is allowed outside a time window, but at the expense of a penalty. Completing a job and delivering the job before the start of its respective time window results in a penalty, i.e., inventory holding cost. Delivering a job after its respective time window also results in a penalty, i.e., delay cost or emergency shipping cost. This presents a unique scheduling problem where an earliness-tardiness composite objective is considered. This research approaches this scheduling problem by decomposing this complex job shop scheduling environment into bottleneck and non-bottleneck resources, with the primary focus on effectively scheduling the bottleneck resource. Specifically, the problem of scheduling jobs with unique due dates on a single workstation under the conditions of batching, sequence-dependent setups, and incompatible job families, in order to minimize weighted earliness and tardiness, is formulated as an integer linear program. This scheduling problem, even in its simplest form, is NP-hard: no known polynomial-time algorithm solves it to optimality, especially as the number of jobs increases. As a result, the computational time to arrive at optimal solutions is not of practical use in industrial settings, where production scheduling decisions need to be made quickly. Therefore, this research explores and proposes new heuristic algorithms to solve this unique scheduling problem. The heuristics use order review and release strategies in combination with priority dispatching rules, which is a popular and more commonly-used class of scheduling algorithms in real-world industrial settings. A computational study is conducted to assess the quality of the solutions generated by the proposed heuristics. The computational results show that, in general, the proposed heuristics produce solutions that are competitive to the optimal solutions, yet in a fraction of the time. The results also show that the proposed heuristics are superior in quality to a set of benchmark algorithms within this same class of heuristics.
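The cost structure described above, delivery windows with linear earliness and tardiness penalties, batches of a single family, and sequence-dependent setups between families, can be evaluated directly for any candidate batch sequence. The job data, processing time, and setup matrix below are hypothetical, and this evaluator is only the objective function a dispatching heuristic would score candidates with, not one of the dissertation's heuristics.

```python
# Hypothetical job data: id -> (family, window_start, window_end, weight)
jobs = {
    1: ("F1", 10, 14, 2.0),
    2: ("F1", 12, 16, 1.0),
    3: ("F2",  8, 11, 3.0),
}
PROC_TIME = 4                                # per-batch processing time (assumed constant)
SETUP = {("F1", "F2"): 2, ("F2", "F1"): 3}   # sequence-dependent setups

def et_penalty(completion, start, end, weight):
    """Weighted earliness/tardiness: zero inside the delivery
    window, linear penalty outside it."""
    if completion < start:
        return weight * (start - completion)   # holding cost
    if completion > end:
        return weight * (completion - end)     # delay/expedite cost
    return 0.0

def schedule_cost(batch_sequence):
    """batch_sequence is a list of (family, [job_ids]); all jobs in
    a batch share its completion time, and incompatible families
    never share a batch."""
    t, prev_family, cost = 0, None, 0.0
    for family, members in batch_sequence:
        if prev_family is not None and prev_family != family:
            t += SETUP[(prev_family, family)]
        t += PROC_TIME
        for j in members:
            _, ws, we, w = jobs[j]
            cost += et_penalty(t, ws, we, w)
        prev_family = family
    return cost

cost = schedule_cost([("F2", [3]), ("F1", [1, 2])])
```

A priority dispatching rule would repeatedly pick the next family batch that minimizes this kind of incremental cost, which is what makes the heuristic class fast enough for real-time shop decisions.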
- Date Issued
- 2014
- Identifier
- CFE0005139, ucf:50717
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005139
- Title
- A Holistic Framework for Transitional Management.
- Creator
- Elattar, Ahmed, Rabelo, Luis, Pazour, Jennifer, Mollaghasemi, Mansooreh, Ajayi, Richard, University of Central Florida
- Abstract / Description
-
For all business organizations, there comes a time when a change must take place within their eco-system. It consumes a great deal of thought and planning to ensure that the right decision is made as it could alter the entire course of their business for a number of years to come. This change may appear in the form of a brilliant CEO reaching the age of retirement, or an unsuccessful Managing Director being asked to leave before fulfilling the term of her contract. Regardless of the cause, a...
Show moreFor all business organizations, there comes a time when a change must take place within their eco-system. It consumes a great deal of thought and planning to ensure that the right decision is made as it could alter the entire course of their business for a number of years to come. This change may appear in the form of a brilliant CEO reaching the age of retirement, or an unsuccessful Managing Director being asked to leave before fulfilling the term of her contract. Regardless of the cause, a transition must occur in which a suitable successor is chosen and put into place while minimizing costs, satisfying stakeholders, ensuring that the successor has been adequately prepared for their new position, and minimizing work place gossip, among other things. It is also important to understand how the nature of the business, as well as its financial standing, effects such a transition.Engineering and management principles come together in this study to ensure that organizations going through such a change are on the right course. As the problem of transitional management is not one of concrete values and contains many ambiguous concepts, one way to tackle the problem is by utilizing various industrial engineering methodologies that allow these companies to systematically begin preparing for such a change. By default, organizational strategy has to change, technology is continually being renewed and it becomes very hard for the same leader to constantly implement new and innovative developments.Organizations today have a very poor understanding of where they currently stand and as a result the cause for a company's lack of profitability is often overlooked with time and money being wasted in an attempt to fix something that is not broken. 
To look at the bigger picture of an organization and from there begin to close in on the main problems causing a negative impact, the Matrix of Change is used; it takes in many factors to lay out an accurate representation of the direction in which an organization should be headed and how it can continue to grow and remain successful. The Theory of Constraints, on the other hand, is used here as a step-by-step guide allowing companies to be better organized during times of change. System Dynamics modeling is where these companies can begin to simulate and solve the dilemma of transitional management using causal loop diagrams and stock and flow diagrams. Through such tools a framework can begin to be developed, one that is valued by corporations and continually reviewed. Several case studies, simulation modeling, and a panel of experts were used to demonstrate and validate this framework.
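The stock-and-flow style of System Dynamics modeling mentioned in the abstract can be sketched with a few lines of Euler integration. The two stocks below (successor readiness and the incumbent-successor knowledge gap) and all rate constants are illustrative assumptions, not elements of the dissertation's actual model:

```python
# Minimal stock-and-flow sketch of a leadership-transition model.
# Stocks, flows, and parameter values are hypothetical placeholders.

def simulate_transition(steps=24, dt=1.0):
    readiness = 0.0          # stock: successor readiness (0..1)
    knowledge_gap = 1.0      # stock: gap between incumbent and successor
    history = []
    for _ in range(steps):
        mentoring_rate = 0.15 * knowledge_gap   # flow: mentoring closes the gap
        attrition_rate = 0.02 * readiness       # flow: unused skills decay
        knowledge_gap += (-mentoring_rate) * dt
        readiness += (mentoring_rate - attrition_rate) * dt
        history.append(readiness)
    return history

traj = simulate_transition()
```

In a full SD model each causal loop would contribute additional flows of this form; the point of the sketch is only the accumulation structure (stocks updated by rate-driven flows each time step).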
- Date Issued
- 2014
- Identifier
- CFE0005160, ucf:50708
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005160
- Title
- A Framework for Quantifying Sustainability of Lean Implementation in Healthcare Organizations.
- Creator
-
Bahaitham, Haitham, Elshennawy, Ahmad, Mollaghasemi, Mansooreh, Lee, Gene, Uddin, Nizam, Furterer, Sandra, University of Central Florida
- Abstract / Description
-
Due to the remarkable positive effect of lean adoption in various firms in the manufacturing sector, lean has been adopted by several organizations within the healthcare industry. Although the rate of lean adoption by hospitals in developed countries is slower than it should be, lean has proved effective in helping healthcare organizations maintain or even improve their quality of care while containing the related costs. However, such adoption did not take place until the beginning of the new millennium, and it has been accompanied by major challenges related to proper lean implementation, sustainability of achieved levels of performance, and staff engagement in ongoing cycles of continuous improvement towards perfection. Thus, the purpose of this study is to develop a framework that helps healthcare organizations quantify their experience with lean. Such quantification is obtained by measuring the agreement level of hospital staff members about the degree of adoption of two sets of critical factors of successful lean implementation within their hospital. These two sets of factors are classified as process factors and organizational factors. The proposed framework has been validated by determining the sustainability level of lean implementation within a U.S. hospital in the State of Florida. The developed framework provides a balanced assessment of both process and organizational factors essential for achieving sustainable levels of lean implementation. To accommodate the observed variation in lean adoption among hospitals, individual hospital departments are considered the "analysis units" of the developed framework. To quantify the implementation status of lean within a hospital department, a survey-based lean sustainability assessment tool has been developed based on the defined sets of factors.
The sustainability level of lean implementation of a hospital can be obtained by combining the responses of its surveyed departments. The developed framework is the first to address both process and organizational factors of sustainable lean implementation in a balanced manner while fulfilling the assessment needs of all healthcare organizations regardless of their current level of lean adoption. In addition, utilizing the framework within a hospital enhances employee involvement and respect for employees, both essential for sustainable lean implementation. Finally, the developed framework provides healthcare supervising authorities (i.e., ministries of health or corporate offices of hospital groups) a macro-level benchmarking view of the progress of their hospitals towards implementing sustainable levels of lean.
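The roll-up of department survey responses into a hospital-level score could look like the following sketch. The department names, item counts, and equal-weight averaging are assumptions for illustration; the dissertation defines its own factor sets and aggregation:

```python
# Hypothetical aggregation of department-level Likert responses (1-5)
# into hospital-level scores for the two factor sets.

def department_score(responses):
    """Mean agreement level across a department's survey items."""
    return sum(responses) / len(responses)

def hospital_score(departments):
    """Average department scores separately for process and organizational factors."""
    process = [department_score(d["process"]) for d in departments.values()]
    org = [department_score(d["organizational"]) for d in departments.values()]
    return {
        "process": sum(process) / len(process),
        "organizational": sum(org) / len(org),
    }

# Toy input: two departments, three items per factor set.
departments = {
    "ED":  {"process": [4, 5, 3], "organizational": [4, 4, 4]},
    "ICU": {"process": [3, 3, 4], "organizational": [5, 4, 3]},
}
scores = hospital_score(departments)
```

Keeping the two factor-set scores separate preserves the framework's balanced view; collapsing them into a single number would hide whether weakness lies in process practices or organizational support.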
- Date Issued
- 2011
- Identifier
- CFE0004086, ucf:49140
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004086
- Title
- Design of the layout of a manufacturing facility with a closed loop conveyor with shortcuts using queueing theory and genetic algorithms.
- Creator
-
Lasrado, Vernet, Nazzal, Dima, Mollaghasemi, Mansooreh, Reilly, Charles, Garibay, Ivan, Sivo, Stephen, Armacost, Robert, University of Central Florida
- Abstract / Description
-
With the ongoing technology battles and price wars in today's competitive economy, every company is looking for an advantage over its peers. A particular choice of facility layout can have a significant impact on the ability of a company to maintain lower operational expenses under uncertain economic conditions. It is known that systems with less congestion have lower operational costs. Traditionally, manufacturing facility layout methods aim at minimizing the total distance traveled, the material handling cost, or the time in the system (based on distance traveled at a specific speed). The proposed methodology solves the layout design problem for a looped manufacturing facility served by a closed-loop conveyor material handling system with shortcuts, using a system performance metric, namely the work in process (WIP) on the conveyor and at the input stations to the conveyor, as a factor in the objective function of the facility layout optimization problem, which is solved heuristically using a permutation genetic algorithm. The proposed methodology also makes the case for determining the shortcut locations across the conveyor simultaneously (while determining the layout of the stations around the loop), in contrast to the traditional method, which determines the shortcuts sequentially (after the layout of the stations has been determined). The proposed methodology additionally presents an analytical estimate of the work in process at the input stations to the closed-loop conveyor. It is contended that the proposed methodology (using the WIP as a factor in the objective function while simultaneously solving for the shortcuts) will yield a facility layout that is less congested than one generated by the traditional methods (using the total distance traveled as a factor in the objective function while sequentially solving for the shortcuts).
The proposed methodology is tested on a virtual 300mm semiconductor wafer fabrication facility with a closed-loop conveyor material handling system with shortcuts. The results show that the facility layouts generated by the proposed methodology have significantly less congestion than those generated by traditional methods. Validation of the developed analytical estimate of the work in process at the input stations reveals that the proposed methodology works extremely well for systems with Markovian Arrival Processes.
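A permutation genetic algorithm of the kind the abstract describes can be illustrated on a toy loop of four stations. The adjacent-flow cost below is only a stand-in for the dissertation's queueing-based WIP estimate, and the flow matrix and GA parameters are invented:

```python
import random

# Toy permutation GA: order 4 stations around a loop to keep heavy
# material flows between nearby positions. FLOW is a made-up symmetric
# from-to flow matrix; wip_proxy stands in for the WIP objective.

FLOW = [[0, 5, 1, 2], [5, 0, 4, 1], [1, 4, 0, 3], [2, 1, 3, 0]]

def wip_proxy(perm):
    n = len(perm)
    cost = 0
    for i in range(n):
        for j in range(n):
            dist = min(abs(i - j), n - abs(i - j))   # loop distance
            cost += FLOW[perm[i]][perm[j]] * dist
    return cost

def evolve(pop_size=30, gens=50, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(range(4), 4) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=wip_proxy)
        survivors = pop[: pop_size // 2]       # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            a, b = rng.sample(range(4), 2)     # swap mutation keeps it a permutation
            child[a], child[b] = child[b], child[a]
            children.append(child)
        pop = survivors + children
    return min(pop, key=wip_proxy)

best = evolve()
```

In the dissertation's setting the chromosome would also encode shortcut locations, so layout and shortcuts are optimized simultaneously rather than sequentially.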
- Date Issued
- 2011
- Identifier
- CFE0004125, ucf:49088
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004125
- Title
- Successful Organizational Change: Aligning Change Type with Methods.
- Creator
-
Al-Haddad, Serina, Kotnour, Timothy, Mollaghasemi, Mansooreh, Hoekstra, Robert, Diaz, Rey, University of Central Florida
- Abstract / Description
-
The motivation behind this research is the prevalence of challenges and ambiguity associated with successful organizational change and the numerous available approaches for dealing with them. Many definitions and methods have been suggested to manage change; however, organizations still report a high failure rate for their change initiatives. These high failure rates highlight the continuing need for research and investigation, and imply the lack of a valid framework for managing successful organizational change. This dissertation critically reviews the concept of having one change approach as the "silver bullet". In pursuit of this goal, this research contributes a roadmap to the change management literature and provides definitions for describing change types, change methods, and change outcomes. This dissertation also develops a conceptual model that proposes relationships and connections among change types, change methods, and change outcomes that are assumed to enable successful change. To validate the conceptual model, two hypotheses were developed and a self-administered survey was created and administered on paper and online. The respondents were professionals involved in change projects in the Central Florida region. The unit of analysis in this research was a completed change project. Respondents were asked to complete the survey for two different projects: a successful project and an unsuccessful project. Statistical processes were applied to verify the conceptual model and test the research hypotheses. Based on the data collected, exploratory factor analysis was used to verify the validity and reliability of the conceptual model measures. Results of the hypothesis testing revealed relationships between the complexity of the change type and the use of change methods that significantly relate to successful change.
The results also revealed that the alignment of the change type and change methods significantly relates to successful change. From the viewpoint of change project managers, the results of this dissertation confirm that the complexity of the change project type negatively correlates with change success and that increased use of change methods positively correlates with change success. The results also confirm that the methods most highly correlated with change success address the following: (a) the situation that needs changing, (b) the proper implementation of change, (c) the establishment of suitable plans and controls to sustain change, and (d) the presence of a credible team leader who influences the major decisions during the change project.
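The correlational claims above can be pictured with a plain Pearson correlation between method use and a success rating across projects. The data values are synthetic, made up purely to demonstrate the computation, and bear no relation to the dissertation's survey results:

```python
from math import sqrt

# Pearson correlation between extent of change-method use and a
# change-success rating, on synthetic demonstration data.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

method_use = [1, 2, 3, 4, 5, 6]   # hypothetical extent of method use per project
success    = [2, 3, 3, 5, 6, 6]   # hypothetical success rating per project
r = pearson(method_use, success)  # positive r mirrors the reported direction
```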
- Date Issued
- 2014
- Identifier
- CFE0005121, ucf:50691
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005121
- Title
- Improved Multi-Task Learning Based on Local Rademacher Analysis.
- Creator
-
Yousefi, Niloofar, Mollaghasemi, Mansooreh, Rabelo, Luis, Zheng, Qipeng, Anagnostopoulos, Georgios, Xanthopoulos, Petros, Georgiopoulos, Michael, University of Central Florida
- Abstract / Description
-
Considering a single prediction task at a time is the most common paradigm in machine learning practice. This methodology, however, ignores the potentially relevant information that might be available in other related tasks in the same domain. This becomes even more critical when the lack of a sufficient amount of data in an individual prediction task leads to deteriorated generalization performance. In such cases, learning multiple related tasks together might offer better performance by allowing tasks to leverage information from each other. Multi-Task Learning (MTL) is a machine learning framework which learns multiple related tasks simultaneously to overcome the data scarcity limitations of Single Task Learning (STL) and thereby improve performance. Although MTL has been actively investigated by the machine learning community, only a few studies examine the theoretical justification of this learning framework, and their focus is on providing learning guarantees in the form of generalization error bounds. The study of generalization bounds is considered an important problem in machine learning and, more specifically, in statistical learning theory. This importance is twofold: (1) generalization bounds provide an upper-tail confidence interval for the true risk of a learning algorithm, which cannot be calculated precisely because it depends on some unknown distribution P from which the data are drawn; (2) such bounds can also be employed as model selection tools, leading to more accurate learning models. Generalization error bounds are typically expressed in terms of the empirical risk of the learning hypothesis along with a complexity measure of that hypothesis.
Although different complexity measures can be used in deriving error bounds, Rademacher complexity has received considerable attention in recent years due to its superiority to other complexity measures: it can potentially lead to tighter error bounds than those obtained with other measures. However, one shortcoming of the general notion of Rademacher complexity is that it provides a global complexity estimate of the learning hypothesis space, which does not take into consideration the fact that learning algorithms, by design, select functions belonging to a more favorable subset of this space and therefore yield better-performing models than the worst case. To overcome the limitation of global Rademacher complexity, a more nuanced notion, the so-called local Rademacher complexity, has been considered; it leads to sharper learning bounds and, as such, compared to its global counterpart, guarantees faster convergence rates in terms of the number of samples. Also, since locally derived bounds are expected to be tighter than globally derived ones, they can motivate better (more accurate) model selection algorithms. While previous MTL studies provide generalization bounds based on other complexity measures, in this dissertation we prove excess risk bounds for some popular kernel-based MTL hypothesis spaces based on the Local Rademacher Complexity (LRC) of those hypotheses. We show that these local bounds have faster convergence rates than the previous Global Rademacher Complexity (GRC)-based bounds. We then use our LRC-based MTL bounds to design a new kernel-based MTL model, which enjoys strong learning guarantees. Moreover, we develop an optimization algorithm to solve our new MTL formulation.
Finally, we run simulations on experimental data that compare our MTL model to some classical Multi-Task Multiple Kernel Learning (MT-MKL) models designed based on the GRCs. Since the local Rademacher complexities are expected to be tighter than the global ones, our new model is also expected to exhibit better performance compared to the GRC-based models.
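The global notion of Rademacher complexity the abstract contrasts against can be estimated by Monte Carlo for a finite function class: draw random signs, take the supremum of the signed empirical average, and repeat. The two toy hypotheses below are invented for illustration; the dissertation's local variant would further restrict the supremum to low-variance functions:

```python
import random

# Monte Carlo estimate of the empirical (global) Rademacher complexity
# of a finite function class:
#   R_hat = E_sigma[ max_f (1/n) * sum_i sigma_i * f(x_i) ]
# where sigma_i are independent +/-1 signs.

def empirical_rademacher(function_outputs, trials=2000, seed=0):
    # function_outputs: one output vector per hypothesis, evaluated on the sample
    rng = random.Random(seed)
    n = len(function_outputs[0])
    total = 0.0
    for _ in range(trials):
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        total += max(sum(s * o for s, o in zip(sigma, outs)) / n
                     for outs in function_outputs)
    return total / trials

# Two toy +/-1-valued hypotheses on a 4-point sample (hypothetical):
outs = [[1, 1, -1, -1], [1, -1, 1, -1]]
r_hat = empirical_rademacher(outs)
```

For this pair of hypotheses the exact value works out to 0.25, so the estimate concentrates near that; richer (or more correlated-with-noise) classes drive the value up, which is exactly why it appears as a capacity term in generalization bounds.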
- Date Issued
- 2017
- Identifier
- CFE0006827, ucf:51778
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006827
- Title
- Total Ownership Cost Modeling of Technology Adoption Using System Dynamics: Implications for ERP Systems.
- Creator
-
Esmaeilian, Behzad, Karwowski, Waldemar, Mollaghasemi, Mansooreh, Xanthopoulos, Petros, Ahram, Tareq, Kincaid, John, University of Central Florida
- Abstract / Description
-
Investment in new technologies is seen by firms as a way to improve their productivity, their product and service quality, and their competitive advantage in the global market. Unfortunately, not all technology adoption projects have met their intended objectives. The complexity of technology adoption, along with little consideration of the long-term cost of the technology, is among the factors that challenge companies adopting a new technology. Companies often make technology adoption decisions without enough attention to the total cost of the technology over its lifecycle, and poor decision making while adopting a new technology can result in substantial recurring losses. Therefore, estimating the total cost of the technology is an important step in justifying its adoption. Total Ownership Cost (TOC) is a widely accepted financial metric for studying the costs associated with a new technology throughout its lifecycle. TOC helps companies analyze not only the acquisition and procurement cost of the technology but also the other cost components occurring over the technology's usage and service stage. The point is that technology adoption cost estimation is a complex process involving various aspects such as maintenance cost, technology upgrade cost, and human-resource-related cost. Assessing the association between the technology's characteristics (technology upgrades over its life cycle, compatibility with other systems, technology life span, etc.) and the TOC involves a high degree of complexity. The complexity exists because many factors affect the cost over time; decisions made today can have a long-lasting impact on system costs, and there is a lag between the time a decision is taken and when its outcomes occur.
An original contribution of this dissertation is the development of a System Dynamics (SD) model to estimate the TOC associated with new technology adoption. The SD model creates causal linkages among various aspects of the technology adoption process and allows decision makers to explore the impact of their decisions on the total cost that the technology brings into the company. The SD model presented in this dissertation is composed of seven sub-models: (1) technology implementation efforts, (2) workforce training, (3) technology-related workforce hiring, (4) preventive and corrective maintenance, (5) technology upgrade, (6) impact of technology on system performance, and (7) total ownership cost. A case study of Enterprise Resource Planning (ERP) system adoption is used to show the application of the SD model. The results show that maintenance, upgrade, and workforce hiring costs are among the major cost components in the ERP adoption case study presented in Chapter 4. The simulation model developed in this dissertation supports trade-off analysis and provides a tool for evaluating technology scenarios. It can be extended to provide a basis for a decision support system for technology evaluation.
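A back-of-envelope version of the TOC accumulation in the cost sub-model might look like the following. The cost categories echo the sub-models listed above, but every rate and dollar figure is a placeholder assumption, not a value from the dissertation's ERP case study:

```python
# Simplistic year-by-year TOC accumulation for a technology adoption,
# in the spirit of (not equivalent to) the SD model's cost sub-model.

def total_ownership_cost(years=5):
    acquisition = 500_000                  # assumed one-time license/hardware cost
    costs = {"acquisition": float(acquisition), "maintenance": 0.0,
             "upgrade": 0.0, "training": 0.0, "hiring": 0.0}
    for year in range(1, years + 1):
        costs["maintenance"] += 0.18 * acquisition          # annual support contract
        costs["training"] += 20_000 * (0.5 ** (year - 1))   # training is front-loaded
        costs["hiring"] += 15_000                           # steady specialist hiring
        if year % 3 == 0:
            costs["upgrade"] += 100_000                     # periodic major upgrade
    return costs, sum(costs.values())

breakdown, toc = total_ownership_cost()
```

Even this static sketch shows the dissertation's point that recurring categories (here, maintenance) can rival the acquisition price over a five-year horizon; the SD model adds the feedback and delays that a spreadsheet like this cannot capture.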
- Date Issued
- 2013
- Identifier
- CFE0004836, ucf:49686
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004836
- Title
- A Retrospective Analysis and Field Study of Nanotechnology Related Ergonomic Risk in Industries Utilizing Nanomaterials.
- Creator
-
Greaves-Holmes, Wanda, Mccauley Bush, Pamela, Mollaghasemi, Mansooreh, Sala-Diakanda, Serge, Raghavan, Seetha, Ahram, Tareq, University of Central Florida
- Abstract / Description
-
The National Science Foundation estimates that two million skilled nanotechnology workers will be needed worldwide by 2015, one million of them in the United States (2001). In the absence of scientific clarity about the potential health effects of occupational exposure to nanoparticles, guidance in decision making about hazards, risk, and controls takes on new importance. Currently, guiding principles on personal protective equipment (PPE) for workers who come in contact with nanomaterials have not been standardized universally. Utilizing the NASA-TLX, this dissertation investigates the adequacy and shortcomings of research efforts that seek to determine whether occupational exposure to nanomaterials while wearing PPE is potentially frustrating to the worker, that is, whether the worker perceives additional mental, physical, or temporal demand, effort, performance pressure, or frustration during task performance while wearing PPE, or is not impacted.
- Date Issued
- 2012
- Identifier
- CFE0004497, ucf:49267
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004497