Current Search: Thompson, William
- Title
- A Framework for Measuring the Value-Added of Knowledge Processes with Analysis of Process Interactions and Dynamics.
- Creator
-
Cintron, Jose, Rabelo, Luis, Elshennawy, Ahmad, Thompson, William, Ajayi, Richard, University of Central Florida
- Abstract / Description
-
The best-known and most widely used methods use cash flows and tangible assets to measure the impact of investments on an organization's outputs. In the last decade, however, many newer organizations whose outputs depend heavily on information technology have come to treat knowledge as their main asset. These organizations' market values lie in the knowledge of their employees and their technological capabilities. In the current technology-based business landscape, the value added by the assets used to generate outputs cannot be appropriately measured and managed without considering the role that intangible assets and knowledge play in executing processes. The analysis of processes for comparison and decision making based on intangible value added can be accomplished using the knowledge required to execute those processes. Measuring the value added by knowledge can provide a more realistic framework for analyzing processes where traditional cost methods are not appropriate, enabling managers to better allocate and control knowledge-based processes. Further consideration of the interactions and complexity between proposed process alternatives can yield answers about where and when investments can improve value added while dynamically providing higher returns on investment.
- Date Issued
- 2013
- Identifier
- CFE0004983, ucf:49585
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004983
- Title
- The Relationship Between DNA's Physical Properties and the DNA Molecule's Harmonic Signature, and Related Motion in Water--A Computational Investigation.
- Creator
-
Boyer, Victor, Proctor, Michael, Thompson, William, Karwowski, Waldemar, Calloway, Richard, University of Central Florida
- Abstract / Description
-
This research investigates through computational methods whether the physical properties of DNA contribute to its harmonic signature, the uniqueness of that signature if present, and the motion of the DNA molecule in water. When DNA is solvated in water at normal 'room temperature', it experiences a natural vibration due to the Brownian motion of the water particles colliding with the DNA. The null hypothesis is that there is no evidence to suggest a relationship between DNA's motion and strand length, while the alternative hypothesis is that there is evidence to suggest a relationship between DNA's vibrational motion and strand length. In a similar vein, a second hypothesis posits that DNA's vibrational motion may depend on strand content. The nature of this relationship, whether linear, exponential, logarithmic, or non-continuous, is not hypothesized by this research but will be discovered by testing whether there is evidence to suggest a relationship between DNA's motion and strand length. The research also aims to discover whether the motion of DNA, when it varies by strand length and/or content, is sufficiently unique to allow that DNA to be identified, in a manner similar to a signature, in the absence of foreknowledge of the type of DNA present. If there is evidence of uniqueness in DNA's vibrational motion under varying strand content or length, additional experimentation will be needed to determine whether these variances are unique across small changes as well as large changes, or large changes only. Finally, the question of whether it might be possible to identify a strand of unique DNA by base-pair configuration solely from its vibrational signature, or, if not, whether it might be possible to identify changes existing inside a known DNA strand (such as a corruption, transposition, or mutational error), is explored.

Given the computational approach of this research, the NAMD simulation package (released by the Theoretical and Computational Biophysics Group at the University of Illinois at Urbana-Champaign) with the CHARMM force field would be the most appropriate set of tools for this investigation (Phillips et al., 2005) and will therefore be the toolset used in this research. For visualization and manipulation of model data, the VMD (Visual Molecular Dynamics) package will be employed. Further, these tools are optimized for and/or aware of nucleic acid structures, and they are free. These tools appear sufficient for this task, offering validated simulation fidelity to provide vibrational and pressure-profile data for analysis; sufficient capabilities to do what is asked of them; speed, so that runs can be completed in a reasonable period of time (weeks versus months); and parallelizability, so that they can be run over a clustered network of computers dedicated to the task to increase the speed and capacity of the simulations. The computer cluster enabled analysis of 30,000- to 40,000-atom systems, consuming more than 410,000 CPU hours across experimental runs of hundreds of nanoseconds' duration, each sampled 500,000 times with two-femtosecond "frames." Using Fourier transforms of run pressure readings into frequencies, the simulation investigation could not reject the null hypotheses that the frequencies observed in the system runs are independent of the DNA strand length or content being studied.

To be clear, frequency variations were present in the in silico replications of the DNA in ionized solutions, but we were unable to conclude that those variations were not due to other system factors. Several tests were employed to identify alternative factors that could have caused these variations. Chief among them is the possibility that the water box itself is the source of a large amount of vibrational noise, making it difficult or impossible, with the tools at our disposal, to isolate any signals emitted by the DNA strands. Assuming the water box itself was a source of large amounts of vibrational noise, an emergent hypothesis was generated and additional post-hoc testing was undertaken to attempt to isolate and then filter the water-box noise from the rest of the system frequencies. We found conclusively that the water box is responsible for the majority of the signals being recorded, resulting in very low signal amplitudes from the DNA molecules themselves. Given these low signal amplitudes emitted by the DNA, we could not conclusively associate either DNA length or content with the remaining observed frequencies. A brief look at a possible future isolation technique, wavelet analysis, was conducted. Finally, because these results are dependent on the tools at our disposal and hence by no means conclusive, suggestions for future research to expand on and further test these hypotheses are made in the final chapter.
(See the illustrative sketch after this record.)
- Date Issued
- 2015
- Identifier
- CFE0005930, ucf:50835
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005930
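The abstract above describes converting pressure readings from molecular dynamics runs into frequencies with Fourier transforms. Below is a minimal, hypothetical sketch of that kind of analysis in Python with NumPy; the pressure trace, the synthetic signal, and the function names are assumptions for illustration and are not taken from the dissertation.

```python
# Hypothetical sketch (not the dissertation's code): turning a pressure trace
# sampled every 2 fs into a one-sided frequency spectrum via an FFT.
import numpy as np

FRAME_DT_S = 2e-15          # two-femtosecond sampling interval (from the abstract)

def pressure_spectrum(pressure_samples):
    """Return (frequencies in Hz, spectral amplitudes) for a pressure trace."""
    samples = np.asarray(pressure_samples, dtype=float)
    samples -= samples.mean()                     # remove the DC component
    amplitudes = np.abs(np.fft.rfft(samples))     # one-sided FFT magnitude
    freqs = np.fft.rfftfreq(samples.size, d=FRAME_DT_S)
    return freqs, amplitudes

if __name__ == "__main__":
    # Synthetic stand-in for ~500,000 pressure readings from a run,
    # with a weak 1 THz tone buried in noise.
    rng = np.random.default_rng(0)
    t = np.arange(500_000) * FRAME_DT_S
    trace = rng.normal(size=500_000) + 0.05 * np.sin(2 * np.pi * 1e12 * t)
    freqs, amps = pressure_spectrum(trace)
    print("dominant frequency (Hz):", freqs[np.argmax(amps[1:]) + 1])
```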
- Title
- A Simulation-Based Evaluation Of Efficiency Strategies For A Primary Care Clinic With Unscheduled Visits.
- Creator
-
Bobbie, Afrifah, Karwowski, Waldemar, Thompson, William, Elshennawy, Ahmad, Mikusinski, Piotr, University of Central Florida
- Abstract / Description
-
In the health care industry, there are strategies, called efficiency strategies, to remove inefficiencies from the health delivery process. This dissertation proposed a simulation model to evaluate the impact of efficiency strategies on a primary care clinic with unscheduled "walk-in" patient visits. The simulation model captures the complex characteristics of the Orlando Veterans Affairs Medical Center (VAMC) primary care clinic. This clinic system includes different types of patients, patient paths, and the multiple resources that serve them. Added to the problem's complexity is the presence of patient no-show characteristics and unscheduled patient arrivals, a problem which has, until recently, been largely neglected. The main objectives of this research were to develop a model that captures the complexities of the Orlando VAMC, evaluate alternative scenarios for working in unscheduled patient visits, and examine the impact of patient flow, appointment scheduling, and capacity management decisions on the performance of the primary care clinic system. The main results show that only a joint policy of appointment scheduling rules and patient flow decisions has a significant impact on the wait time of scheduled patients. It is recommended that in the future the clinic address the problem of serving additional walk-in patients from an integrated scheduling and patient flow viewpoint.
(See the illustrative sketch after this record.)
- Date Issued
- 2016
- Identifier
- CFE0006443, ucf:51462
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006443
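The abstract above describes a simulation model of a clinic serving both scheduled and unscheduled walk-in patients with shared resources. The sketch below is a toy discrete-event simulation in Python using SimPy, not the dissertation's model; the arrival rates, service times, and provider capacity are invented assumptions meant only to show how scheduled and walk-in streams can compete for the same resource pool while wait times are tracked.

```python
# Hypothetical sketch: scheduled and walk-in patients sharing one provider pool.
import random
import simpy

N_PROVIDERS = 3
SIM_MINUTES = 8 * 60                  # one clinic day
waits = {"scheduled": [], "walk-in": []}

def patient(env, kind, providers, service_mean):
    arrival = env.now
    with providers.request() as req:  # wait for a free provider
        yield req
        waits[kind].append(env.now - arrival)
        yield env.timeout(random.expovariate(1.0 / service_mean))

def scheduled_arrivals(env, providers):
    while True:
        yield env.timeout(15)         # one booked appointment every 15 min
        env.process(patient(env, "scheduled", providers, service_mean=20))

def walkin_arrivals(env, providers):
    while True:
        yield env.timeout(random.expovariate(1.0 / 30))   # ~2 walk-ins/hour
        env.process(patient(env, "walk-in", providers, service_mean=20))

random.seed(42)
env = simpy.Environment()
providers = simpy.Resource(env, capacity=N_PROVIDERS)
env.process(scheduled_arrivals(env, providers))
env.process(walkin_arrivals(env, providers))
env.run(until=SIM_MINUTES)
for kind, w in waits.items():
    print(kind, "mean wait (min):", round(sum(w) / len(w), 1) if w else "n/a")
```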
- Title
- Investigating The Relationship Between Adverse Events and Infrastructure Development in an Active War Theater Using Soft Computing Techniques.
- Creator
-
Cakit, Erman, Karwowski, Waldemar, Lee, Gene, Thompson, William, Mikusinski, Piotr, University of Central Florida
- Abstract / Description
-
The military recently recognized the importance of taking sociocultural factors into consideration. Therefore, Human Social Culture Behavior (HSCB) modeling has been receiving much attention in current and future operational requirements for successfully understanding the effects of social and cultural factors on human behavior. There are different kinds of modeling approaches to the data used in this field, and so far none of them has been widely accepted. HSCB modeling needs the capability to represent complex, ill-defined, and imprecise concepts, and soft computing modeling can deal with these concepts. There is currently no study on the use of any computational methodology for representing the relationship between adverse events and infrastructure development investments in an active war theater. This study investigates the relationship between adverse events and infrastructure development projects in an active war theater using soft computing techniques, including fuzzy inference systems (FIS), artificial neural networks (ANNs), and adaptive neuro-fuzzy inference systems (ANFIS), which benefit from their accuracy in prediction applications. Fourteen developmental and economic improvement project types were selected based on allocated budget values; the number of projects in different time periods, urban and rural population density, and the total number of adverse events in the previous month were selected as independent variables. A total of four outputs were estimated, reflecting adverse events in terms of the number of people killed, wounded, and hijacked, and the total number of adverse events. For each model, the data were grouped for training and testing as follows: the years 2004 through 2009 for training and the year 2010 for testing. Ninety-six different models were developed and investigated for Afghanistan, and the country was divided into seven regions for analysis purposes. The performance of each model was investigated and compared to all other models using the calculated mean absolute error (MAE) values and the prediction accuracy within a ±1 error range (the difference between actual and predicted values). Furthermore, sensitivity analysis was performed to determine the effects of input values on the dependent variables and to rank the top ten input parameters in order of importance.

According to the results obtained, it was concluded that ANNs, FIS, and ANFIS are useful modeling techniques for predicting the number of adverse events based on historical development or economic project data. When model accuracy was calculated based on the MAE for each of the models, the ANN had better predictive accuracy than the FIS and ANFIS models in general, as demonstrated by the experimental results. The percentage of predictions within the ±1 error range was around 90%. The sensitivity analysis results show that the importance of economic development projects varies with the region, population density, and occurrence of adverse events in Afghanistan. For the purpose of allocating resources and developing regions, the results can be summarized by examining the relationship between adverse events and infrastructure development in an active war theater; the emphasis was on predicting the occurrence of events and assessing the potential impact of regional infrastructure development efforts on reducing the number of such events.
(See the illustrative sketch after this record.)
- Date Issued
- 2013
- Identifier
- CFE0004826, ucf:49757
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004826
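The abstract above compares models by mean absolute error (MAE) and by the share of predictions falling within a ±1 error range. A minimal sketch of those two metrics in Python follows; the arrays of actual and predicted adverse-event counts are made-up placeholders, not data from the study.

```python
# Hypothetical sketch of the two comparison metrics named in the abstract.
import numpy as np

def mae(actual, predicted):
    """Mean absolute error between observed and predicted counts."""
    return float(np.mean(np.abs(np.asarray(actual) - np.asarray(predicted))))

def within_band(actual, predicted, band=1.0):
    """Fraction of predictions whose absolute error is at most `band`."""
    err = np.abs(np.asarray(actual) - np.asarray(predicted))
    return float(np.mean(err <= band))

# Toy monthly adverse-event counts for a held-out test year.
actual    = np.array([3, 0, 5, 2, 1, 4, 6, 2, 0, 3, 1, 2])
predicted = np.array([2, 1, 5, 2, 2, 3, 7, 2, 0, 4, 1, 3])
print("MAE:", round(mae(actual, predicted), 2))
print("within ±1:", f"{within_band(actual, predicted):.0%}")
```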
- Title
- Modeling of Socio-Economic Factors and Adverse Events In an Active War Theater By Using a Cellular Automata Simulation Approach.
- Creator
-
Bozkurt, Halil, Karwowski, Waldemar, Lee, Gene, Thompson, William, Mikusinski, Piotr, University of Central Florida
- Abstract / Description
-
The Department of Defense (DoD) implemented the Human Social Cultural and Behavior (HSCB) program to meet the need to develop the capability to understand, predict, and shape human behavior among different cultures by developing a knowledge base, building models, and creating training capacity. This capability will allow decision makers to subordinate kinetic operations and promote non-kinetic operations to better govern economic programs, in order to initiate efforts and development that address the grievances of those aggrieved by adverse events. These non-kinetic operations include rebuilding indigenous institutions' bottom-up economic activity and constructing necessary infrastructure, since success in non-kinetic operations depends on understanding and using the social and cultural landscape. This study aims to support decision makers by building a computational model to understand economic factors and their effect on adverse events.

In this dissertation, the analysis demonstrates that the use of cellular automata makes several significant contributions to supporting decision makers in allocating development funds to stabilize regions with higher adverse event risks, and in better understanding the complex socio-economic interactions with adverse events. The analysis was performed on a set of spatial data representing social and economic factors. In studying behavior using cellular automata, cells in the same neighborhood synchronously interact with each other to determine their next states, and small changes in an iteration may yield complex formations of adverse event risk after several iterations. The cellular automata modeling methodology for social and economic analysis in this research was designed at two major implementation levels: macro and micro. At the macro level, the modeling framework integrates population, social, and economic sub-systems. The macro level allows the model to use regionalized representations, while the micro-level analyses help to explain why events have occurred. Macro-level subsystems support cellular automata rules to generate accurate predictions. The prediction capability of cellular automata is used to model the micro-level interactions between individual actors, which are represented by adverse events.

The results of this dissertation demonstrate that the cellular automata model is capable of evaluating socio-economic influences that result in changes in adverse events and of identifying the location, time, and impact of these events. Secondly, this research indicates that socio-economic influences have different levels of impact on adverse events, defined by the number of people killed, wounded, or hijacked. Thirdly, this research shows that the socio-economic influences and adverse events that occurred in a given district have impacts on adverse events that occur in neighboring districts. The cellular automata modeling approach can be used to enhance the capability to understand and use human, social, and behavioral factors by generating what-if scenarios to determine the impact of different infrastructure development projects on predicted adverse events. Lastly, adverse events that could occur in upcoming years can be predicted, allowing decision makers to deter these events or plan accordingly if they do occur.
(See the illustrative sketch after this record.)
- Date Issued
- 2013
- Identifier
- CFE0004820, ucf:49719
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004820
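The abstract above describes cellular automata in which cells in the same neighborhood synchronously interact to determine their next states. The sketch below is a toy illustration of that mechanic in Python, not the dissertation's model: a grid of districts whose adverse-event risk is updated from its Moore neighborhood and damped by a development term, with all coefficients and grid values invented for illustration.

```python
# Hypothetical sketch: synchronous cellular-automaton update of district risk.
import numpy as np

def step(risk, development, alpha=0.6, beta=0.3, gamma=0.2):
    """One synchronous CA update: neighbors raise risk, development lowers it."""
    padded = np.pad(risk, 1, mode="edge")
    # Mean of the 8 Moore neighbors for every cell (center cell excluded).
    neighbors = (
        padded[:-2, :-2] + padded[:-2, 1:-1] + padded[:-2, 2:] +
        padded[1:-1, :-2] +                    padded[1:-1, 2:] +
        padded[2:, :-2]  + padded[2:, 1:-1]  + padded[2:, 2:]
    ) / 8.0
    new_risk = alpha * risk + beta * neighbors - gamma * development
    return np.clip(new_risk, 0.0, 1.0)

rng = np.random.default_rng(1)
risk = rng.uniform(0.0, 0.5, size=(7, 7))         # toy grid of districts
development = rng.uniform(0.0, 1.0, size=(7, 7))  # relative project investment
for _ in range(12):                               # iterate twelve time steps
    risk = step(risk, development)
print("mean district risk after 12 steps:", round(float(risk.mean()), 3))
```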
- Title
- Integrating Multiobjective Optimization with the Six Sigma Methodology for Online Process Control.
- Creator
-
Abualsauod, Emad, Geiger, Christopher, Elshennawy, Ahmad, Thompson, William, Moore, Karla, University of Central Florida
- Abstract / Description
-
Over the past two decades, the Define-Measure-Analyze-Improve-Control (DMAIC) framework of the Six Sigma methodology and a host of statistical tools have been brought to bear on process improvement efforts in today's businesses. However, a major challenge of implementing the Six Sigma methodology is maintaining the process improvements and providing real-time performance feedback and control after solutions are implemented, especially in the presence of multiple process performance objectives. The consideration of a multiplicity of objectives in business and process improvement is commonplace and, quite frankly, necessary. However, balancing the collection of objectives is challenging, as the objectives are inextricably linked and oftentimes in conflict.

Previous studies have reported varied success in enhancing the Six Sigma methodology by integrating optimization methods in order to reduce variability. These studies focus their enhancements primarily within the Improve phase of the Six Sigma methodology, optimizing a single objective. The current research and practice of using the Six Sigma methodology and optimization methods do little to address real-time feedback and control for online process control in the case of multiple objectives.

This research proposes an innovative integrated Six Sigma multiobjective optimization (SSMO) approach for online process control. It integrates the Six Sigma DMAIC framework with a nature-inspired optimization procedure that iteratively perturbs a set of decision variables providing feedback to the online process, eventually converging to a set of tradeoff process configurations that improves and maintains process stability. For proof of concept, the approach is applied to a general business process model (a well-known inventory management model) that is formally defined and specifies various process costs as objective functions. The proposed SSMO approach and the business process model are programmed and incorporated into a software platform. Computational experiments are performed using both three-sigma (3σ)-based and six-sigma (6σ)-based process control, and the results reveal that the proposed SSMO approach performs far better than the traditional approaches in improving the stability of the process. This research investigation shows that the benefits of enhancing the Six Sigma method for multiobjective optimization and for online process control are immense.
(See the illustrative sketch after this record.)
- Date Issued
- 2013
- Identifier
- CFE0004968, ucf:49561
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004968
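The abstract above describes an optimization procedure that iteratively perturbs decision variables and converges to a set of tradeoff process configurations, demonstrated on an inventory management model. The sketch below is a heavily simplified, hypothetical illustration of that idea in Python; the (Q, r) cost model, the perturbation rule, and the non-dominated archive are assumptions chosen only to show "perturb decision variables, keep the tradeoffs" and do not reproduce the proposed SSMO approach.

```python
# Hypothetical sketch: random-perturbation search over a toy (Q, r) inventory
# model, keeping the non-dominated (tradeoff) configurations found so far.
import random

def objectives(q, r, demand=1000, order_cost=50, hold_cost=2, shortage_cost=8):
    ordering = order_cost * demand / q                        # annual ordering cost
    holding = hold_cost * (q / 2 + r)                         # cycle + safety stock
    shortage = shortage_cost * max(0, demand / 365 * 7 - r)   # crude shortage proxy
    return (ordering + holding, shortage)                     # two competing objectives

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and a != b

random.seed(0)
archive = []                                    # non-dominated (q, r, objectives)
q, r = 100.0, 10.0
for _ in range(2000):
    cand = (max(1.0, q + random.gauss(0, 10)), max(0.0, r + random.gauss(0, 2)))
    objs = objectives(*cand)
    if not any(dominates(o, objs) for _, _, o in archive):
        archive = [(cq, cr, o) for cq, cr, o in archive if not dominates(objs, o)]
        archive.append((*cand, objs))
        q, r = cand                             # move toward the promising region
print("tradeoff configurations found:", len(archive))
```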
- Title
- Modeling Dense Storage Systems With Location Uncertainty.
- Creator
-
Awwad, Mohamed, Pazour, Jennifer, Elshennawy, Ahmad, Thompson, William, Leon, Steven, University of Central Florida
- Abstract / Description
-
This dissertation focuses on developing models to study the problem of searching for and retrieving items in a dense storage environment. We consider a special storage configuration called an inverted T configuration, which has one horizontal and one vertical aisle. Inverted T configurations have fewer aisles than a traditional aisle-based storage environment. This increases the storage density; however, it requires that some items be moved out of the way to gain access to other, more deeply stored items. Such movement can result in item location uncertainty. When items are requested for retrieval in a dense storage environment with item location uncertainty, searching is required. Dense storage has practical importance, as it allows available space to be used efficiently, which is especially important given the scarce and expensive space onboard the US Navy ships that form a sea base. A sea base acts as a floating distribution center that provides ready-issue material to forces ashore participating in various types of missions. The sea basing concept and the importance of a sea base's responsiveness are our main motivation for conducting this research.

In chapter 2, we review three major bodies of literature: 1) sea based logistics, 2) dense storage, and 3) search theory. The sea based logistics literature mostly focuses on the concept and architecture of a sea base, with few papers developing mathematical models to solve operational problems of a sea base, including papers handling the logistical and sustainment aspects. The literature related to dense storage can be broken down into work dealing with a dense storage environment with an inverted T configuration and other papers dealing with other dense storage configurations. It was found that some of the dense storage literature was motivated by the same application, i.e., sea based logistics. Finally, we surveyed the vast search theory literature and classification of search environments. This research contributes to the intersection of these three bodies of literature. Specifically, this research, motivated by the application of sea basing, develops search heuristics for dense storage environments that require moving items out of the way during searching.

In chapter 3, we present the problem statements. We study two single-searcher search problems. The first problem is searching for a single item in an inverted T dense storage environment. The second is searching for one or more items in an inverted T storage environment with items stacked over each other in the vertical direction.

In chapter 4, we present our first contribution. In this contribution we propose a search plan heuristic to search for a single item in an inverted T, k-deep dense storage system with the objective of decreasing the expected search time in such an environment. In this contribution, we define each storage environment entirely by the accessibility constant and the storeroom length. In addition, equations are derived to calculate each component of the search time equation that we propose: travel, put-back, and repositioning. Two repositioning policies are studied. We find that a repositioning policy that uses the open aisle locations as temporary storage locations and requires put-back of these items while searching is recommended. This recommendation is because such a policy results in lower expected search time and lower variability than a policy that uses available space outside the storage area and handles put-back independently of the search process.

Statistical analysis is used to analyze the numerical results of the first contribution and to analyze the performance of both repositioning policies. We derive the probability distribution of search times in a storeroom with small configurations in terms of the accessibility constant and length. It was found that this distribution can be approximated by a lognormal probability distribution with a certain mean and standard deviation. Knowing the probability distribution provides decision makers with the full range of possible probabilities of search times, which is useful for downstream planning operations.

In chapter 5, we present the second contribution, in which we propose a search plan heuristic for multiple items in an inverted T, k-deep storage system. Additionally, we consider stacking multiple items over each other. Stacking items over each other increases the number of stored items and allows for the utilization of vertical space. In this second contribution, we use the repositioning policy that proved superior in the first contribution. This contribution investigates a more general and much more challenging environment than the one studied in the first contribution. In the second environment, to gain access to some items, not only may other items need to be moved out of the way, but the overall number of item movements within the system is also highly affected by the number of items stacked over each other. In addition, the searcher is given a task that includes searching for and retrieving a set of items, rather than just one item.

For the second contribution, the performance of the search heuristic is analyzed through a statistical design of experiments, and it was found that searching for and retrieving multiple items instead of just a single item decreases the variability in search times for each storeroom configuration. Finally, in chapter 6, conclusions of this research and suggestions for future research directions are presented.
(See the illustrative sketch after this record.)
- Date Issued
- 2015
- Identifier
- CFE0006256, ucf:51045
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006256
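The abstract above reports that search times for small storeroom configurations can be approximated by a lognormal distribution. The sketch below shows one way such a fit might be checked in Python with SciPy; the search-time samples are synthetic and the reporting thresholds are illustrative assumptions, not the dissertation's data or code.

```python
# Hypothetical sketch: fitting a lognormal distribution to search-time samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Stand-in for observed search times (minutes) from repeated simulation runs.
search_times = rng.lognormal(mean=2.0, sigma=0.4, size=500)

# Fit a lognormal with location fixed at zero (search times are positive).
shape, loc, scale = stats.lognorm.fit(search_times, floc=0)
fitted = stats.lognorm(shape, loc=loc, scale=scale)

print("fitted log-mean:", round(np.log(scale), 3), "log-sd:", round(shape, 3))
print("P(search time > 15 min):", round(1 - fitted.cdf(15), 3))
# A simple goodness-of-fit check via the Kolmogorov-Smirnov test.
ks_stat, p_value = stats.kstest(search_times, fitted.cdf)
print("KS p-value:", round(p_value, 3))
```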
- Title
- A New Six Sigma Implementation Approach For Power Generation Gas Turbines Repair Process Development.
- Creator
-
Ghunakikar, Somesh, Elshennawy, Ahmad, Rabelo, Luis, Thompson, William, Furterer, Sandra, University of Central Florida
- Abstract / Description
-
Power generation gas turbines used for heavy-duty applications mainly comprise three modules: compressor, combustion, and turbine. Typically, all of these parts are designed by OEM companies for a specific number of hours and cycles (also known as starts) before they become dysfunctional. In addition, gas turbine (GT) parts also have an intended repair interval, depending on the type of part application and the damage anticipated during service operation. Thus, GT parts need inspection and repair (overhaul) after a certain number of operating hours in order to recondition them so that they are fit to return to operation and produce power. In this dissertation, a unique Six Sigma DFSS approach is presented for the development of repair processes that can be used during overhaul of GT parts, with the aim of total quality improvement. All Six Sigma phases of the proposed DFSS approach, along with the repair product development cycle, are discussed. Various Six Sigma tools that yield significant benefits for process users are also discussed. Importantly, a statistical probabilistic life analysis approach is proposed in order to verify the structural integrity of a repaired GT part. Finally, a case study of GT axial compressor diaphragms (stators) is discussed to illustrate the various phases and the Six Sigma tools used during each phase of the DFSS approach. The overall benefit of the proposed DFSS approach is total quality improvement that delivers the final GT repair process, a faster repair development cycle, and end-customer satisfaction.
(See the illustrative sketch after this record.)
- Date Issued
- 2016
- Identifier
- CFE0006105, ucf:51199
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006105
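The abstract above proposes a statistical probabilistic life analysis to verify the structural integrity of a repaired GT part. The sketch below is a generic, hypothetical example of that kind of analysis, a Weibull fit to failure-hour data with a survival probability over a repair interval; the data, the choice of a Weibull model, and the interval are assumptions and are not the dissertation's method.

```python
# Hypothetical sketch: Weibull life analysis for repaired parts.
import numpy as np
from scipy import stats

# Stand-in operating hours to failure for a population of repaired parts.
hours_to_failure = np.array([21000, 26500, 30200, 24800, 28900,
                             33100, 27400, 25600, 31800, 29700])

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(hours_to_failure, floc=0)
life = stats.weibull_min(shape, loc=loc, scale=scale)

repair_interval = 24000  # hypothetical hours until the next scheduled overhaul
print(f"Weibull shape (beta): {shape:.2f}, scale (eta): {scale:.0f} h")
print(f"P(survive {repair_interval} h): {life.sf(repair_interval):.2%}")
print(f"B10 life (10% failure): {life.ppf(0.10):.0f} h")
```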
- Title
- A New Paradigm Integrating Business Process Modeling and Use Case Modeling.
- Creator
-
Brown, Barclay, Karwowski, Waldemar, Thompson, William, Lee, Gene, O'Neal, Thomas, University of Central Florida
- Abstract / Description
-
The goal of this research is to develop a new paradigm integrating the practices of business process modeling and use case modeling. These two modeling approaches describe the behavior of organizations and systems, and their interactions, but they rest on different paradigms and serve different needs. The base of knowledge and information required for each approach is largely common, however, so an integrated approach has advantages in the efficiency, consistency, and completeness of the overall behavioral model. Both modeling methods are familiar and widely used. Business process modeling is often employed as a precursor to the development of a system to be used in a business organization. Business process modeling teams and stakeholders may spend months or years developing detailed business process models, expecting that these models will provide a useful base of information for system designers. Unfortunately, as the business process model is analyzed by the system designers, it is found that information needed to specify the functionality of the system does not exist in the business process model. System designers may then employ use case modeling to specify the needed system functionality, again spending significant time with stakeholders to gather the needed input. Stakeholders find this two-pass process redundant and wasteful of time and money, since the input they provide to both modeling teams is largely identical, with each team capturing only the aspects relevant to its form of modeling. Developing a new paradigm and modeling approach that achieves the objectives of both business process modeling and use case modeling in an integrated form, in one analysis pass, results in time savings, increased accuracy, and improved communication among all participants in the systems development process.

Analysis of several case studies will show that inefficiency, wasted time, and overuse of stakeholder resource time result from the separate application of business process modeling and use case modeling. A review of the existing literature on the subject shows that while the problem of modeling both business process and use case information in a coordinated fashion has been recognized before, few, if any, approaches have been proposed to reconcile and integrate the two methods. Based on both the literature review and good modeling practices, a list of goals for the new paradigm and modeling approach forms the basis for the paradigm to be created.

A grounded theory study is then conducted to analyze existing modeling approaches for both business processes and use cases and to provide an underlying theory on which to base the new paradigm. The two main innovations developed for the new paradigm are the usage process and the timebox. Usage processes allow system usages (use cases) to be identified as the business process model is developed, and the two to be shown in a combined process flow. Timeboxes allow processes to be positioned in time-relation to each other without the need to combine processes into higher-level processes using causal relations that may not exist. The combination of usage processes and timeboxes allows any level of complex behavior to be modeled in one pass, without the redundancy and waste of separate business process and use case modeling work.

Several pilot projects are conducted to test the new modeling paradigm in differing modeling situations, with participants and subject matter experts asked to compare the traditional models with the new paradigm formulations.
(See the illustrative sketch after this record.)
- Date Issued
- 2015
- Identifier
- CFE0005583, ucf:50270
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005583
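The abstract above introduces two modeling constructs, usage processes and timeboxes. The sketch below is one hypothetical way such constructs could be represented as data structures in Python; the field names, the grouping rule, and the example process are invented for illustration and are not the dissertation's notation.

```python
# Hypothetical sketch: a possible data model where business-process steps and
# use-case-like "system usages" live in one structure, grouped by timeboxes.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Step:
    name: str
    is_system_usage: bool = False   # True marks a use-case-like system usage

@dataclass
class UsageProcess:
    name: str
    steps: List[Step] = field(default_factory=list)

@dataclass
class Timebox:
    """Groups processes that occur in the same time window, without forcing an
    artificial causal parent process."""
    label: str
    processes: List[UsageProcess] = field(default_factory=list)

fulfillment = UsageProcess("Order fulfillment", [
    Step("Receive customer order"),
    Step("Enter order into system", is_system_usage=True),
    Step("Pick and pack items"),
    Step("Record shipment in system", is_system_usage=True),
])
same_day = Timebox("Same business day", [fulfillment])

usages = [s.name for p in same_day.processes for s in p.steps if s.is_system_usage]
print("system usages identified during process modeling:", usages)
```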