Current Search: Xanthopoulos, Petros
- Title
- Data Mining Models for Tackling High Dimensional Datasets and Outliers.
- Creator
-
Panagopoulos, Orestis, Xanthopoulos, Petros, Rabelo, Luis, Zheng, Qipeng, Dechev, Damian, University of Central Florida
- Abstract / Description
-
High dimensional data and the presence of outliers in data each pose a serious challenge in supervised learning. Datasets with a significantly larger number of features compared to samples arise in various areas, including business analytics and biomedical applications. Such datasets pose a serious challenge to standard statistical methods and render many existing classification techniques impractical. The generalization ability of many classification algorithms is compromised due to the so-called curse of dimensionality. A new binary classification method called constrained subspace classifier (CSC) is proposed for such high dimensional datasets. CSC improves on an earlier proposed classification method called local subspace classifier (LSC) by accounting for the relative angle between subspaces while approximating the classes with individual subspaces. CSC is formulated as an optimization problem and can be solved by an efficient alternating optimization technique. Classification performance is tested in publicly available datasets. The improvement in classification accuracy over LSC shows the importance of considering the relative angle between the subspaces while approximating the classes. Additionally, CSC appears to be a robust classifier, compared to traditional two-step methods that perform feature selection and classification in two distinct steps. Outliers can be present in real world datasets due to noise or measurement errors. The presence of outliers can affect the training phase of machine learning algorithms, leading to over-fitting which results in poor generalization ability. A new regression method called relaxed support vector regression (RSVR) is proposed for such datasets. RSVR is based on the concept of constraint relaxation, which leads to increased robustness in datasets with outliers. RSVR is formulated using both linear and quadratic loss functions. Numerical experiments on benchmark datasets and computational comparisons with other popular regression methods depict the behavior of our proposed method. RSVR achieves better overall performance than support vector regression (SVR) in measures such as RMSE and $R^2_{adj}$ while being on par with other state-of-the-art regression methods such as robust regression (RR). Additionally, RSVR provides robustness for higher dimensional datasets, which is a limitation of RR, the robust equivalent of ordinary least squares regression. Moreover, RSVR can be used on datasets that contain varying levels of noise. Lastly, we present a new novelty detection model called relaxed one-class support vector machines (ROSVMs) that deals with the problem of one-class classification in the presence of outliers.
- Date Issued
- 2016
- Identifier
- CFE0006698, ucf:51920
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006698
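For context on the regression method mentioned in the abstract above, the standard $\varepsilon$-insensitive support vector regression primal (a known baseline, not the dissertation's RSVR) is

$$\min_{w,b,\xi,\xi^*}\ \tfrac{1}{2}\lVert w\rVert^2 + C\sum_{i=1}^{n}(\xi_i+\xi_i^*) \quad \text{s.t.}\quad y_i-\langle w,x_i\rangle-b \le \varepsilon+\xi_i,\;\; \langle w,x_i\rangle+b-y_i \le \varepsilon+\xi_i^*,\;\; \xi_i,\xi_i^*\ge 0.$$

One plausible reading of "constraint relaxation" (an assumption here, not necessarily the exact RSVR formulation) is to add budgeted relaxation variables $\theta_i \ge 0$ with $\sum_i \theta_i \le K$ to the right-hand sides of the constraints, so that a limited amount of additional violation, e.g. from outliers, is tolerated without penalty; a quadratic-loss variant would replace $\sum_i(\xi_i+\xi_i^*)$ with $\sum_i(\xi_i^2+\xi_i^{*2})$.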
- Title
- Methods for online feature selection for classification problems.
- Creator
-
Razmjoo, Alaleh, Zheng, Qipeng, Rabelo, Luis, Boginski, Vladimir, Xanthopoulos, Petros, University of Central Florida
- Abstract / Description
-
Online learning is a growing branch of machine learning which allows all traditional data mining techniques to be applied to an online stream of data in real-time. In this dissertation, we present three efficient algorithms for feature ranking in online classification problems. Each of the methods is tailored to work well with different types of classification tasks and has different advantages. The reason for this variety of algorithms is that, like other machine learning solutions, there is usually no algorithm which works well for all types of tasks. The first method is an online sensitivity-based feature ranking (SFR) which is updated incrementally and is designed for classification tasks with continuous features. We take advantage of the concept of global sensitivity and rank features based on their impact on the outcome of the classification model. In the feature selection part, we use a two-stage filtering method in order to first eliminate highly correlated and redundant features and then eliminate irrelevant features in the second stage. One important advantage of our algorithm is its generality, which means the method works for correlated feature spaces without preprocessing. It can be implemented along with any single-pass online classification method with a separating hyperplane, such as SVMs. In the second method, with the help of probability theory we propose an algorithm which measures the importance of the features by observing the changes in label prediction in case of feature substitution. A non-parametric version of the proposed method is presented to eliminate the distribution type assumptions. These methods are applicable to all data types including mixed feature spaces. Lastly, we present a class-based feature importance ranking method which evaluates the importance of each feature for each class; these sub-rankings are further exploited to train an ensemble of classifiers. The proposed methods will be thoroughly tested using benchmark datasets and the results will be discussed in the last chapter.
- Date Issued
- 2018
- Identifier
- CFE0007584, ucf:52567
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007584
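As a rough illustration of the general idea behind ranking features by their influence on an online linear classifier's separating hyperplane, the sketch below trains scikit-learn's SGDClassifier incrementally and ranks features by coefficient magnitude. It is a generic stand-in, not the dissertation's SFR update rules.

```python
# Sketch: incrementally train a linear classifier on a data stream and rank
# features by the magnitude of the hyperplane coefficients after each batch.
# This is a generic illustration, not the dissertation's SFR algorithm.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=2000, n_features=10, n_informative=4,
                           random_state=0)
clf = SGDClassifier(loss="hinge", random_state=0)   # linear SVM trained online

batch = 100
for start in range(0, len(X), batch):
    xb, yb = X[start:start + batch], y[start:start + batch]
    clf.partial_fit(xb, yb, classes=np.unique(y))    # single pass over the stream
    ranking = np.argsort(-np.abs(clf.coef_[0]))      # features by |w_j|, descending
    print(f"after {start + batch:4d} samples, top features: {ranking[:3]}")
```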
- Title
- A System Dynamics Approach on Sustainability Assessment of the United States Urban Commuter Transportation.
- Creator
-
Ercan, Tolga, Tatari, Omer, Oloufa, Amr, Eluru, Naveen, Xanthopoulos, Petros, University of Central Florida
- Abstract / Description
-
The transportation sector is one of the largest emission sources and a cause for human health concern due to the high dependency on personal vehicles in the U.S. Transportation mode choice studies are currently limited to micro- and regional-level boundaries and fail to present a complete picture of the issues and root causes associated with urban passenger transportation choices in the U.S. Hence, a system dynamics modeling approach is utilized to capture complex causal relationships among the critical system parameters affecting alternative transportation mode choices in the U.S., as well as to identify possible policy areas to improve alternative transportation mode choice rates for future years up to 2050. Considering the high degree of uncertainty inherent to the problem, multivariate sensitivity analysis is utilized to explore the effectiveness of existing and possible policy implications in the dynamic model in terms of their potential to increase transit ridership, and to locate the critical parameters that most influence mode choice and emission rates. Finally, the dissertation advances the current body of knowledge by integrating discrete event simulation (a multinomial fractional split model) and system dynamics into a hybrid urban commuter transportation simulation to test new scenarios, such as autonomous vehicle (AV) adoption, along with traditional policy scenarios such as limiting lane-mile increases on roadways and introducing a carbon tax policy on vehicle owners. Overall, the developed simulation models clearly indicate the importance of urban structures in securing the future of alternative transportation modes in the U.S., as the prevailing policy practices fail to change system behavior. Thus, the transportation system needs a paradigm shift to radically change its current impacts, and the market penetration of AVs can be one of the reforms that provokes this transition, since it is expected to revolutionize mode choice, emission trends, and the built environment.
- Date Issued
- 2019
- Identifier
- CFE0007626, ucf:52554
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007626
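System dynamics models like the one described above are built from stocks, flows, and feedback loops integrated over time. The following is a minimal, made-up stock-and-flow sketch of transit mode share responding to a congestion feedback loop; all parameters are hypothetical placeholders, not the dissertation's calibrated model.

```python
# Minimal system-dynamics-style simulation: one stock (transit mode share)
# driven by a congestion feedback loop, integrated with Euler steps.
# All parameters are illustrative placeholders, not calibrated values.
transit_share = 0.05      # stock: fraction of commuters using transit
car_share = 1.0 - transit_share
dt, years = 0.25, 35      # quarter-year steps out to roughly 2050

for step in range(int(years / dt)):
    congestion = car_share ** 2             # more car use -> more congestion
    switch_to_transit = 0.02 * congestion   # congestion pushes riders to transit
    switch_to_car = 0.01 * transit_share    # some riders drift back to cars
    net_flow = switch_to_transit - switch_to_car
    transit_share = min(1.0, max(0.0, transit_share + net_flow * dt))
    car_share = 1.0 - transit_share

print(f"simulated transit share after {years} years: {transit_share:.3f}")
```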
- Title
- Simulation-Based Cognitive Workload Modeling and Evaluation of Adaptive Automation Invoking and Revoking Strategies.
- Creator
-
Rusnock, Christina, Geiger, Christopher, Karwowski, Waldemar, Xanthopoulos, Petros, Reinerman, Lauren, University of Central Florida
- Abstract / Description
-
In human-computer systems, such as supervisory control systems, large volumes of incoming and complex information can degrade overall system performance. Strategically integrating automation to offload tasks from the operator has been shown to increase not only human performance but also operator efficiency and safety. However, increased automation allows for increased task complexity, which can lead to high cognitive workload and degradation of situational awareness. Adaptive automation is one potential solution to resolve these issues, while maintaining the benefits of traditional automation. Adaptive automation occurs dynamically, with the quantity of automated tasks changing in real-time to meet performance or workload goals. While numerous studies evaluate the relative performance of manual and adaptive systems, little attention has focused on the implications of selecting particular invoking or revoking strategies for adaptive automation. Thus, evaluations of adaptive systems tend to focus on the relative performance among multiple systems rather than the relative performance within a system. This study takes an intra-system approach, specifically evaluating the relationship between cognitive workload and situational awareness that occurs when selecting a particular invoking-revoking strategy for an adaptive system. The case scenario is a human supervisory control situation that involves a system operator who receives and interprets intelligence outputs from multiple unmanned assets, and then identifies and reports potential threats and changes in the environment. In order to investigate this relationship between workload and situational awareness, discrete event simulation (DES) is used. DES is a standard technique in the analysis of systems, and the advantage of using DES to explore this relationship is that it can represent a human-computer system as the state of the system evolves over time. Furthermore, and most importantly, a well-designed DES model can represent the human operators, the tasks to be performed, and the cognitive demands placed on the operators. In addition to evaluating the cognitive workload to situational awareness tradeoff, this research demonstrates that DES can quite effectively model and predict human cognitive workload, specifically for system evaluation. This research finds that the predicted workload of the DES models highly correlates with well-established subjective measures and is more predictive of cognitive workload than numerous physiological measures. This research then uses the validated DES models to explore and predict the cognitive workload impacts of adaptive automation through various invoking and revoking strategies. The study provides insights into the workload-situational awareness tradeoffs that occur when selecting particular invoking and revoking strategies. First, in order to establish an appropriate target workload range, it is necessary to account for both performance goals and the portion of the workload-performance curve for the task in question. Second, establishing an invoking threshold may require a tradeoff between workload and situational awareness, which is influenced by the task's location on the workload-situational awareness continuum. Finally, this study finds that revoking strategies differ in their ability to achieve workload and situational awareness goals. For the case scenario examined, revoking strategies based on duration are best suited to improve workload, while revoking strategies based on revoking thresholds are better for maintaining situational awareness.
- Date Issued
- 2013
- Identifier
- CFE0004927, ucf:49607
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004927
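A discrete event simulation of the kind described above advances a clock from event to event and can track how many tasks compete for the operator at once, a simple proxy for instantaneous workload. The sketch below is a minimal hand-rolled illustration with made-up arrival and service times, not the dissertation's validated model.

```python
# Minimal discrete event simulation: tasks arrive at random, each occupies the
# operator for a random duration, and we sample concurrent-task load over time.
# Arrival and service rates are arbitrary illustrative values.
import heapq
import random

random.seed(1)
events = []                      # (time, +1 for task start, -1 for task end)
t = 0.0
for _ in range(200):             # generate 200 task arrivals
    t += random.expovariate(1.0)                               # mean inter-arrival 1.0
    heapq.heappush(events, (t, +1))
    heapq.heappush(events, (t + random.expovariate(0.8), -1))  # mean service 1.25

load, peak, area, last_t = 0, 0, 0.0, 0.0
while events:
    now, delta = heapq.heappop(events)
    area += load * (now - last_t)            # time-weighted concurrent-task count
    load += delta
    peak = max(peak, load)
    last_t = now

print(f"peak concurrent tasks: {peak}, mean load: {area / last_t:.2f}")
```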
- Title
- A Posteriori and Interactive Approaches for Decision-Making with Multiple Stochastic Objectives.
- Creator
-
Bakhsh, Ahmed, Geiger, Christopher, Mollaghasemi, Mansooreh, Xanthopoulos, Petros, Wiegand, Rudolf, University of Central Florida
- Abstract / Description
-
Computer simulation is a popular method that is often used as a decision support tool in industry to estimate the performance of systems too complex for analytical solutions. It is a tool that assists decision-makers in improving organizational performance and achieving performance objectives, in which simulated conditions can be randomly varied so that critical situations can be investigated without real-world risk. Due to the stochastic nature of many of the input process variables in simulation models, the outputs from simulation model experiments are random. Thus, experimental runs of computer simulations yield only estimates of the values of performance objectives, where these estimates are themselves random variables. Most real-world decisions involve the simultaneous optimization of multiple, and often conflicting, objectives. Researchers and practitioners use various approaches to solve these multiobjective problems. Many approaches that integrate simulation models with stochastic multiple objective optimization algorithms have been proposed, many of which use Pareto-based approaches that generate a finite set of compromise, or tradeoff, solutions. Nevertheless, identification of the most preferred solution can be a daunting task for the decision-maker and is an order of magnitude harder in the presence of stochastic objectives. However, to the best of this researcher's knowledge, there have been no focused efforts or existing work that attempt to reduce the number of tradeoff solutions while considering the stochastic nature of a set of objective functions. In this research, two approaches that consider multiple stochastic objectives when reducing the set of tradeoff solutions are designed and proposed. The first proposed approach is an a posteriori approach, which uses a given set of Pareto optima as input. The second approach is an interactive-based approach that articulates decision-maker preferences during the optimization process. A detailed description of both approaches is given, and computational studies are conducted to evaluate the efficacy of the two approaches. The computational results show the promise of the proposed approaches, in that each approach effectively reduces the set of compromise solutions to a reasonably manageable size for the decision-maker. This is a significant step beyond current applications of the decision-making process in the presence of multiple stochastic objectives and should serve as an effective approach to support decision-making under uncertainty.
- Date Issued
- 2013
- Identifier
- CFE0004973, ucf:49574
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004973
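The a posteriori approach above starts from a set of Pareto-optimal points whose objective values are only estimated from replicated simulation runs. The sketch below shows one generic way to screen such a set, keeping only points that are not dominated by more than a crude statistical margin; it is an illustration of the setting, not the dissertation's procedure.

```python
# Sketch: filter a set of tradeoff solutions whose two objectives (both to be
# minimized) are estimated by noisy replications. Solution a "dominates" b here
# only if its mean is better on both objectives by more than one standard error.
import numpy as np

rng = np.random.default_rng(0)
n_solutions, n_reps = 20, 30
true_obj = rng.uniform(0, 1, size=(n_solutions, 2))
reps = true_obj[:, None, :] + rng.normal(0, 0.05, size=(n_solutions, n_reps, 2))

means = reps.mean(axis=1)
stderr = reps.std(axis=1, ddof=1) / np.sqrt(n_reps)

def dominates(a, b):
    margin = stderr[a] + stderr[b]                 # crude significance margin
    return np.all(means[a] + margin < means[b])

kept = [i for i in range(n_solutions)
        if not any(dominates(j, i) for j in range(n_solutions) if j != i)]
print(f"reduced {n_solutions} tradeoff solutions to {len(kept)}: {kept}")
```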
- Title
- Applied Error Related Negativity: Single Electrode Electroencephalography in Complex Visual Stimuli.
- Creator
-
Sawyer, Benjamin, Karwowski, Waldemar, Hancock, Peter, Xanthopoulos, Petros, University of Central Florida
- Abstract / Description
-
Error related negativity (ERN) is a pronounced negative evoked response potential (ERP) that follows a known error. This neural pattern has the potential to communicate user awareness of incorrect actions within milliseconds. While the implications for human-machine interface and augmented cognition are exciting, the ERN has historically been evoked only in the laboratory using complex equipment while presenting simple visual stimuli such as letters and symbols. To effectively harness the applied potential of the ERN, detection must be accomplished in complex environments using simple, preferably single-electrode, EEG systems feasible for integration into field and workplace-ready equipment. The present project attempted to use static photographs to evoke and successfully detect the ERN in a complex visual search task: motorcycle conspicuity. Drivers regularly fail to see motorcycles, with tragic results. To reproduce the issue in the lab, static pictures of traffic were presented, either including or not including motorcycles. A standard flanker letter task replicated from a classic ERN study (Gehring et al., 1993) was run alongside, with both studies requiring a binary response. Results showed that the ERN could be clearly detected in both tasks, even when limiting data to a single electrode in the absence of artifact correction. These results support the feasibility of applied ERN detection in complex visual search in static images. Implications and opportunities will be discussed, limitations of the study explained, and future directions explored.
- Date Issued
- 2014
- Identifier
- CFE0005885, ucf:50886
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005885
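Detecting an ERP such as the ERN typically means cutting the continuous EEG into short epochs time-locked to each error response and averaging them so the response-locked component emerges from the noise. The toy single-channel sketch below uses synthetic data only; it is illustrative and omits any real EEG pipeline or artifact handling.

```python
# Toy ERP averaging: synthesize a single-channel "EEG" with a small negative
# deflection shortly after each error event, then recover it by epoch averaging.
import numpy as np

fs = 250                                   # sampling rate (Hz), illustrative
rng = np.random.default_rng(0)
eeg = rng.normal(0, 10.0, size=fs * 120)   # 2 minutes of noise (microvolts)
error_samples = rng.choice(np.arange(fs, len(eeg) - fs), size=40, replace=False)

deflection = -8.0 * np.hanning(int(0.1 * fs))      # ~100 ms negative component
for s in error_samples:
    start = s + int(0.03 * fs)                     # onset ~30 ms post-response
    eeg[start:start + len(deflection)] += deflection

pre, post = int(0.1 * fs), int(0.4 * fs)           # epoch window: -100 ms .. +400 ms
epochs = np.stack([eeg[s - pre:s + post] for s in error_samples])
erp = epochs.mean(axis=0)
print(f"peak negativity: {erp.min():.1f} uV at "
      f"{(erp.argmin() - pre) / fs * 1000:.0f} ms post-response")
```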
- Title
- Systems Geometry: A Methodology for Analyzing Emergent System of Systems Behaviors.
- Creator
-
Bouwens, Christina, Sepulveda, Jose, Karwowski, Waldemar, Xanthopoulos, Petros, Kapucu, Naim, University of Central Florida
- Abstract / Description
-
Recent advancements in technology have led to the increased use of integrated 'systems of systems' (SoS) which link together independently developed and usable capabilities into an integrated system that exhibits new, emergent capabilities. However, the resulting SoS is often not well understood, where secondary and tertiary effects of tying systems together are often unpredictable and present severe consequences. The complexities of the composed system stem not only from system integration, but from a broad range of areas such as the competing objectives of different constituent system stakeholders, mismatched requirements from multiple process models, and architectures and interface approaches that are incompatible on multiple levels. While successful SoS development has proven to be a valuable tool for a wide range of applications, there are significant problems that remain with the development of such systems that need to be addressed during the early stages of engineering development within such environments. The purpose of this research is to define and demonstrate a methodology called Systems Geometry (SG) for analyzing SoS in the early stages of development to identify areas of potential unintended emergent behaviors as candidates for the employment of risk management strategies. SG focuses on three dimensions of interest when planning the development of a SoS: operational, functional, and technical. For Department of Defense (DoD) SoS, the operational dimension addresses the warfighter environment and includes characteristics such as mission threads and related command and control or simulation activities required to support the mission. The functional dimension highlights different roles associated with the development and use of the SoS, which could include a participant warfighter using the system, an analyst collecting data for system evaluation, or an infrastructure engineer working to keep the SoS infrastructure operational to support the users. Each dimension can be analyzed to understand roles, interfaces and activities. Cross-dimensional effects are of particular interest since such effects are less detectable and generally not addressed with conventional systems engineering (SE) methods. The literature review and the results of this study have identified key characteristics or dimensions that should be examined during SoS analysis and design. Although many methods exist for exploring system dimensions, there is a gap in techniques to explore cross-dimensional interactions and their effect on emergent SoS behaviors. The study has resulted in a methodology for capturing dimensional information and recommended analytical methods for intra-dimensional as well as cross-dimensional analysis. A problem-based approach to the system analysis is recommended, combined with the application of matrix methods, network analysis and modeling techniques to provide intra- and cross-dimensional insight. The results of this research are applicable to a variety of socio-technical SoS analyses with applications in analysis, experimentation, test and evaluation, and training.
- Date Issued
- 2013
- Identifier
- CFE0005135, ucf:50696
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005135
- Title
- A Framework for Assessing the Quality and Effectiveness of A National Employment System: A Case Study of Saudi Arabia.
- Creator
-
Alsulami, Hemaid, Elshennawy, Ahmad, Lee, Gene, Xanthopoulos, Petros, Rahal, Ahmad, University of Central Florida
- Abstract / Description
-
National employment systems have been established in several countries to tackle the unemployment dilemma among citizens while the labor market is flooded by expatriates. A lack of performance measurement indices among these systems has caused failure to provide jobs to citizens and has caused a state of confusion and dissatisfaction among employing entities. In Saudi Arabia, the unemployment rate has increased in the last few decades and has since become a very political issue for the Saudi government. Compared to other countries, the problem is different since many expatriates in Saudi Arabia are already employed in the labor market while citizens are seeking jobs. In Saudi Arabia, there are 1.4 million unemployed citizens and 8 million expatriates working in the Saudi labor market. In 2011, the Saudi government established a new project for boosting citizens' employment in the private sector. This project has initiated an employment system that divides organizations into four categories (or rankings) based on their performance in employing Saudi citizen job seekers. Organizations in the Saudi private sector are allocated services from the Ministry of Labor depending on their ranking in the system. Consequently, there are mixed reactions from social and economic groups toward the system's significant impact on increasing the number of national (citizen) workers in the labor market. This study develops a framework to assess the quality and effectiveness of this government employment system and how the private sector has been affected after its implementation. The framework proposes a national employment index to help government leaders manage the labor market and reduce the unemployment rate. In addition, the framework proposes an employer satisfaction index to assist in improving the cooperation between the government and the private sector. Finally, the study demonstrates the various advantages and disadvantages of this concept and proposes solutions to improve the national employment system's quality and effectiveness.
- Date Issued
- 2014
- Identifier
- CFE0005124, ucf:50674
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005124
- Title
- A Generic Framework For Multi-Method Modeling and Simulation of Complex Systems Using Discrete Event, System Dynamics and Agent Based Approaches.
- Creator
-
Mykoniatis, Konstantinos, Karwowski, Waldemar, Kincaid, John, Xanthopoulos, Petros, Akbas, Ilhan, University of Central Florida
- Abstract / Description
-
Decisions about Modeling and Simulation (M&S) of Complex Systems (CS) need to be evaluated prior to implementation. Discrete Event (DE), System Dynamics (SD), and Agent Based (AB) are three different M&S approaches widely applied to enhance decision-making for complex systems. However, single-type M&S approaches can face serious challenges in representing the overall multidimensional nature of CS and may result in the design of oversimplified models excluding important factors. Conceptual frameworks are necessary to offer useful guidance for combining and/or integrating different M&S approaches. Although several hybrid M&S frameworks have been described and are currently deployed, there is limited guidance on when, why and how to combine and/or integrate DE, SD, and AB approaches. The existing hybrid frameworks focus more on how to deal with specific problems rather than providing a generic way of applicability to various problem situations. The main aim of this research is to develop a generic framework for Multi-Method Modeling and Simulation of CS, which provides a practical guideline for the integrated deployment or combination of DE, SD, and AB M&S methods. The key contributions of this dissertation include: (1) a meta-analysis literature review that identifies criteria and generic types of interaction relationships that serve as a basis for the development of a multi-method modeling and simulation framework; (2) a methodology and a framework that guide the user through the development of multi-method simulation models to solve CS problems; (3) an algorithm that recommends appropriate M&S method(s) based on the user-selected criteria for user-defined objective(s); (4) the implementation and evaluation of multi-method simulation models based on the framework's recommendation in diverse domains; and (5) the comparison of multi-method simulation models created by following the multi-method modeling and simulation framework. It is anticipated that this research will inspire and motivate students, researchers, practitioners and decision makers engaged in M&S to become aware of the benefits of the cross-fertilization of the three key M&S methods.
- Date Issued
- 2015
- Identifier
- CFE0005980, ucf:50762
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005980
- Title
- An Unsupervised Consensus Control Chart Pattern Recognition Framework.
- Creator
-
Haghtalab, Siavash, Xanthopoulos, Petros, Pazour, Jennifer, Rabelo, Luis, University of Central Florida
- Abstract / Description
-
Early identification and detection of abnormal time series patterns is vital for a number of manufacturing applications. Slight shifts and alterations of time series patterns might be indicative of some anomaly in the production process, such as machinery malfunction. Due to the continuous flow of data, monitoring of manufacturing processes usually requires automated Control Chart Pattern Recognition (CCPR) algorithms. The majority of the CCPR literature consists of supervised classification algorithms. Fewer studies consider unsupervised versions of the problem. Despite the profound advantage of unsupervised methods, which require less manual data labeling, their use is limited due to the fact that their performance is not robust enough for practical purposes. In this study we propose the use of a consensus clustering framework. Computational results show robust behavior compared to individual clustering algorithms.
- Date Issued
- 2014
- Identifier
- CFE0005178, ucf:50670
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005178
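Consensus (ensemble) clustering of the kind referred to above typically runs many base clusterings, records how often each pair of observations lands in the same cluster, and then clusters that co-association matrix. The sketch below is a small generic illustration of that pattern, not the specific framework in the thesis.

```python
# Sketch of consensus clustering: build a co-association matrix from many
# k-means runs, then cut a hierarchical clustering of (1 - co-association).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

X, _ = make_blobs(n_samples=60, centers=3, random_state=0)
n = len(X)
coassoc = np.zeros((n, n))

for seed in range(25):                       # ensemble of base clusterings
    k = np.random.RandomState(seed).choice([2, 3, 4])
    labels = KMeans(n_clusters=k, n_init=5, random_state=seed).fit_predict(X)
    coassoc += (labels[:, None] == labels[None, :])

coassoc /= 25.0
dist = 1.0 - coassoc                         # pairs that co-cluster often are "close"
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
consensus_labels = fcluster(Z, t=3, criterion="maxclust")
print(np.bincount(consensus_labels))         # sizes of the consensus clusters
```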
- Title
- Effects of Signal Probability on Multitasking-Based Distraction in Driving, Cyberattack (&) Battlefield Simulation.
- Creator
-
Sawyer, Benjamin, Karwowski, Waldemar, Hancock, Peter, Xanthopoulos, Petros, University of Central Florida
- Abstract / Description
-
Multitasking-based failures of perception and action are the focus of much research in driving, where they are attributed to distraction. Similar failures occur in contexts where the construct of distraction is little used. Such narrow application was attributed to methodology which cannot precisely account for experimental variables in time and space, limiting distraction's conceptual portability to other contexts. An approach based upon vigilance methodology was forwarded as a solution, and highlighted a fundamental human performance question: Would increasing the signal probability (SP) of a secondary task increase associated performance, as is seen in the prevalence effect associated with vigilance tasks? Would it reduce associated performance, as is seen in driving distraction tasks? A series of experiments weighed these competing assumptions. In the first, a psychophysical task, analysis of accuracy and response data revealed an interaction between the number of concurrent tasks and the SP of presented targets. The question was further tested in the applied contexts of driving, cyberattack and battlefield target decision-making. In line with previous prevalence effect inquiry, presentation of stimuli at higher SP led to higher accuracy. In line with existing distraction work, performance of higher numbers of concurrent tasks tended to elicit slower response times. In all experiments, raising either the number of concurrent tasks or the SP of targets resulted in greater subjective workload, as measured by the NASA TLX, even when accompanied by improved accuracy. It would seem that "distraction" in previous experiments has been an aggregate effect including both delayed response time and prevalence-based accuracy effects. These findings support the view that superior experimental control of SP reveals nomothetic patterns of performance that allow better understanding and wider application of the distraction construct both within and in diverse contexts beyond driving.
- Date Issued
- 2015
- Identifier
- CFE0006388, ucf:51522
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006388
- Title
- An investigation of physiological measures in a marketing decision task.
- Creator
-
Lerma, Nelson, Karwowski, Waldemar, Elshennawy, Ahmad, Xanthopoulos, Petros, Reinerman, Lauren, University of Central Florida
- Abstract / Description
-
The objective of the present study was to understand the use of physiological measures as an alternative to traditional market research tools, such as self-reporting measures and focus groups. For centuries, corporations and researchers have relied almost exclusively on traditional measures to gain insights into consumer behavior. Oftentimes, traditional methods have failed to accurately predict consumer demand, and this has prompted corporations to explore alternative methods that will accurately forecast future sales. One of the most promising alternative methods currently being investigated is the use of physiological measures as an indication of consumer preference. This field, also referred to as neuromarketing, has blended the principles of psychology, neuroscience, and market research to explore consumer behavior from a physiological perspective. The goal of neuromarketing is to capture consumer behavior through the use of physiological sensors. This study investigated the extent to which physiological measures were correlated to consumer preferences by utilizing five physiological sensors, which included two neurological sensors (EEG and ECG), two hemodynamic sensors (TCD and fNIR), and one optic sensor (eye-tracking). All five physiological sensors were used simultaneously to capture and record physiological changes during four distinct marketing tasks. The results showed that only one physiological sensor, EEG, was indicative of concept type and intent to purchase. The remaining four physiological sensors did not show any significant differences for concept type or intent to purchase. Furthermore, Machine Learning Algorithms (MLAs) were used to determine the extent to which MLAs (Naïve Bayes, Multilayer Perceptron, K-Nearest Neighbor, and Logistic Regression) could classify physiological responses to self-reporting measures obtained during a marketing task. The results demonstrated that the Multilayer Perceptron, on average, performed better than the other MLAs for intent to purchase and concept type. It was also evident that the models fared best with the most popular concept when categorizing the data based on intent to purchase or final selection. Overall, the four models performed well at categorizing the most popular concept and gave some indication of the extent to which physiological measures are capable of capturing intent to purchase. The research study was intended to help better understand the possibilities and limitations of physiological measures in the field of market research. Based on the results obtained, this study demonstrated that certain physiological sensors are capable of capturing emotional changes, but only when the emotional response between two concepts is significantly different. Overall, physiological measures hold great promise in the study of consumer behavior, providing great insight into the relationship between emotions and intentions in market research.
- Date Issued
- 2015
- Identifier
- CFE0006345, ucf:51563
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006345
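The abstract above compares four common classifiers (Naïve Bayes, Multilayer Perceptron, K-Nearest Neighbor, Logistic Regression). The sketch below shows how such a comparison is typically set up with cross-validation; the dataset is a synthetic placeholder, since the study's physiological features and labels are not available here.

```python
# Generic model-comparison scaffold: cross-validate several classifiers on the
# same features/labels. The dataset here is a synthetic placeholder.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=400, n_features=12, random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "Multilayer Perceptron": MLPClassifier(max_iter=2000, random_state=0),
    "K-Nearest Neighbor": KNeighborsClassifier(n_neighbors=5),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)      # 5-fold accuracy
    print(f"{name:22s} mean accuracy: {scores.mean():.3f}")
```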
- Title
- a priori synthetic sampling for increasing classification sensitivity in imbalanced data sets.
- Creator
-
Rivera, William, Xanthopoulos, Petros, Wiegand, Rudolf, Karwowski, Waldemar, Kincaid, John, University of Central Florida
- Abstract / Description
-
Building accurate classifiers for predicting group membership is made difficult when data is skewed or imbalanced, which is typical of real world data sets. As a result, the classifier has the tendency to be biased towards the over-represented group. This imbalance is considered a class imbalance problem, which will induce bias into the classifier, particularly when the imbalance is high. Class imbalance data usually suffers from data intrinsic properties beyond that of imbalance alone. The problem is intensified with larger levels of imbalance, most commonly found in observational studies. Extreme cases of class imbalance are commonly found in many domains, including fraud detection, mammography of cancer, and post-term births. These rare events are usually the most costly or have the highest level of risk associated with them and are therefore of most interest. To combat class imbalance, the machine learning community has relied upon embedded, data preprocessing, and ensemble learning approaches. Exploratory research has linked several factors that perpetuate the issue of misclassification in class imbalanced data. However, there remains a lack of understanding of the relationship between the learner and imbalanced data among the competing approaches. The current landscape of data preprocessing approaches has appeal due to the ability to divide the problem space in two, which allows for simpler models. However, most of these approaches have little theoretical basis, although in some cases there is empirical evidence supporting the improvement. The main goal of this research is to introduce newly proposed a priori based re-sampling methods that improve concept learning within class imbalanced data. The results in this work highlight the robustness of these techniques' performance within publicly available data sets from different domains containing various levels of imbalance. In this research the theoretical and empirical reasons are explored and discussed.
- Date Issued
- 2015
- Identifier
- CFE0006169, ucf:51129
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006169
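The re-sampling family that the dissertation builds on generates synthetic minority-class points to rebalance the training set. The sketch below illustrates that general idea, interpolating between minority neighbors in the spirit of SMOTE; it is not the proposed a priori methods themselves.

```python
# Generic synthetic oversampling: create new minority-class points by
# interpolating between each minority sample and one of its minority neighbors.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
majority = rng.normal(0.0, 1.0, size=(500, 2))
minority = rng.normal(2.5, 0.5, size=(25, 2))      # heavily under-represented class

k = 5
nn = NearestNeighbors(n_neighbors=k + 1).fit(minority)
_, idx = nn.kneighbors(minority)                    # idx[:, 0] is the point itself

synthetic = []
for i in range(len(minority)):
    for _ in range(19):                              # grow minority to ~20x its size
        j = idx[i, rng.integers(1, k + 1)]           # pick a random minority neighbor
        lam = rng.random()
        synthetic.append(minority[i] + lam * (minority[j] - minority[i]))

balanced_minority = np.vstack([minority, np.array(synthetic)])
print(majority.shape, balanced_minority.shape)       # classes are now roughly balanced
```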
- Title
- A Simulation-Based Task Analysis using Agent-Based, Discrete Event and System Dynamics simulation.
- Creator
-
Angelopoulou, Anastasia, Karwowski, Waldemar, Kincaid, John, Xanthopoulos, Petros, Hancock, Peter, University of Central Florida
- Abstract / Description
-
Recent advances in technology have increased the need for using simulation models to analyze tasks and obtain human performance data. A variety of task analysis approaches and tools have been proposed and developed over the years. Over 100 task analysis methods have been reported in the literature. However, most of the developed methods and tools allow for representation of the static aspects of the tasks performed by expert system-driven human operators, neglecting aspects of the work environment, i.e. physical layout, and dynamic aspects of the task. The use of simulation can help face the new challenges in the field of task analysis as it allows for simulation of the dynamic aspects of the tasks, the humans performing them, and their locations in the environment. Modeling and/or simulation task analysis tools and techniques have been proven to be effective in task analysis, workload, and human reliability assessment. However, most of the existing task analysis simulation models and tools lack features that allow for consideration of errors, workload, level of operator's expertise and skills, among others. In addition, the current task analysis simulation tools require basic training on the tool to allow for modeling the flow of the task analysis process and/or error and workload assessment. The modeling process is usually achieved using drag and drop functionality and, in some cases, programming skills. This research focuses on automating the modeling process and simulating individuals (or groups of individuals) performing tasks in a dynamic work environment in any domain. The main objective of this research is to develop a universal tool that allows for modeling and simulation of task analysis models in a short amount of time with limited need for training or knowledge of modeling and simulation theory. A Universal Task Analysis Simulation Modeling (UTASiMo) tool can be used for automatically generating simulation models that analyze the tasks performed by human operators. UTASiMo is a multi-method modeling and simulation tool developed as a combination of agent-based, discrete event, and system dynamics simulation models. A generic multi-method modeling and simulation framework, named the 3M&S Framework, as well as the Unified Modeling Language, have been used for the design of the conceptual model and the implementation of the simulation tool. UTASiMo-generated models are dynamically created during run-time based on user inputs. The simulation results include estimations of operator workload, task completion time, and probability of human errors based on human operator variability and task structure.
- Date Issued
- 2015
- Identifier
- CFE0006252, ucf:51040
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006252
- Title
- A Framework for Measuring and Analyzing Customer Satisfaction at Computer Service Company using Lean Six Sigma.
- Creator
-
Abboodi, Mohammed, Elshennawy, Ahmad, Rabelo, Luis, Xanthopoulos, Petros, University of Central Florida
- Abstract / Description
-
The computer service industry has been expanding dramatically due to the increase in the number of computing machines in the last two decades. The entrance of large companies into the market and the release of online tools that are able to diagnose and troubleshoot hardware and software issues have boosted the competition. In the meantime, many of the small and medium size companies find themselves unable to keep their customers satisfied, since their competitors provide high quality service at lower cost. The lack of a good measurement system to assess and analyze the satisfaction level with the provided service is the fundamental cause of customer decline. The aim of this study is to construct a robust framework to measure customer satisfaction and highlight the root causes of dissatisfaction in the computer service sector. This framework brings together the key aspects of Six Sigma and SERVQUAL instruments into a structured approach to measure and analyze customer satisfaction with computer services. It deploys the DMAIC problem-solving methodology along with the SERVQUAL model, which contributes the service dimensions and the gap analysis technique. The literature review indicates there have not been enough studies conducted to integrate Lean Six Sigma with SERVQUAL. To explore the effectiveness of the current framework, a computer service company has been selected. The satisfaction levels are calculated and the root causes of dissatisfaction are identified. With a low overall customer satisfaction level, the company did not fulfill its customer requirements due to five major causes. Eliminating those causes will boost customer satisfaction, reduce the cost of acquiring new customers, and improve the company's performance in general.
- Date Issued
- 2014
- Identifier
- CFE0005117, ucf:50751
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005117
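SERVQUAL-style gap analysis, as used in the framework above, scores each service dimension as perception minus expectation; negative gaps flag the dimensions driving dissatisfaction. A tiny numeric sketch with made-up survey means:

```python
# SERVQUAL gap sketch: gap = mean perception score - mean expectation score
# per dimension (1-7 Likert scale). The numbers below are invented examples.
expectations = {"tangibles": 5.8, "reliability": 6.5, "responsiveness": 6.3,
                "assurance": 6.1, "empathy": 5.9}
perceptions  = {"tangibles": 5.6, "reliability": 5.4, "responsiveness": 4.9,
                "assurance": 5.8, "empathy": 5.5}

gaps = {dim: perceptions[dim] - expectations[dim] for dim in expectations}
for dim, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    flag = "  <-- priority" if gap <= -1.0 else ""
    print(f"{dim:15s} gap = {gap:+.1f}{flag}")

overall = sum(gaps.values()) / len(gaps)
print(f"overall unweighted gap: {overall:+.2f}")
```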
- Title
- A framework for interoperability on the United States electric grid infrastructure.
- Creator
-
Laval, Stuart, Rabelo, Luis, Zheng, Qipeng, Xanthopoulos, Petros, Ajayi, Richard, University of Central Florida
- Abstract / Description
-
Historically, the United States (US) electric grid has been a stable one-way power delivery infrastructure that supplies centrally-generated electricity to its predictably consuming demand. However, the US electric grid is now undergoing a huge transformation from a simple and static system to a complex and dynamic network, which is starting to interconnect intermittent distributed energy resources (DERs), portable electric vehicles (EVs), and load-altering home automation devices that create bidirectional power flow or stochastic load behavior. In order for this grid of the future to effectively embrace the high penetration of these disruptive and fast-responding digital technologies without compromising its safety, reliability, and affordability, plug-and-play interoperability within the field area network must be enabled between operational technology (OT), information technology (IT), and telecommunication assets so that they can seamlessly and securely integrate into the electric utility's operations and planning systems in a modular, flexible, and scalable fashion. This research proposes a potential approach to simplifying the translation and contextualization of operational data on the electric grid without it being routed to the utility datacenter for a control decision. This methodology integrates modern software technology from other industries, along with utility industry-standard semantic models, to overcome information silos and enable interoperability. By leveraging industrial engineering tools, a framework is also developed to help devise a reference architecture and use-case application process that is applied and validated at a US electric utility.
- Date Issued
- 2015
- Identifier
- CFE0005647, ucf:50193
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005647
- Title
- Climate Change Impacts on Rainfed Corn Production in Malawi.
- Creator
-
Msowoya, Kondwani, Madani Larijani, Kaveh, Wang, Dingbao, Xanthopoulos, Petros, University of Central Florida
- Abstract / Description
-
Agriculture is the mainstay of the economy in Malawi and accounts for 40% of the Gross Domestic Product (GDP) and 90% of the export revenues. Corn (maize) is the major cereal crop grown as a staple food under rainfed conditions, covers over 92% of the total agricultural area, and contributes 54% of the caloric intake. Corn production is the principal occupation and major source of income for over 85% of the total population in Malawi. Issues of hunger and food insecurity for the entire nation are associated with corn scarcity and low production. Global warming is expected to cause climate change in Malawi, including changes in temperature and precipitation amounts and patterns. These climate changes are expected to affect corn production in Malawi. This study evaluates the impacts of climate change on rainfed corn production in Malawi. Lilongwe District, with about 1,045 square miles of agricultural area, has been selected as a representative area. First, outputs of 15 General Circulation Models (GCMs) under different emission scenarios are statistically downscaled. For this purpose, a weather generator (LARS-WG) is calibrated and validated for the study area, and daily precipitation as well as minimum and maximum temperature are projected for the 15 GCMs for three time horizons: the 2020s, 2050s and 2090s. Probability assessment of a bounded range with known distributions is used to deal with the uncertainties of the GCMs' outputs. These GCM outputs are weighted by considering the ability of each model to simulate historical records. AquaCrop, a new model developed by FAO that simulates the crop yield response to water deficit conditions, is employed to assess potential rainfed corn production in the study area with and without climate change. Study results indicate an average temperature increase of 0.52 to 0.94°C, 1.26 to 2.20°C and 1.78 to 3.58°C in the near-term (2020s), mid-term (2050s) and long-term (2090s) future, respectively. The expected changes in precipitation during these periods are -17 to 11%, -26 to 0%, and -29 to -3%. Corn yields are expected to change by -8.11 to 0.53%, -7.25 to -14.33%, and -13.19 to -31.86% during the same time periods. The study concludes with suggestions of some adaptation strategies that the Government of Malawi could consider to improve national food security under climate change.
- Date Issued
- 2013
- Identifier
- CFE0005036, ucf:50011
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005036
- Title
- Cost-Sensitive Learning-based Methods for Imbalanced Classification Problems with Applications.
- Creator
-
Razzaghi, Talayeh, Xanthopoulos, Petros, Karwowski, Waldemar, Pazour, Jennifer, Mikusinski, Piotr, University of Central Florida
- Abstract / Description
-
Analysis and predictive modeling of massive datasets is an extremely significant problem that arises in many practical applications. The task of predictive modeling becomes even more challenging when data are imperfect or uncertain. Real data are frequently affected by outliers, uncertain labels, and uneven distribution of classes (imbalanced data). Such uncertainties create bias and make predictive modeling an even more difficult task. In the present work, we introduce a cost-sensitive learning method (CSL) to deal with the classification of imperfect data. Typically, most traditional approaches for classification demonstrate poor performance in an environment with imperfect data. We propose the use of CSL with the Support Vector Machine, which is a well-known data mining algorithm. The results reveal that the proposed algorithm produces more accurate classifiers and is more robust with respect to imperfect data. Furthermore, we explore the best performance measures to tackle imperfect data along with addressing real problems in quality control and business analytics.
- Date Issued
- 2014
- Identifier
- CFE0005542, ucf:50298
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005542
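Cost-sensitive learning with an SVM, as described above, is commonly realized by penalizing errors on the minority class more heavily than on the majority class. The sketch below uses scikit-learn's standard class_weight mechanism as a stand-in; the dissertation's exact CSL formulation and cost settings are not reproduced.

```python
# Cost-sensitive SVM sketch: weight the minority class more heavily so its
# recall improves on an imbalanced synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import recall_score

X, y = make_classification(n_samples=2000, weights=[0.95, 0.05],  # 5% minority
                           n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

plain = SVC().fit(X_tr, y_tr)
costed = SVC(class_weight={0: 1, 1: 10}).fit(X_tr, y_tr)  # 10x cost on minority errors

for name, model in [("plain SVM", plain), ("cost-sensitive SVM", costed)]:
    rec = recall_score(y_te, model.predict(X_te))          # minority-class recall
    print(f"{name:20s} minority recall: {rec:.2f}")
```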
- Title
- A Framework for the Development of a Model for Successful, Sustained Lean Implementation and Improvement.
- Creator
-
Sisson, Julie, Elshennawy, Ahmad, Rabelo, Luis, Xanthopoulos, Petros, Porter, Robert, University of Central Florida
- Abstract / Description
-
Lean is a business philosophy focused on shortening lead times by removing waste and concentrating on value-added processes. When implemented successfully, it not only allows for cost reduction while improving quality, but it can also position a company to achieve tremendous growth. The problem is that, though many companies are attempting to implement lean, it is estimated that only 2-3% are achieving the desired level of success. The purpose of this research is to identify the key interrelated components of successful lean transformation. To this end, a thorough literature review was conducted, and the findings indicate six key constructs that can act as enablers or inhibitors to implementing and sustaining lean. A theoretical framework was developed that integrates these constructs and develops research propositions for each. A multiple-case study analysis was then used to test the framework on four companies that have achieved successful, sustained results from their lean implementation in order to validate the model. The resulting model provides companies that are planning to implement lean with tangible actions that can be taken to make their lean transformations more successful.
- Date Issued
- 2014
- Identifier
- CFE0005262, ucf:50608
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005262
- Title
- Safety Climate and Safety Outcomes in Aircraft Maintenance: A Mediating Effect of Employee Turnover and Safety Motivation.
- Creator
-
Alnoaimi, Muhanna, Karwowski, Waldemar, Xanthopoulos, Petros, Hancock, Peter, Mikusinski, Piotr, University of Central Florida
- Abstract / Description
-
Aircraft maintenance is viewed as a critical safety component in the general and military aviation industries, and thus it is crucial to identify the factors that may affect aircraft maintenance. Because safety climate is considered a leading indicator of safety performance and safety outcomes, this study utilized the safety climate approach to develop a model which can explain the relationships between employee turnover, safety motivation, self-reported unsafe acts, reporting of unsafe behaviors, incidents, and injuries in the aviation maintenance environment. This study included a sample of 283 technicians in military aircraft maintenance units who participated in a cross-sectional random survey. Data collected were analyzed using Exploratory Factor Analysis (EFA) and Structural Equation Modeling (SEM) techniques. A structural model that fitted the data was developed, which predicted 64% of the variance in employee turnover, 7% of the variance in safety motivation, 20% of the variance in unsafe acts, 41% of the variance in reporting unsafe behavior, and 21% of the variance in workplace injuries. The results indicate that employees who report a perception of high turnover exhibit decreased safety motivation and increased unsafe acts, which lead to higher levels of workplace injuries. The perception of safety climate was identified as an antecedent to safety performance and safety outcomes. Additionally, the effects of control variables such as age and education were tested. The implications for safety management in aircraft maintenance were also discussed. This study provides directions for future research on the turnover of aircraft maintenance technicians, safety performance, and safety outcomes.
- Date Issued
- 2015
- Identifier
- CFE0005753, ucf:50097
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005753
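Exploratory factor analysis of survey items, as used in the study above, looks for a small number of latent factors that explain the correlations among observed responses. The sketch below is a minimal illustration with scikit-learn's FactorAnalysis on placeholder data; the actual survey items and the SEM model are not reproduced.

```python
# Minimal EFA sketch: fit a 2-factor model to synthetic "survey" data built
# from two latent variables, then inspect the estimated item loadings.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 300
latent = rng.normal(size=(n, 2))                 # e.g. hypothetical "climate" and "turnover" factors
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.2],   # items 1-3 load on factor 1
                     [0.1, 0.8], [0.0, 0.9], [0.2, 0.7]])  # items 4-6 load on factor 2
items = latent @ loadings.T + rng.normal(0, 0.3, size=(n, 6))

fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
print(np.round(fa.components_.T, 2))             # estimated loadings, one row per item
```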