Current Search: Wang, Wei
-
-
Title
-
High Speed Turbo TCM OFDM for UWB and Powerline System.
-
Creator
-
WANG, YANXIA, Wei, Lei, University of Central Florida
-
Abstract / Description
-
Turbo Trellis-Coded Modulation (TTCM) is an attractive scheme for higher data rate transmission, since it combines the impressive near-Shannon-limit error-correcting ability of turbo codes with the high spectral efficiency of TCM codes. We build a punctured parity-concatenated trellis code in which a TCM code is used as the inner code and a simple parity-check code is used as the outer code. It can be constructed from simple repetition, interleavers, and TCM, and it functions as standard TTCM but with much lower complexity for real-world implementation. An iterative bit-MAP decoding algorithm is associated with the coding scheme. Orthogonal Frequency Division Multiplexing (OFDM) has been a promising solution for efficiently capturing multipath energy in highly dispersive channels and delivering high data rate transmission. One of the UWB proposals in the IEEE P802.15 WPAN project is to use a multi-band OFDM system with punctured convolutional codes for UWB channels, supporting data rates up to 480 Mb/s. The HomePlug networking system, which uses power line wiring as its medium, also selects OFDM as its modulation scheme because of its inherent adaptability in the presence of frequency-selective channels, its resilience to jammer signals, and its robustness to the impulsive noise of the power line channel. The main idea behind OFDM is to split the transmitted data sequence into N parallel sequences of symbols and transmit them on different frequencies. This structure enables a simple equalization scheme and resists multipath propagation; however, some carriers can be strongly attenuated, so a powerful channel encoder, combined with frequency and time interleaving, must be incorporated. We examine the possibility of improving the proposed OFDM system over the UWB channel and the HomePlug power line channel by using our Turbo TCM with QAM constellations for higher data rate transmission.
The study shows that the system can offer much higher spectral efficiency: for example, 1.2 Gbps for OFDM/UWB, which is 2.5 times higher than the current standard, and 39 Mbps for OFDM/HomePlug 1.0, which is 3 times higher than the current standard. We show several requirements that are essential to achieving these high rates, such as frequency and time diversification and multi-level error protection. The results have been confirmed by density evolution. The effect of impulsive noise on the TTCM-coded OFDM system is also evaluated, and a modified iterative bit-MAP decoder is provided for channels with impulsive noise of differing impulsivity.
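The serial-to-parallel split and per-carrier transmission described in the abstract can be sketched as an IFFT-based modulator with a cyclic prefix. This is a generic, idealized illustration of the OFDM principle; the carrier count, prefix length, and QPSK mapping are arbitrary choices for the sketch, not the dissertation's parameters:

```python
import numpy as np

def ofdm_modulate(symbols, n_carriers=64, cp_len=16):
    """Map a block of constellation symbols onto n_carriers subcarriers
    via IFFT and prepend a cyclic prefix (illustrative parameters)."""
    blocks = symbols.reshape(-1, n_carriers)      # serial -> parallel
    time_blocks = np.fft.ifft(blocks, axis=1)     # one IFFT per OFDM symbol
    cp = time_blocks[:, -cp_len:]                 # cyclic prefix guards against multipath
    return np.hstack([cp, time_blocks]).ravel()

def ofdm_demodulate(signal, n_carriers=64, cp_len=16):
    """Strip the cyclic prefix and recover the subcarrier symbols via FFT."""
    blocks = signal.reshape(-1, n_carriers + cp_len)[:, cp_len:]
    return np.fft.fft(blocks, axis=1).ravel()

# Round trip over an ideal channel: QPSK symbols survive unchanged.
rng = np.random.default_rng(0)
qpsk = (rng.choice([-1, 1], 128) + 1j * rng.choice([-1, 1], 128)) / np.sqrt(2)
rx = ofdm_demodulate(ofdm_modulate(qpsk))
assert np.allclose(rx, qpsk)
```

In a real channel the per-carrier attenuation mentioned above is what the outer code and interleaving must compensate for; this sketch shows only the modulation round trip.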
-
Date Issued
-
2006
-
Identifier
-
CFE0000943, ucf:46745
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000943
-
-
Title
-
Prototype Development in General Purpose Representation and Association Machine Using Communication Theory.
-
Creator
-
Li, Huihui, Wei, Lei, Rahnavard, Nazanin, Vosoughi, Azadeh, Da Vitoria Lobo, Niels, Wang, Wei, University of Central Florida
-
Abstract / Description
-
The study of biological systems has been an intense research area in neuroscience and cognitive science for decades. The human brain is an intelligent system that integrates various types of sensory information and processes them intelligently; neurons, as activated brain cells, help the brain make instant, rough decisions. Since the 1950s, researchers have attempted to understand the strategies biological systems employ and to translate them into machine-based algorithms. Modern computers have been developed to meet our need to handle computational tasks that our brains cannot perform with precision and speed. Most existing man-made intelligent systems, however, are designed for specific purposes: modern computers solve sophisticated problems based on fixed representation and association formats, rather than employing versatile approaches to explore unsolved problems.
Because of these limitations of conventional machines, the General Purpose Representation and Association Machine (GPRAM) system is proposed, focusing on a versatile approach with hierarchical representation and association structures to make quick, rough assessments across multiple tasks. Drawing on lessons from neuroscience, error control coding, and digital communications, a prototype GPRAM system employing (7,4) Hamming codes and short Low-Density Parity Check (LDPC) codes is implemented. Several types of learning processes are presented, demonstrating the capability of GPRAM to handle multiple tasks.
Furthermore, a study of recognizing low-resolution simple patterns and face images using an Image Processing Unit (IPU) structure for the GPRAM system is presented. The IPU structure consists of a randomly constructed LDPC code, an iterative decoder, a switch and scaling stage, and decision devices. All input images have been severely degraded to mimic the Visual Information Variability (VIV) experienced in the human visual system. The numerical results show that 1) the IPU can reliably recognize simple pattern images of different shapes and sizes; 2) the IPU demonstrates excellent multi-class recognition performance on heavily degraded face images, with results comparable to popular machine learning recognition methods applied to images without any quality degradation; and 3) several methods can further improve IPU recognition performance, e.g., designing various detection and power scaling methods and constructing specific LDPC codes with large minimum girth.
Finally, novel methods to optimize M-ary PSK, M-ary DPSK, and dual-ring QAM signaling with non-equal symbol probabilities over AWGN channels are presented. In digital communication systems, MPSK, MDPSK, and dual-ring QAM signaling with equiprobable symbols have been well analyzed and are widely used in practice. Inspired by biological systems, we investigate signaling with non-equiprobable symbol probabilities, since biological systems are highly unlikely to follow the ideal setting and uniform construction of a single type of system. The results show that the optimized systems have lower error probabilities than conventional systems, and the improvements are dramatic. Although communication systems serve as the testing environment, our final goal is clearly to extend current communication theory to accommodate, or better understand, bio-neural information processing systems.
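The (7,4) Hamming code used in the GPRAM prototype carries 4 data bits in a 7-bit codeword and corrects any single bit error. A minimal sketch of textbook (7,4) Hamming encoding and syndrome decoding (not the dissertation's implementation):

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit (7,4) Hamming codeword.
    Positions 1, 2, 4 (1-based) hold parity bits; 3, 5, 6, 7 hold data."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct at most one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based index of flipped bit, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

# A single channel error is located by the syndrome and corrected.
word = [1, 0, 1, 1]
corrupted = hamming74_encode(word)
corrupted[5] ^= 1
assert hamming74_decode(corrupted) == word
```

The same syndrome idea, generalized to sparse random parity checks and iterative decoding, is what the LDPC-based IPU structure builds on.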
-
Date Issued
-
2017
-
Identifier
-
CFE0006758, ucf:51846
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006758
-
-
Title
-
Managing IO Resource for Co-running Data Intensive Applications in Virtual Clusters.
-
Creator
-
Huang, Dan, Wang, Jun, Zhou, Qun, Sun, Wei, Zhang, Shaojie, Wang, Liqiang, University of Central Florida
-
Abstract / Description
-
Today's big data computing platforms employ resource management systems such as Yarn, Torque, Mesos, and Google Borg to share physical computing resources among many users or applications. Given virtualization and resource management systems, users can launch their applications on the same node with low mutual interference and low CPU and memory management overhead. However, challenges remain before these systems can fully manage the IO resources of Big Data File Systems (BDFS) and shared network facilities. In this study, we systematically examine three IO management problems: proportional sharing of block IO in container-based virtualization, network IO contention in MPI-based HPC applications, and data migration overhead in HPC workflows. To improve proportional sharing, we develop a prototype system called BDFS-Container, which containerizes BDFS at the Linux block IO level. Central to BDFS-Container, we propose and design a proactive IOPS-throttling mechanism named IOPS Regulator, which improves proportional IO sharing under the BDFS IO pattern by 74.4% on average. For network IO resource management, we exploit virtual switches to facilitate network traffic manipulation and reduce mutual interference on the network for in-situ applications; to allocate network bandwidth dynamically when it is needed, we adopt SARIMA-based techniques to analyze and predict the MPI traffic issued from simulations. Third, to solve the data migration problem in small to medium-sized HPC clusters, we propose constructing a side IO path, named SideIO, to explicitly direct analysis data to a BDFS that co-locates computation with data. In experiments with two real-world scientific workflows, SideIO completely avoids the most expensive data movement overhead and achieves up to 3x speedups compared with current solutions.
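Proportional block-IO sharing of the kind BDFS-Container targets is commonly built on token buckets: each container is granted a weight-proportional share of the node's IOPS budget, enforced by a per-container bucket. The sketch below is an illustrative token-bucket throttler under that general design; it is not the dissertation's IOPS Regulator, and all names and parameters are hypothetical:

```python
import time

class IopsThrottle:
    """Token-bucket IOPS limiter: at most rate_iops IOs per second on
    average, with bursts of up to `burst` IOs (illustrative sketch)."""
    def __init__(self, rate_iops, burst, now=None):
        self.rate = rate_iops          # tokens refilled per second
        self.capacity = burst          # maximum tokens held
        self.tokens = float(burst)
        self.last = time.monotonic() if now is None else now

    def try_io(self, now=None):
        """Return True if one IO may proceed now, else False (throttled)."""
        now = time.monotonic() if now is None else now
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

def proportional_rates(total_iops, weights):
    """Split a node's IOPS budget across containers in proportion to weight."""
    total_weight = sum(weights.values())
    return {c: total_iops * w / total_weight for c, w in weights.items()}

# Two containers sharing 3000 IOPS at weights 2:1 get 2000 and 1000 IOPS.
rates = proportional_rates(3000, {"c1": 2, "c2": 1})
assert rates == {"c1": 2000.0, "c2": 1000.0}
```

A proactive regulator like the one described would adjust these rates from observed IO patterns rather than keeping them static.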
-
Date Issued
-
2018
-
Identifier
-
CFE0007195, ucf:52268
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007195
-
-
Title
-
Emotional Intelligence in Organizational Social Networks.
-
Creator
-
Hermsdorfer, Andrea, Joseph, Dana, Fritzsche, Barbara, Wang, Wei, University of Central Florida
-
Abstract / Description
-
This study examined the role of emotional intelligence in relationships. Drawing on the notion that individuals high in emotional intelligence should have more social ties and stronger relationships within those ties, this study used social network analysis to examine the extent to which emotional intelligence is positively related to social network centrality. I hypothesized that emotional intelligence would be positively related to centrality in four networks: advice, friendship, support, and positive affect presence. The hypotheses were not supported in this study; in spite of this, the incremental validity results suggest a relationship between emotional intelligence and network centrality that may emerge in future research.
-
Date Issued
-
2016
-
Identifier
-
CFE0006686, ucf:51927
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006686
-
-
Title
-
Detecting Anomalies from Big Data System Logs.
-
Creator
-
Lu, Siyang, Wang, Liqiang, Zhang, Shaojie, Zhang, Wei, Wu, Dazhong, University of Central Florida
-
Abstract / Description
-
Nowadays, big data systems (e.g., Hadoop and Spark) are being widely adopted in many domains, such as manufacturing, healthcare, education, and media, to offer effective data solutions. A common problem in big data systems is the anomaly, i.e., a status that deviates from normal execution, which degrades computational performance or kills running programs. Detecting anomalies and analyzing their causes is becoming a necessity, and an effective and economical approach is to analyze system logs. Big data systems produce numerous unstructured logs that contain buried valuable information; however, manually detecting anomalies from system logs is a tedious and daunting task.
This dissertation proposes four approaches that can accurately and automatically analyze anomalies from big data system logs without extra monitoring overhead. To detect abnormal tasks in Spark logs and analyze root causes, we design a utility to conduct fault injection and collect logs from multiple compute nodes. (1) Our first method is a statistical approach that locates abnormal tasks and calculates the weights of factors for analyzing root causes. The experiment considers four potential root causes: CPU, memory, network, and disk I/O. The experimental results show that the proposed approach is accurate both in detecting abnormal tasks and in finding the root causes. (2) To give a more reasonable probability result and avoid ad hoc factor-weight calculation, we propose a neural network approach that leverages a General Regression Neural Network (GRNN) to identify root causes of abnormal tasks; the likelihood of each reported root cause is presented to users according to the factors weighted by the GRNN. (3) To further improve anomaly detection by avoiding manual feature extraction, we propose a novel approach leveraging Convolutional Neural Networks (CNNs). Our model automatically learns event relationships in system logs and detects anomalies with high accuracy. The deep neural network consists of logkey2vec embeddings, three 1D convolutional layers, a dropout layer, and max pooling. In our experiments, this CNN-based approach achieves better accuracy than approaches using Long Short-Term Memory (LSTM) and Multilayer Perceptron (MLP) networks at detecting anomalies in Hadoop Distributed File System (HDFS) logs. (4) To analyze system logs more accurately, we extend the CNN-based approach with two attention schemes that focus on different features of the CNN's output. We evaluate our approaches on several benchmarks, and the attention-based CNN model shows the best performance among all state-of-the-art methods.
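The shape of the described pipeline (embedded log keys, 1D convolution over the sequence, max pooling, then a score) can be sketched at the array level. Everything below is a toy illustration with made-up dimensions and random weights; it is not the dissertation's logkey2vec/CNN model:

```python
import numpy as np

rng = np.random.default_rng(1)

def conv1d(x, kernels):
    """Valid 1D convolution: x is (seq_len, emb_dim), kernels is
    (n_filters, width, emb_dim); returns (seq_len - width + 1, n_filters)."""
    width = kernels.shape[1]
    windows = np.stack([x[i:i + width] for i in range(len(x) - width + 1)])
    return np.einsum('swd,fwd->sf', windows, kernels)

# Hypothetical sizes: a session of 20 log keys, 8-dim embeddings,
# 4 convolution filters of width 3, then global max pooling.
embedding = rng.normal(size=(50, 8))      # one row per distinct log key
session = rng.integers(0, 50, size=20)    # sequence of log-key ids
x = embedding[session]                    # (20, 8) embedded log sequence
kernels = rng.normal(size=(4, 3, 8))
features = conv1d(x, kernels).max(axis=0)   # global max pool -> (4,)
score = 1 / (1 + np.exp(-features.sum()))   # stand-in anomaly score
assert features.shape == (4,)
```

In the real model the embeddings and filters are learned, there are three stacked convolutional layers plus dropout, and the final score comes from a trained classifier rather than a fixed sigmoid of the feature sum.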
-
Date Issued
-
2019
-
Identifier
-
CFE0007673, ucf:52499
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007673
-
-
Title
-
More is not always better: Unpacking the cognitive process underlying introspective psychological measurement.
-
Creator
-
Lapalme, Matthew, Wang, Wei, Fritzsche, Barbara, Jentsch, Florian, University of Central Florida
-
Abstract / Description
-
For decades, psychometricians have measured non-cognitive constructs with little attention to the cognitive processes underlying responses. Previous advances in psychometrics suggest that traditional, cognitively oriented approaches may in fact yield construct deficiency and spurious results when applied to non-cognitive measurement. This thesis highlights the importance of specifying an ideal point response process for non-cognitive measurement and empirically demonstrates that an ideal point response process undergirds self-reported personality and attitude measurement. Furthermore, this thesis advances current understanding of the limitations of ideal point assumptions by exploring the moderating effects of various individual differences in motivation and ability.
-
Date Issued
-
2015
-
Identifier
-
CFE0006223, ucf:51074
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006223
-
-
Title
-
Leadership and Subordinate Engagement: A Meta-Analytic Examination of its Mechanisms using Self-Determination Theory.
-
Creator
-
Young, Henry, Wang, Wei, Joseph, Dana, Fritzsche, Barbara, University of Central Florida
-
Abstract / Description
-
Although past research has suggested that ineffective leadership is the most common reason for low levels of employee engagement, little is known about the mediating mechanisms underlying this relationship. To address this gap, I tested a theoretical model based on Self-Determination Theory (SDT; Deci & Ryan, 2000) in which two focal mechanisms, leader-member exchange (LMX) and empowerment, function in sequential order to predict the relationship between Full Range Leadership and subordinate engagement. Results showed that transactional leadership had both positive and negative indirect effects on engagement, suggesting that transactional leadership is a "double-edged sword" as a predictor of subordinate engagement. In contrast, the indirect effects between transformational leadership and engagement were consistently positive. As such, current mediation models of leadership can benefit by drawing from SDT to investigate the unfolding process of leadership through sequential mediation.
-
Date Issued
-
2017
-
Identifier
-
CFE0006675, ucf:51250
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006675
-
-
Title
-
Who is the best judge of personality: Investigating the role of relationship depth and observational breadth on the accuracy of third-party ratings.
-
Creator
-
Tindall, Mitchell, Jentsch, Kimberly, Szalma, James, Wang, Wei, Piccolo, Ronald, University of Central Florida
-
Abstract / Description
-
To date, the vast majority of research on personality in IO psychology has relied on self-report assessments. Despite support for the utility of third-party assessments, IO psychologists have only just begun extensive research in this area. Connelly and Ones (2010) conducted a meta-analysis demonstrating that the accuracy of third-party ratings improved as intimacy between the judge and the target grew, with the exception of predicting behavioral criteria, where non-intimates maintained superior predictability (Connelly & Ones, 2010). This was later contradicted by an investigation that found the best predictive validity for third-party assessments taken from personal acquaintances rather than work colleagues (Connelly & Hulsheger, 2012). The current study investigates how the depth of the relationship and the breadth of behavioral observations differentially moderate the relationship between third-party personality assessments and accuracy criteria (i.e., self-other overlap, discriminant validity, and behavior). Results indicate that both depth and breadth affect the accuracy criteria, and that they do so differentially based on trait visibility and evaluativeness. These findings are discussed along with practical implications and limitations of the research.
-
Date Issued
-
2015
-
Identifier
-
CFE0006014, ucf:51007
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006014
-
-
Title
-
Examining the impact of a fatigue intervention on job performance: A longitudinal study across United States hospitals.
-
Creator
-
Gregory, Megan, Salas, Eduardo, Wang, Wei, Fritzsche, Barbara, Burke, Shawn, University of Central Florida
-
Abstract / Description
-
Fatigue in healthcare providers has been linked to dangerous outcomes for patients, including medical errors, surgical complications, and accidents. Resident physicians, who traditionally work long hours on minimal sleep, are among the most fatigued. In an attempt to mitigate the impact of fatigue on resident physician performance and improve patient safety, the Accreditation Council for Graduate Medical Education (ACGME) implemented a fatigue intervention program in 2011 for medical residency programs in the United States. This caused a significant decrease in the number of hours that first-year residents were permitted to work compared with hours worked by first-year residents in prior years. While studies investigating the impact of the 2011 ACGME fatigue intervention on outcomes are limited thus far, some initial evidence is promising: for instance, Pepper, Schweinfurth, and Herrin (2014) found that the rate of transfers to the intensive care unit after a code blue significantly decreased from pre- to post-intervention. However, it is not yet understood which variables drive positive changes in patient outcomes, nor how long such changes take to occur. The purpose of this study was therefore to examine the effect the 2011 ACGME fatigue intervention has had on job performance in healthcare providers in U.S. hospitals. The study addressed this question from both a micro perspective, drawing on cognitive theories (Kahneman, 1973, 2011) and skill acquisition theory (Fitts, 1964; Fitts & Posner, 1967), and a macro perspective, drawing on organizational change theories (DiMaggio & Powell, 1983).
This study combined public-use databases provided by the Centers for Medicare and Medicaid Services (CMS). Specifically, 1,277 hospitals in the United States were examined over a five-year period on job performance behaviors to determine whether there was significant change from pre-intervention to post-intervention. Hospitals were categorized as control hospitals (n = 594) or intervention hospitals (n = 683). Intervention hospitals were further analyzed according to their resident-to-patient-bed ratio, using guidelines provided by Patel et al. (2014): very low resident-to-bed ratio hospitals (n = 174), low resident-to-bed ratio hospitals (n = 287), high resident-to-bed ratio hospitals (n = 143), and very high resident-to-bed ratio hospitals (n = 79). Further, organizational size was examined as a moderator. The study used discontinuous growth modeling (Bliese, 2008; Ployhart, 2014; J. D. Singer & Willett, 2003) to analyze the data, which allowed investigation of the magnitude and rate of change from pre- to post-intervention. Results show that employee job performance improved significantly over time across both intervention and control hospitals. In particular, job performance improved abruptly at the transition period (i.e., when the intervention was introduced) and continued to improve gradually throughout the post-intervention period; yet these results held for both intervention and control hospitals. However, exploratory analyses comparing control hospitals with very high resident-to-bed ratio hospitals found that the latter group improved significantly more at the transition period. I therefore conclude that the fatigue intervention may have some effect on job performance, but this effect may be visible only in very high resident-to-bed ratio hospitals. Organizational size was not a significant moderator of the relationship. Future research is needed to confirm these findings.
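Discontinuous growth modeling separates a pre-intervention trend from an abrupt jump at the transition point and an extra post-transition slope. A single-series least-squares sketch of that design follows; the study itself fits multilevel models across hospitals, and the series and coefficients here are synthetic:

```python
import numpy as np

def discontinuous_growth_fit(y, transition):
    """Fit y ~ intercept + time + jump-at-transition + post-transition
    slope by ordinary least squares (single-series sketch of the
    discontinuous growth design)."""
    t = np.arange(len(y), dtype=float)
    jump = (t >= transition).astype(float)      # abrupt change at transition
    recover = np.maximum(0.0, t - transition)   # gradual post-transition trend
    X = np.column_stack([np.ones_like(t), t, jump, recover])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [intercept, pre-slope, jump size, extra post-slope]

# Synthetic performance series: pre-slope 0.5, jump of 3.0 at t = 10,
# and an extra 0.2 improvement per period afterwards.
t = np.arange(20, dtype=float)
y = 1.0 + 0.5 * t + 3.0 * (t >= 10) + 0.2 * np.maximum(0.0, t - 10)
coef = discontinuous_growth_fit(y, transition=10)
assert np.allclose(coef, [1.0, 0.5, 3.0, 0.2])
```

The jump term captures the "abrupt improvement at the transition period" and the recovery term the "gradual improvement throughout the post-intervention period" reported above; comparing these coefficients between intervention and control groups is what identifies an intervention effect.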
-
Date Issued
-
2015
-
Identifier
-
CFE0005952, ucf:50807
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005952
-
-
Title
-
Rethinking Routing and Peering in the era of Vertical Integration of Network Functions.
-
Creator
-
Dey, Prasun, Yuksel, Murat, Wang, Jun, Ewetz, Rickard, Zhang, Wei, Hasan, Samiul, University of Central Florida
-
Abstract / Description
-
Content providers typically control digital content consumption services and earn most of their revenue by implementing an "all-you-can-eat" model via subscriptions or hyper-targeted advertisements. Revamping the existing Internet architecture and design, vertical integration, in which a content provider and an access ISP act as a single body in a sugarcane form, appears to be the recent trend. As this vertical integration trend emerges in the ISP market, it is questionable whether the existing routing architecture will suffice in terms of sustainable economics, peering, and scalability. The current routing is expected to need careful modifications and smart innovations to ensure effective and reliable end-to-end packet delivery. This involves developing new features for handling traffic with reduced latency, tackling routing scalability issues in a more secure way, and offering new services at lower cost. Considering that the prices of DRAM and TCAM in legacy routers are not necessarily decreasing at the desired pace, cloud computing can be a great solution for managing the increasing computation and memory complexity of routing functions in a centralized manner with optimized expenses. Focusing on the attributes of existing routing cost models, and by exploring a hybrid approach to SDN, we also compare recent trends in cloud pricing (for both storage and service) to evaluate whether integrating cloud services with legacy routing would be economically beneficial for improved cost-efficiency. In terms of peering, using the US as a case study, we show the overlaps between access ISPs and content providers to explore the viability of future peering between the new, emerging, content-dominated sugarcane ISPs and the health of Internet economics.
To this end, we introduce meta-peering, a term that encompasses automation efforts related to peering, from identifying a list of ISPs likely to peer, to injecting control-plane rules, to continuously monitoring for and flagging any violation. Meta-peering is one of the many outcroppings of the vertical integration procedure and could be offered to ISPs as a standalone service.
-
Date Issued
-
2019
-
Identifier
-
CFE0007797, ucf:52351
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007797
-
-
Title
-
Getting The Work Out of Workouts: Evaluating the Effectiveness and Outcomes of a Physical Exercise Motivational Intervention For Older Workers.
-
Creator
-
Sholar-Fetherlin, Brandon, Fritzsche, Barbara, Smither, Janan, Wang, Wei, Fragala, Maren, University of Central Florida
-
Abstract / Description
-
To mitigate their estimated $300 billion in annual health-related losses, many companies have instituted workplace wellness initiatives designed to promote physical activity among their employees, improving the overall health of their workforce. Though middle-aged and older workers may potentially enjoy the greatest physical, stress and cognitive benefits from regular exercise, workplace wellness programs have been less successful in attracting such employees. This study developed and tested...
Show moreTo mitigate their estimated $300 billion in annual health-related losses, many companies have instituted workplace wellness initiatives designed to promote physical activity among their employees, improving the overall health of their workforce. Though middle-aged and older workers may potentially enjoy the greatest physical, stress and cognitive benefits from regular exercise, workplace wellness programs have been less successful in attracting such employees. This study developed and tested a 6-week exercise motivation intervention designed to meet the needs of sedentary, older working adults and to determine what non-physical benefits might result from increased levels of physical exercise. The intervention, based primarily on Self-Determination Theory, included feedback on individually-made, realistic, process-specific exercise goals that and provided guidance from knowledgeable exercise professionals in addition to support group of socially-similar individuals to aid in coping and adherence. The intervention was built and delivered entirely online to fit better with the sample's considerable time demands. The motivational intervention was delivered to a sample of 30 mostly-older working adults and was successful in significantly improving activity levels and overall affect while decreasing stress. No significant differences were detected in measures of personal resources, work engagement, work effort and task performance. The implications and recommendations for future research are discussed.
Show less
-
Date Issued
-
2017
-
Identifier
-
CFE0006660, ucf:51235
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006660
-
-
Title
-
VIE-ing for the Position: An Examination of the Motivational Antecedents of Response Distortion.
-
Creator
-
Mihm, David, Jentsch, Kimberly, Wang, Wei, Joseph, Dana, Piccolo, Ronald, University of Central Florida
-
Abstract / Description
-
Faking on self-report personality tests is a widespread practice which degrades the construct validity of personality tests when they are used in personnel selection contexts and may lead to suboptimal hiring decisions (Donovan, Dwight, & Hurtz, 2003; Schmit & Ryan, 1993). While much is known about the factors which enable job applicants to successfully engage in faking (Tett, Freund, Christiansen, Fox, & Coaster, 2012), far less is known about how specific applicant perceptions throughout the hiring process influence their decision to engage in this practice. To this end, this study applied Vroom's (1964) expectancy theory to the study of applicant faking. Following the work of prior researchers (Peterson, Griffith, & Converse, 2009), this study incorporated an experimental paradigm in which participants were led to believe that they were completing a personality test as part of the hiring process. Results of the study suggested that applicant faking on personality tests within personnel selection contexts is largely driven by valence (the extent to which applicants perceive the job to which they are applying as desirable) and expectancy judgments (an applicant's self-efficacy regarding their ability to successfully engage in faking). However, the three-way interaction between valence, instrumentality, and expectancy judgments which forms the crux of Vroom's (1964) theory did not demonstrate a significant impact on subsequent faking. A positive relationship between cognitive ability and faking was also found, suggesting that highly intelligent job applicants are more prone to engage in this behavior. In addition, applicant integrity demonstrated no relationship to faking behavior, suggesting that job applicants may not view the practice as being unethical. The potential implications of these findings in real-world selection contexts are discussed.
-
Date Issued
-
2017
-
Identifier
-
CFE0006627, ucf:51298
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006627
-
-
Title
-
Improving the performance of data-intensive computing on Cloud platforms.
-
Creator
-
Dai, Wei, Bassiouni, Mostafa, Zou, Changchun, Wang, Jun, Lin, Mingjie, Bai, Yuanli, University of Central Florida
-
Abstract / Description
-
Big Data such as Terabyte and Petabyte datasets are rapidly becoming the new norm for various organizations across a wide range of industries. The widespread data-intensive computing needs have inspired innovations in parallel and distributed computing, which has been the effective way to tackle massive computing workloads for decades. One significant example is MapReduce, which is a programming model for expressing distributed computations on huge datasets, as well as an execution framework for data-intensive computing on commodity clusters. Since it was originally proposed by Google, MapReduce has become the most popular technology for data-intensive computing. While Google owns its proprietary implementation of MapReduce, an open source implementation called Hadoop has gained wide adoption in the rest of the world. The combination of Hadoop and Cloud platforms has made data-intensive computing much more accessible and affordable than ever before. This dissertation addresses the performance issue of data-intensive computing on Cloud platforms from three different aspects: task assignment, replica placement, and straggler identification. Both task assignment and replica placement are subjects closely related to load balancing, which is one of the key issues that can significantly affect the performance of parallel and distributed applications. While task assignment schemes strive to balance data processing load among cluster nodes to achieve minimum job completion time, replica placement policies aim to assign block replicas to cluster nodes according to their processing capabilities to exploit data locality to the maximum extent. Straggler identification is also one of the crucial issues data-intensive computing has to deal with, as the overall performance of parallel and distributed applications is often determined by the node with the lowest performance.
The results of extensive evaluation tests confirm that the schemes/policies proposed in this dissertation can improve the performance of data-intensive applications running on Cloud platforms.
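The MapReduce programming model described in the abstract above can be illustrated with a minimal sketch (a hypothetical, single-process Python word-count example for context only, not the dissertation's implementation or Hadoop itself):

```python
from collections import defaultdict

def map_phase(document):
    # Mapper: emit an intermediate (word, 1) pair for each word in an input split
    for word in document.split():
        yield (word.lower(), 1)

def reduce_phase(pairs):
    # Reducer: after the shuffle/sort groups pairs by key, sum the counts per key
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

# Two "input splits" standing in for blocks of a distributed file
splits = ["big data big", "data norm"]
intermediate = [pair for doc in splits for pair in map_phase(doc)]
result = reduce_phase(intermediate)
# result == {"big": 2, "data": 2, "norm": 1}
```

In a real cluster the mappers run in parallel on the nodes holding each block (data locality), which is why the task-assignment and replica-placement policies studied in the dissertation matter for performance.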
-
Date Issued
-
2017
-
Identifier
-
CFE0006731, ucf:51896
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006731