Current Search: processing
- Title
- The Impact of Technology on Management Control: Degradation, Empowerment, or Technology Dominance?
- Creator
-
Canada, Joseph, Arnold, Vicky, Roberts, Robin, Sutton, Steven, Benford, Tanya, University of Central Florida
- Abstract / Description
-
The evolution of technology brings with it the evolution of business processes. Without a doubt, technology changes how work is performed. At first glance, workplace technology appears to be a great boon to society. However, research presents opposing views on how workplace technologies impact the individual. One perspective argues that organizations utilize technology to redesign work processes, such that the worker requires less skill, autonomy, and compensation. The opposing perspective argues that organizations utilize technology to empower employees to improve efficiency and profits. This dissertation consists of three interrelated studies examining workplace technology's impact on decision makers. The first study examines the capability of an enterprise system to increase the application of scientific management techniques to middle management and, consequently, to degrade middle management's work by limiting their autonomy. The second study investigates the capability of an enterprise system to facilitate the empowerment of managers via mutual monitoring and social identification. The third study builds upon the first study by examining how limiting autonomy through technology impacts the intrinsic motivation of decision makers and, as a result, affects the decision making process. Study one applies labor process theory to explain how enterprise systems can degrade the work of middle management via scientific management techniques. The purpose of this study is to test if the expectations of labor process theory can be applied to enterprise systems. In order to test this assertion, a field survey utilizing 189 middle managers is employed and the data are analyzed using component-based structural equation modeling. The results indicate that enterprise system integration increases two scientific management techniques, formalization and performance measurement, but do not reveal a significant relationship between enterprise system integration and routinization. Interestingly, the results also indicate that routinization is the only scientific management technique, of the three studied, that directly limits the autonomy of the middle managers. Although performance measurement does not reduce autonomy directly, performance measurement interacts with routinization to reduce autonomy. This study contributes to the enterprise system literature by demonstrating enterprise systems' ability to increase the degree of scientific management applied to middle management. It also contributes to labor process theory by revealing that routinization may be the scientific management technique that determines whether other control techniques are utilized in a manner consistent with labor process theory. The ability of an enterprise system to facilitate the application of Mary Parker Follett's managerial control concepts is investigated in the second study. Specifically, Follett theorizes that information sharing facilitates the internalization of group goals and empowers individuals to have more influence and be more effective. This study employs a survey of 206 managers to test the theoretical relationships. The results indicate that enterprise system integration increases information sharing in the form of mutual monitoring, consequently leading to social identification among peer managers. Additionally, social identification among peer managers empowers managers to have more influence over the organization.
The study contributes to empowerment research by acknowledging and verifying the role that social identification plays in translating an empowering work climate into empowered managers. The study's conclusion that enterprise system integration facilitates the application of Follett's managerial control concepts extends both the enterprise system and managerial control literature. The third study builds upon study one by examining the effect that autonomy has upon the decision maker. This study marries self-determination theory and technology dominance theory to understand the role that self-determination, intrinsic motivation, and engagement have upon technology dominance. Self-determination theory asserts that higher degrees of self-determination increase intrinsic motivation. Furthermore, self-determination research finds that intrinsic motivation increases engagement, while technology dominance research indicates that lack of engagement is an antecedent of technology dominance. Thus, applying self-determination theory as a predictor of technology dominance suggests that autonomy and relatedness associated with a task increase the intrinsic motivation to complete that task and consequently increase engagement in the task. Task engagement, in turn, reduces the likelihood of technology dominance. The proposed theoretical model is tested experimentally with 83 junior-level business students. The results do not support the theoretical model; however, the findings reveal that intrinsic motivation does reduce the likelihood of technology dominance. This indicates that intrinsic motivation as a predictor of technology dominance should be further investigated. Additionally, the study contributes to the technology dominance literature by exhibiting a more appropriate operationalization of the inappropriate reliance aspect of technology dominance. This dissertation reveals that various theories concerning workplace technology and management control techniques have both validity and limitations. Labor process theorists cannot assume that all technologies and management control techniques are utilized to undermine the employee's value to the organization, as Study 2 reveals that enterprise systems and mutual monitoring lead to empowered managers. Likewise, proponents of enterprise systems cannot assume that the integrated nature of enterprise systems is always utilized in an empowering manner, as Study 1 reveals that increased performance measurement through enterprise systems can be utilized to limit managers in a routinized job environment. While the third study was unable to determine that the control features in technology affect the intrinsic motivation to complete a task, the findings do reveal that intrinsic motivation is directly related to technology dominance. The findings and theoretical refinements demonstrate that workplace technology and management control have a complicated relationship with the employee and that the various theories concerning them cannot be applied universally.
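The interaction effect reported in Study 1 (performance measurement reducing autonomy only in combination with routinization) can be illustrated with a simple moderated regression. The sketch below is a minimal Python illustration on synthetic data; it is not the component-based SEM (PLS) analysis the dissertation actually used, and all variable names and coefficients are hypothetical.

```python
import numpy as np

# Synthetic standardized scores, for illustration only (n = 189 matches the
# study's sample size, not its data).
rng = np.random.default_rng(0)
n = 189
routinization = rng.standard_normal(n)
perf_measurement = rng.standard_normal(n)
# Assumed data-generating process: performance measurement lowers autonomy
# only when routinization is high (an interaction effect).
autonomy = (-0.4 * routinization
            - 0.3 * routinization * perf_measurement
            + rng.standard_normal(n))

# Design matrix with an intercept, both main effects, and the interaction.
X = np.column_stack([np.ones(n), routinization, perf_measurement,
                     routinization * perf_measurement])
beta, *_ = np.linalg.lstsq(X, autonomy, rcond=None)
print("intercept, routinization, performance measurement, interaction:")
print(np.round(beta, 3))
```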
- Date Issued
- 2013
- Identifier
- CFE0004980, ucf:49569
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004980
- Title
- Performing Jason Robert Brown's The Last Five Years: An Exercise in Communication On Stage and Off.
- Creator
-
Sucharski, David, Niess, Christopher, Weaver, Earl, Boyde, Melinda, University of Central Florida
- Abstract / Description
-
Communication, in its most basic sense, is foundational for any personal, human interaction and relationship. As theatre artists, we are charged with communicating complex story lines, conceptual ideas, and emotion to an audience. Sound communication is paramount to every aspect of a musical production, be it communication between actors/characters, between actor and director, amongst the production team, and, arguably the most important, between the actors and the audience. My years of education as a Master of Fine Arts candidate in Musical Theatre have been spent polishing my ability to communicate physical and emotional choices with greater accuracy, depth, and truth. By staging Jason Robert Brown's musical The Last Five Years and performing the role of Jamie, this performance thesis will explore, develop, and examine my mastery of the aforementioned varied forms of communication, all of which are necessary in building a successful musical production. Research will be conducted to gather information on relevant topics, including the history of The Last Five Years, the life of Jason Robert Brown, and his musical and theatrical influences. By further understanding Brown, his life, and his ideas about his works, I hope to more fully understand and communicate the message of the musical itself. A dramatic and musical structural analysis will provide further depth and insight into the piece, with the hope of informing my production and individual performance. A thorough character analysis will provide connective tissue that will allow me, as the actor, to more effectively communicate the psychological and emotional makeup of the character Jamie. Lastly, the thesis document will culminate with a production journal, documenting the pre-production, rehearsal, and performance process. Through the journaling process, I will document and address the journey that I have experienced with the production, giving focus and attention to its many obstacles and discoveries, successes and failures, all of which have contributed to my personal growth as a young theatre artist.
- Date Issued
- 2012
- Identifier
- CFE0004324, ucf:49465
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004324
- Title
- A SYSTEMATIC ANALYSIS TO IDENTIFY, MITIGATE, QUANTIFY, AND MEASURE RISK FACTORS CONTRIBUTING TO FALLS IN NASA GROUND SUPPORT OPERATIONS.
- Creator
-
Ware, Joylene, Bush, Pamela, University of Central Florida
- Abstract / Description
-
The objective of the research was to develop and validate a multifaceted model, such as a fuzzy Analytical Hierarchy Process (AHP) model, that considers both qualitative and quantitative elements with relative significance in assessing the likelihood of falls, and to aid in the design of NASA Ground Support Operations in aerospace environments. The model represented linguistic variables that quantified significant risk factor levels. Multiple risk factors that contribute to falls in NASA Ground Support Operations are task related, human/personal, environmental, and organizational. Six subject matter experts were asked to participate in a voting system involving a survey where they judged risk factors using the fundamental pairwise comparison scale. The results were analyzed and synthesized using Expert Choice Software, which produced the relative weights for the risk factors. The following are the relative weights for these risk factors: Task Related (0.314), Human/Personal (0.307), Environmental (0.248), and Organizational (0.130). The overall inconsistency ratio for all risk factors was 0.07, which indicates the model results were acceptable. The results show that task-related risk factors are the highest cause of falls and organizational risk factors are the lowest cause of falls in NASA Ground Support Operations. The risk factor weights were validated by having two teams of subject matter experts create priority vectors separately and confirm the weights are valid. The fuzzy AHP model's usability was evaluated utilizing fifteen subjects in a repeated measures analysis. The subjects were asked to evaluate three scenarios in NASA KSC Ground Support Operations regarding various case studies and historical data. The three scenarios were Shuttle Landing Facility (SLF), Launch Complex Payloads (LCP), and Vehicle Assembly Building (VAB). The Kendall Coefficient of Concordance for assessment agreement between and within the subjects was 1.00; therefore, the appraisers were applying essentially the same standard when evaluating the scenarios. In addition, a NASA subject matter expert was asked to evaluate the three scenarios, and the predicted value was compared to the accepted value. The results from the subject matter expert for the model usability confirmed that the predicted value and accepted value for the likelihood rating were similar; the percentage errors for the three scenarios were 0%, 33%, and 0%, respectively. Multiple descriptive statistics for a 95% confidence interval and t-test are the following: coefficient of variation (21.36), variance (0.251), mean (2.34), and standard deviation (0.501). Model validation ensured agreement with the NASA standard. The model validation process was partitioned into three components: reliability, objectivity, and consistency. The model was validated by comparing the fuzzy AHP model to the NASA-accepted model. The results indicate there was minimal variability with fuzzy AHP modeling. As a result, the fuzzy AHP model is confirmed valid. Future research includes developing fall protection guidelines.
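The pairwise-comparison step described above can be sketched in Python. This is a minimal illustration assuming a hypothetical 4x4 judgment matrix over the four risk-factor categories; it uses the standard principal-eigenvector method and Saaty's random index rather than the Expert Choice software, and the numbers are not the experts' actual judgments.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) over the four
# categories: task related, human/personal, environmental, organizational.
A = np.array([
    [1.0, 1.0, 2.0, 3.0],
    [1.0, 1.0, 1.0, 3.0],
    [0.5, 1.0, 1.0, 2.0],
    [1/3, 1/3, 0.5, 1.0],
])

# Priority vector = normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency ratio: CR = ((lambda_max - n)/(n - 1)) / RI, with RI(4) = 0.90.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.90
print("weights:", np.round(w, 3))
print("consistency ratio:", round(CR, 3))  # acceptable if CR < 0.10
```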
- Date Issued
- 2009
- Identifier
- CFE0002789, ucf:48094
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002789
- Title
- CONTROLLED ASSEMBLY AND ELECTRONIC TRANSPORT STUDIES OF SOLUTION PROCESSED CARBON NANOTUBE DEVICES.
- Creator
-
Stokes, Paul, Khondaker, Saiful I., University of Central Florida
- Abstract / Description
-
Developing techniques for the parallel fabrication of Complementary Metal Oxide Semiconductor (CMOS) compatible single walled carbon nanotube (SWNT) electronic devices is of great importance for nanoelectronic applications. In this thesis, solution processed SWNTs in combination with AC dielectrophoresis (DEP) were utilized to fabricate CMOS compatible SWNT field effect transistors (FETs) and single electron transistors (SETs) with high yield, and their detailed electronic transport properties were studied. Solution processing of SWNTs is attractive not only for the high throughput and parallel manufacturing of SWNT devices but also for the ease of processing at room temperature and compatibility with various substrates. However, it is generally believed that solution processing introduces defects and can degrade electronic transport properties. The results presented in this dissertation show that devices assembled from stable solutions of SWNTs can give rise to high quality FET devices at room temperature and relatively clean SET behavior at low temperature. This is a strong indication that there are no or few intrinsic defects in the SWNTs. The dissertation will also discuss the controlled fabrication of size-tunable SWNT SET devices using a novel mechanical template approach, which offers a route towards the parallel fabrication of room temperature SET devices. The approach is based on the formation of two tunnel barriers created in a SWNT a distance L apart by bending the SWNT at the edge of a local Al/Al2O3 bottom gate. The local gate tunes individual electrons one by one in the device and defines the size of the quantum dot through its width. By tuning both the back gate and the local gate, it is possible to further tune the transparency of the tunnel barriers and the size of the quantum dot. Detailed transport spectroscopy of these devices will be presented.
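For scale, the abstract's link between dot size and device behavior can be made concrete with a textbook estimate: the single-particle level spacing of a nanotube dot of length L is roughly dE = h*v_F/(2L). The Python sketch below uses a literature value for the Fermi velocity and hypothetical dot lengths; it is a back-of-the-envelope estimate, not a calculation from the dissertation.

```python
import scipy.constants as const

v_F = 8.1e5  # Fermi velocity in carbon nanotubes, m/s (literature value)

def level_spacing_eV(length_m):
    # Single-particle level spacing of a SWNT dot: dE = h * v_F / (2 * L).
    return const.h * v_F / (2 * length_m) / const.e

for L_nm in (1000, 100, 20):
    dE_meV = level_spacing_eV(L_nm * 1e-9) * 1e3
    print(f"L = {L_nm:4d} nm -> level spacing ~ {dE_meV:5.1f} meV")

# Clean single-electron behavior needs the addition energy to well exceed
# k_B * T, hence the push toward very short dots for room temperature SETs.
print(f"k_B*T at 300 K ~ {const.k * 300 / const.e * 1e3:.1f} meV")
```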
- Date Issued
- 2010
- Identifier
- CFE0003061, ucf:48310
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003061
- Title
- THE ACQUISITION OF LEXICAL KNOWLEDGE FROM THE WEB FOR ASPECTS OF SEMANTIC INTERPRETATION.
- Creator
-
Schwartz, Hansen, Gomez, Fernando, University of Central Florida
- Abstract / Description
-
This work investigates the effective acquisition of lexical knowledge from the Web to perform semantic interpretation. The Web provides an unprecedented amount of natural language from which to gain knowledge useful for semantic interpretation. The knowledge acquired is described as common sense knowledge, information one uses in his or her daily life to understand language and perception. Novel approaches are presented for both the acquisition of this knowledge and the use of the knowledge in semantic interpretation algorithms. The goal is to increase accuracy over other automatic semantic interpretation systems and, in turn, enable stronger real-world applications such as machine translation, advanced Web search, sentiment analysis, and question answering. The major contributions of this dissertation consist of two methods of acquiring lexical knowledge from the Web, namely a database of common sense knowledge and Web selectors. The first method is a framework for acquiring a database of concept relationships. To acquire this knowledge, relationships between nouns are found on the Web and analyzed over WordNet using information theory, producing information about concepts rather than ambiguous words. For the second contribution, words called Web selectors are retrieved which take the place of an instance of a target word in its local context. The selectors allow the system to learn the types of concepts to which the sense of a target word should be similar. Web selectors are acquired dynamically as part of a semantic interpretation algorithm, while the relationships in the database are useful to stand-alone programs. A final contribution of this dissertation concerns a novel semantic similarity measure and an evaluation of similarity and relatedness measures on tasks of concept similarity. Such tasks are useful when applying acquired knowledge to semantic interpretation. Applications to word sense disambiguation, an aspect of semantic interpretation, are used to evaluate the contributions. Disambiguation systems which utilize semantically annotated training data are considered supervised. The algorithms of this dissertation are considered minimally supervised; they do not require training data created by humans, though they may use human-created data sources. In the case of evaluating a database of common sense knowledge, integrating the knowledge into an existing minimally supervised disambiguation system significantly improved results -- a 20.5% error reduction. Similarly, the Web selectors disambiguation system, which acquires knowledge directly as part of the algorithm, achieved results comparable with top minimally supervised systems, an F-score of 80.2% on a standard noun disambiguation task. This work enables the study of many subsequent related tasks for improving semantic interpretation and its application to real-world technologies. Other aspects of semantic interpretation, such as semantic role labeling, could utilize the same methods presented here for word sense disambiguation. As the Web continues to grow, the capabilities of the systems in this dissertation are expected to increase. Although the Web selectors system achieves great results, a study in this dissertation shows likely improvements from acquiring more data. Furthermore, the methods for acquiring a database of common sense knowledge could be applied in a more exhaustive fashion for other types of common sense knowledge. Finally, perhaps the greatest benefits from this work will come from the enabling of real-world technologies that utilize semantic interpretation.
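A toy version of the Web-selector idea, for intuition only: candidate senses are scored by how well their signature words overlap the selectors found for the target's context. The real system retrieves selectors from Web queries and scores senses with WordNet-based similarity measures; everything below (the selector set, sense signatures, and overlap score) is a hypothetical stand-in.

```python
# Toy word sense disambiguation with "Web selectors" (illustrative only).
# Real selectors come from Web search over the sentence with the target
# word replaced by a wildcard; here they are given directly.
context = "the bank raised its interest rates"
selectors = {"lender", "institution", "firm", "company"}  # hypothetical

# Hypothetical signature words per candidate sense of "bank".
senses = {
    "bank (financial institution)": {"lender", "institution", "money",
                                     "company", "deposit"},
    "bank (river side)": {"shore", "slope", "river", "levee"},
}

def score(signature, selectors):
    # Fraction of selectors that match the sense's signature words.
    return len(signature & selectors) / len(selectors)

for sense in senses:
    print(f"{sense}: {score(senses[sense], selectors):.2f}")
best = max(senses, key=lambda s: score(senses[s], selectors))
print("chosen sense:", best)
```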
- Date Issued
- 2011
- Identifier
- CFE0003688, ucf:48805
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003688
- Title
- FABRICATION OF FUNCTIONAL NANOSTRUCTURES USING POLYELECTROLYTE NANOCOMPOSITES AND REDUCED GRAPHENE OXIDE ASSEMBLIES.
- Creator
-
Chunder, Anindarupa, Zhai, Lei, University of Central Florida
- Abstract / Description
-
A wide variety of nanomaterials, ranging from polymer assemblies to organic and inorganic nanostructures (particles, wires, rods, etc.), have been actively pursued in recent years for various applications. The synthesis of these nanomaterials has been driven through two fundamental approaches: 'Top down' and 'Bottom up'. The key aspect of their application remains the ability to make the nanomaterials suitable for a targeted location by manipulating their structure and functionalizing them with active target groups. Functional nanomaterials like polyelectrolyte-based multilayered thin films, nanofibers, and graphene-based composite materials are highlighted in the current research. Multilayer thin films were fabricated by conventional dip coating and newly developed spray coating techniques. The spray coating technique has the advantage of being applicable to large-scale production as compared to the dip coating technique. Conformal hydrophobic/hydrophilic and superhydrophobic/hydrophilic thermally switchable surfaces were fabricated with multilayer films of poly(allylaminehydrochloride) (PAH) and silica nanoparticles by the dip coating technique, followed by functionalization with the thermosensitive polymer poly(N-isopropylacrylamide) (PNIPAAM) and perfluorosilane. The thermally switchable superhydrophobic/hydrophilic polymer patch was integrated in a microfluidic channel to act as a stop valve. At 70 degrees centigrade, the valve was superhydrophobic and stopped the water flow (closed status), while at room temperature, the patch became hydrophilic and allowed the flow (open status). A spray-coated multilayered film of poly(allylaminehydrochloride) (PAH) and silica nanoparticles was fabricated on a polycarbonate substrate as an anti-reflection (AR) coating. The adhesion between the substrate and the coating was enhanced by treating the polycarbonate surface with aminopropyltrimethoxylsilane (APTS) and sol-gel. The coating was finally made abrasion-resistant with a further sol-gel treatment on top of the AR coating, which formed a hard, scratch-resistant thin film on the coating. The resultant AR coating could reduce the reflection from 5 to 0.3% on plastic. Besides multilayered films, the fabrication of polyelectrolyte-based electrospun nanofibers was also explored. Ultrathin nanofibers comprising two weak polyelectrolytes, poly(acrylic acid) (PAA) and poly(allylaminehydrochloride) (PAH), were fabricated using the electrospinning technique, and methylene blue (MB) was used as a model drug to evaluate the potential application of the fibers for drug delivery. The release of MB was controlled in a nonbuffered medium by changing the pH of the solution. Temperature-controlled release of MB was obtained by depositing temperature-sensitive PAA/poly(N-isopropylacrylamide) (PNIPAAM) multilayers onto the fiber surfaces. The sustained release of MB in a phosphate buffered saline (PBS) solution was achieved by constructing perfluorosilane networks on the fiber surfaces as capping layers. The fiber was also loaded with a real-life antidepressant drug (2,3-tertbutyl-4-methoxyphenol) and the fiber surface was made superhydrophobic. The drug-loaded superhydrophobic nanofiber mat was immersed under water, phosphate buffered saline, and surfactant solutions in three separate experiments. The rate of release of the drug from the fiber surface was monitored as a result of wetting with the different solutions.
Time-dependent wetting of the superhydrophobic surface, and consequently the release of the drug, was studied with different concentrations of surfactant solutions. The results provided important information about the underwater superhydrophobicity and retention time of the drug in the nanofibers. Nanostructured polymers like nanowires, nanoribbons, and nanorods have several other applications as well, based on their structure. Different self-assembled structures of semiconducting polymers showed improved properties based on their architectures. Poly(3-hexylthiophene) (P3HT) supramolecular structures were fabricated on P3HT-dispersed reduced graphene oxide (RGO) nanosheets. P3HT was used to disperse RGO in hot anisole/N,N-dimethylformamide solvents, and the polymer formed nanowires on RGO surfaces through an RGO-induced crystallization process. Raman spectroscopy confirmed the interaction between P3HT and RGO, which allowed the manipulation of the composite's electrical properties. Such a bottom-up approach provided interesting information about graphene-based composites and inspired the study of the interaction between RGO and the molecular semiconductor tetrasulphonate salt of copper phthalocyanine (TSCuPc) for nanometer-scale electronics. The reduction of graphene oxide in the presence of TSCuPc produced a highly stabilized aqueous composite ink with monodispersed graphene sheets. To demonstrate the potential application of the donor (TSCuPc)-acceptor (graphene) composite, the RGO/TSCuPc suspension was successfully incorporated in a thin film device and the optoelectronic property was measured. The conductivity (dark current) of the composite film decreased compared to that of pure graphene due to the donor molecule incorporation, but the photoconductivity and photoresponsivity increased to an appreciable extent. The properties of the composite film overall improved with thermal annealing and optimum loading of TSCuPc molecules.
- Date Issued
- 2010
- Identifier
- CFE0003292, ucf:48509
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003292
- Title
- BIOSIGNAL PROCESSING CHALLENGES IN EMOTION RECOGNITION FOR ADAPTIVE LEARNING.
- Creator
-
Vartak, Aniket, Mikhael, Wasfy, University of Central Florida
- Abstract / Description
-
User-centered computer-based learning is an emerging field of interdisciplinary research. Research in diverse areas such as psychology, computer science, neuroscience, and signal processing is making contributions that promise to take this field to the next level. Learning systems built using contributions from these fields could be used in actual training and education instead of just laboratory proof-of-concept. One of the important advances in this research is the detection and assessment of the cognitive and emotional state of the learner using such systems. This capability moves development beyond the use of traditional user performance metrics to include system intelligence measures that are based on current neuroscience theories. These advances are of paramount importance to the success and widespread use of learning systems that are automated and intelligent. Emotion is considered an important aspect of how learning occurs, and yet estimating it and making adaptive adjustments are not part of most learning systems. In this research we focus on one specific aspect of constructing an adaptive and intelligent learning system: estimation of the emotion of the learner as he/she is using the automated training system. The challenge starts with the definition of emotion and its utility in human life. The next challenge is to measure the co-varying factors of the emotions in a non-invasive way, and to find consistent features from these measures that are valid across a wide population. In this research we use four physiological sensors that are non-invasive, and establish a methodology for utilizing the data from these sensors using different signal processing tools. A validated set of visual stimuli used worldwide in the research of emotion and attention, called the International Affective Picture System (IAPS), is used. A dataset is collected from the sensors in an experiment designed to elicit emotions from these validated visual stimuli. We describe a novel wavelet method to calculate a hemispheric asymmetry metric using electroencephalography data. This method is tested against the typically used power spectral density method. We show overall improvement in accuracy in classifying specific emotions using the novel method. We also show distinctions between different discrete emotions from the autonomic nervous system activity using electrocardiography, electrodermal activity, and pupil diameter changes. Findings from different features from these sensors are used to give guidelines for using each of the individual sensors in the adaptive learning environment.
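As a baseline for the asymmetry metric discussed above, the commonly used power-spectral-density approach can be sketched as follows. This Python example computes a frontal alpha-band asymmetry index on synthetic two-channel data via Welch's method; the sampling rate, band limits, and ln(right) - ln(left) index are standard conventions, not details taken from the dissertation, which proposes a wavelet-based alternative.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256  # Hz, an assumed EEG sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
# Synthetic left/right frontal channels with different alpha (8-13 Hz) power.
left = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
right = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)

def alpha_power(x):
    f, pxx = welch(x, fs=fs, nperseg=512)
    band = (f >= 8) & (f <= 13)
    return trapezoid(pxx[band], f[band])

# Classic frontal asymmetry index: ln(right alpha) - ln(left alpha).
asymmetry = np.log(alpha_power(right)) - np.log(alpha_power(left))
print(f"alpha asymmetry index: {asymmetry:.3f}")
```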
- Date Issued
- 2010
- Identifier
- CFE0003301, ucf:48503
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003301
- Title
- The Design of a Digital Data Acquisition System for Jet Engine Testing.
- Creator
-
Carter, Robert W., Engineering
- Abstract / Description
-
Florida Technological University College of Engineering Thesis; This research report documents the various types of Data Acquisition Systems in use for testing jet aircraft engines. The cost trade-offs and design considerations are explored for systems which employ a digital computer as the prime recording/processing element. The digital computer has revolutionized the data acquisition field, particularly in the testing of high performance jet engines. Test data can be acquired, processed, converted to engineering units, and output via high speed line printers and cathode ray tubes (CRTs). The data acquisition system operates on-line and interleaves the random requests for data from multiple test cells by using a specially designed software system and the multi-processing capability of the high speed digital computer. All test data must be traceable to the National Bureau of Standards, which requires that all calibration standards also be traceable. Primary and secondary calibration methods are discussed, and examples of the mathematical processes for conversion of the raw data to meaningful results are presented. Data Acquisition Systems for jet engine testing can be logically grouped into two main categories, with the determining factor being the type of test to be conducted. Production engine testing requires rapid setup, calibration, and fast data turnaround, particularly for modern automated test facilities. Development engine testing requires a large number of data channels, infrequent setup, and complete software for extensive engine performance calculations. Both types of Data Acquisition Systems have been designed and built by Pratt and Whitney Aircraft and are used as examples of the techniques described in this report.
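The raw-data-to-engineering-units conversion mentioned above typically reduces to evaluating a calibration curve fitted against traceable reference points. The sketch below is a modern Python stand-in with hypothetical counts and pressures, not the actual Pratt and Whitney calibration procedure.

```python
import numpy as np

# Hypothetical calibration of a pressure channel: raw ADC counts recorded
# against traceable reference pressures, then a least-squares polynomial
# converts test-cell counts to engineering units (psia).
cal_counts = np.array([410, 1205, 2010, 2798, 3590])
cal_psia = np.array([5.0, 15.0, 25.0, 35.0, 45.0])

# A first-order fit suits a linear transducer; higher orders can capture
# sensor nonlinearity.
coeffs = np.polyfit(cal_counts, cal_psia, deg=1)

raw = np.array([1640, 2450, 3100])  # counts acquired during an engine run
print("engineering units (psia):", np.round(np.polyval(coeffs, raw), 2))
```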
- Date Issued
- 1973
- Identifier
- CFR0004770, ucf:52982
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0004770
- Title
- More is not always better: Unpacking the cognitive process underlying introspective psychological measurement.
- Creator
-
Lapalme, Matthew, Wang, Wei, Fritzsche, Barbara, Jentsch, Florian, University of Central Florida
- Abstract / Description
-
For decades, psychometricians have measured non-cognitive constructs with little attention paid to the underlying cognitive processes of response. Previous advancement in psychometrics suggests that traditional cognitive-oriented approaches may, in fact, yield construct deficiency and spurious results when applied to non-cognitive measurement. This thesis highlights the importance of specifying an ideal point response process for non-cognitive measurement and empirically demonstrates that an ideal point response process undergirds self-reported personality and attitude measurement. Furthermore, this thesis also advances current understanding of the limitations of ideal point assumptions by exploring the moderating effects of various individual differences in motivation and ability.
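The dominance-versus-ideal-point distinction can be made concrete with the two response functions. In the Python sketch below, the ideal point model uses a simple Gaussian distance kernel as a stand-in for formal models such as the GGUM; all parameters are illustrative.

```python
import numpy as np

# Dominance model (e.g., 2PL): endorsement probability rises monotonically
# with the latent trait theta.
def p_dominance(theta, b, a=1.5):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Ideal point model: endorsement peaks where the person's standing matches
# the item's location and falls off in both directions.
def p_ideal_point(theta, delta, spread=1.0):
    return np.exp(-((theta - delta) ** 2) / (2 * spread ** 2))

for theta in (-2.0, 0.0, 2.0):
    print(f"theta={theta:+.0f}  dominance={p_dominance(theta, b=0.0):.2f}"
          f"  ideal_point={p_ideal_point(theta, delta=0.0):.2f}")
# A respondent far above a moderate item (theta=+2, delta=0) disagrees under
# the ideal point model but agrees under the dominance model -- the kind of
# construct deficiency the thesis highlights.
```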
- Date Issued
- 2015
- Identifier
- CFE0006223, ucf:51074
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006223
- Title
- The impact of a strengths-based group counseling intervention on LGBTQ+ young adults' coping, social support, and coming out growth.
- Creator
-
Ali, Shainna, Lambie, Glenn, Barden, Sejal, Clark, M. H., Vaughan, Michelle, University of Central Florida
- Abstract / Description
-
Lesbian, gay, bisexual, transgender, and queer individuals, and those who otherwise identify as a minority in terms of affectional orientation and gender expression identity (LGBTQ+), have a higher rate of mental health concerns than their heterosexual and cisgender counterparts (Meyer, 2003). Young adulthood is a difficult time for individuals who identify as LGBTQ+, as internal identity development processes coincide with stressors from the outside world. The conflict between intrapersonal and interpersonal pressures may evoke a multitude of negative emotions such as anxiety, loneliness, isolation, fear, anger, resentment, shame, and guilt. One difficult task that triggers these depreciating sentiments is managing the process of coming out during LGBTQ+ young adulthood. The tumultuous, transformative coming out process prompts stressors that may increase mental health concerns for the LGBTQ+ population. Although counselors recognize the need for, and lack of, counselor competency to assist LGBTQ+ individuals, there is limited (a) client-based outcome research and (b) intervention research to assert the efficacy of methods to assist LGBTQ+ young adults during the coming out process. Specifically, no studies were found that examined the efficacy of a group counseling intervention to assist LGBTQ+ young adults through the coming out process. The purpose of this study was to investigate the impact of a strengths-based coming out group counseling intervention on LGBTQ+ young adults' (ages 18-24) levels of coping, appraisal of social support, and coming out growth. In an effort to contribute to the knowledge base in the fields of counseling and counselor education, the researcher examined (a) whether a strengths-based group counseling intervention influences LGBTQ+ young adults' levels of coping (as measured by the Brief COPE [Carver, 1997]), social support (as measured by the Social Support Questionnaire-6 [Sarason, Sarason, Shearin, & Pierce, 1987]), and coming out growth (as measured by the Coming Out Growth Scale [Vaughan & Waehler, 2010]) over time; (b) the potential relationship between the outcome variables and group therapeutic factors (Therapeutic Factors Inventory-Short Form [TFI-S]; Joyce et al., 2011); and (c) the potential relationship between the outcome variables and the participants' demographic data (e.g., age, affectional orientation, level of outness). A one-group, pretest-posttest quasi-experimental design was utilized in this study. Participants received an eight-hour group counseling intervention divided into four two-hour sessions. The counseling groups were offered at the University of Central Florida's Community Counseling and Research Center (CCRC). There were three data collection points: (a) prior to the first session, (b) after the second session, and (c) at the end of the last session. The final sample size included 26 LGBTQ+ participants. The research questions were examined using: (a) Repeated Measures Multivariate Analysis of Variance (RM-MANOVA), (b) MANOVA, (c) canonical correlation, (d) Analysis of Variance (ANOVA), (e) Pearson Product Moment Correlations, and (f) Cronbach's alpha reliability analysis. The RM-MANOVA results identified a multivariate within-subjects effect across time (Wilks' Λ = .15; F(12, 14) = 6.77, p < .001), and 84% of the variance was accounted for by this effect.
Analysis of univariate tests indicated that Social Support Number (F[1.63, 68.18] = 13.94, p < .01; partial η² = .25), Social Support Satisfaction (F[2, 50] = 10.35, p < .001; partial η² = .29), Individualistic Growth (F[2, 50] = 8.22, p < .01; partial η² = .25), and Collectivistic Growth (F[2, 50] = 9.85, p < .001; partial η² = .28) exhibited change over time. Additionally, relationships were identified between the outcome variables of Individualistic Growth, Adaptive Coping, and Collectivistic Growth and the group therapeutic factors of Secure Emotional Expression, Awareness of Relational Impact, and Social Learning. Furthermore, age of questioning was positively correlated with Collectivistic Growth. In addition to a literature review, the research methods and statistical results are provided. Results of the investigation are reviewed and compared to previous research findings. Further, areas for future research, limitations of the study, and implications for counseling and counselor education are presented. Implications of the study's findings include: (a) support for the use of a strengths-based group counseling intervention to increase social support and coming out growth in LGBTQ+ young adults, (b) empirical evidence of a counseling strategy promoting positive therapeutic outcomes with LGBTQ+ college-age clients, and (c) verification of the importance of group therapeutic factors in effective group counseling interventions.
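The within-subjects analysis reported above can be approximated in Python with a univariate repeated-measures ANOVA on synthetic data; this is a hedged sketch only, since the study ran an RM-MANOVA across all outcomes jointly, and none of the numbers below come from its data.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Synthetic pre/mid/post scores for a one-group design (n = 26 matches the
# study's sample size only; values are made up).
rng = np.random.default_rng(2)
n = 26
base = rng.normal(3.0, 0.5, n)
scores = (np.column_stack([base, base + 0.3, base + 0.6]).ravel()
          + rng.normal(0, 0.2, 3 * n))
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 3),
    "time": np.tile(["pre", "mid", "post"], n),
    "social_support": scores,
})

# Univariate repeated-measures ANOVA on one outcome; the dissertation's
# RM-MANOVA tests the outcome variables jointly.
result = AnovaRM(df, depvar="social_support", subject="subject",
                 within=["time"]).fit()
print(result)
```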
- Date Issued
- 2016
- Identifier
- CFE0006066, ucf:50991
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006066
- Title
- An Integrated Framework for Automated Data Collection and Processing for Discrete Event Simulation Models.
- Creator
-
Rodriguez, Carlos, Kincaid, John, Karwowski, Waldemar, O'Neal, Thomas, Kaup, David, Mouloua, Mustapha, University of Central Florida
- Abstract / Description
-
Discrete Event Simulation (DES) is a powerful modeling and analysis tool used in different disciplines. DES models require data in order to determine the different parameters that drive the simulations. The literature about DES input data management indicates that the preparation of the necessary input data is often a highly manual process, which causes inefficiencies, significant time consumption, and a negative user experience. The focus of this research investigation is addressing the manual data collection and processing (MDCAP) problem prevalent in DES projects. This research investigation presents an integrated framework to solve the MDCAP problem by classifying the data needed for DES projects into three generic classes. Such classification permits automating and streamlining the preparation of the data, allowing DES modelers to collect, update, visualize, fit, validate, tally, and test data in real time by performing intuitive actions. In addition to the proposed theoretical framework, this project introduces an innovative user interface that was programmed based on the ideas of the proposed framework. The interface is called DESI, which stands for Discrete Event Simulation Inputs. The proposed integrated framework to automate DES input data preparation was evaluated against benchmark measures presented in the literature in order to show its positive impact on DES input data management. This research investigation demonstrates that the proposed framework, instantiated by the DESI interface, addresses current gaps in the field, reduces the time devoted to input data management within DES projects, and advances the state of the art in DES input data management automation.
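One concrete instance of the fit/validate/test step that the framework automates is distribution fitting for a simulation input. The Python sketch below fits an exponential distribution to hypothetical service times and checks it with a Kolmogorov-Smirnov test; it illustrates the kind of operation DESI streamlines, not DESI itself.

```python
import numpy as np
from scipy import stats

# Hypothetical service-time observations collected for a DES model input.
rng = np.random.default_rng(3)
observed = rng.exponential(scale=4.2, size=200)  # minutes

# Fit a candidate distribution, then test goodness of fit.
loc, scale = stats.expon.fit(observed, floc=0.0)
ks_stat, p_value = stats.kstest(observed, "expon", args=(loc, scale))

print(f"fitted mean service time: {scale:.2f} min")
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
# A large p-value means the exponential fit is not rejected and can drive
# the simulation's random-variate generation.
```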
- Date Issued
- 2015
- Identifier
- CFE0005878, ucf:50861
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005878
- Title
- Signal processing with Fourier analysis, novel algorithms and applications.
- Creator
-
Syed, Alam, Foroosh, Hassan, Sun, Qiyu, Bagci, Ulas, Rahnavard, Nazanin, Atia, George, Katsevich, Alexander, University of Central Florida
- Abstract / Description
-
Fourier analysis is the study of the way general functions may be represented or approximated by sums of simpler trigonometric functions, also analogously known as sinusoidal modeling. The original idea of Fourier had a profound impact on mathematical analysis, physics, and engineering because it diagonalizes time-invariant convolution operators. In the past, signal processing was a topic that stayed almost exclusively in electrical engineering, where only the experts could cancel noise, compress and reconstruct signals. Nowadays it is almost ubiquitous, as everyone now deals with modern digital signals. Medical imaging, wireless communications, and power systems of the future will experience more data processing conditions and a wider range of application requirements than the systems of today. Such systems will require more powerful, efficient, and flexible signal processing algorithms that are well designed to handle such needs. No matter how advanced our hardware technology becomes, we will still need intelligent and efficient algorithms to address the growing demands in signal processing. In this thesis, we investigate novel techniques to solve a suite of four fundamental problems in signal processing that have a wide range of applications. The relevant equations, literature of signal processing applications, analysis, and final numerical algorithms/methods to solve them using Fourier analysis are discussed for different applications in electrical engineering / computer science. The first four chapters cover the following topics of central importance in the field of signal processing:
- Fast Phasor Estimation using Adaptive Signal Processing (Chapter 2)
- Frequency Estimation from Nonuniform Samples (Chapter 3)
- 2D Polar and 3D Spherical Polar Nonuniform Discrete Fourier Transform (Chapter 4)
- Robust 3D registration using Spherical Polar Discrete Fourier Transform and Spherical Harmonics (Chapter 5)
Even though these four methods may seem completely disparate, the underlying motivation for more efficient processing by exploiting the Fourier domain signal structure remains the same. The main contribution of this thesis is the innovation in the analysis, synthesis, and discretization of certain well-known problems like phasor estimation, frequency estimation, computation of a particular non-uniform Fourier transform, and signal registration on the transformed domain. We conduct propositions and evaluations of certain application-relevant algorithms, such as a frequency estimation algorithm using non-uniform sampling, and polar and spherical polar Fourier transforms. The techniques proposed are also useful in the fields of computer vision and medical imaging. From a practical perspective, the proposed algorithms are shown to improve the existing solutions in the respective fields where they are applied/evaluated. The formulation and final proposition are shown to have a variety of benefits. Future work with potential in medical imaging, directional wavelets, volume rendering, video/3D object classification, and high dimensional registration is also discussed in the final chapter. Finally, in the spirit of reproducible research, we release the implementation of these algorithms to the public using GitHub.
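As a taste of the Chapter 2 topic, phasor estimation can be posed as a small least-squares problem: fit A cos(wt) + B sin(wt) to the samples and recover amplitude and phase. The Python sketch below uses a synthetic 60 Hz signal and a batch solver; the dissertation's adaptive estimators are not reproduced here, and the sampling rate is an assumption.

```python
import numpy as np

fs, f0 = 1920.0, 60.0           # assumed sampling rate, nominal frequency
t = np.arange(32) / fs          # one 60 Hz cycle at 32 samples per cycle
rng = np.random.default_rng(4)
x = 120.0 * np.cos(2 * np.pi * f0 * t + 0.5) + rng.normal(0, 1.0, t.size)

# Solve x(t) ~ A*cos(wt) + B*sin(wt) for A and B by least squares.
w = 2 * np.pi * f0
H = np.column_stack([np.cos(w * t), np.sin(w * t)])
(A, B), *_ = np.linalg.lstsq(H, x, rcond=None)

amplitude = np.hypot(A, B)
phase = np.arctan2(-B, A)  # so that x = amplitude * cos(wt + phase)
print(f"amplitude ~ {amplitude:.2f}, phase ~ {phase:.3f} rad")
```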
- Date Issued
- 2017
- Identifier
- CFE0006803, ucf:51775
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006803
- Title
- A Systems Approach to Assessing, Interpreting and Applying Human Error Mishap Data to Mitigate Risk of Future Incidents in a Space Exploration Ground Processing Operations Environment.
- Creator
-
Alexander, Tiffaney, McCauley, Pamela, Rabelo, Luis, Karwowski, Waldemar, Nunez, Jose, University of Central Florida
- Abstract / Description
-
Research results have shown that more than half of aviation, aerospace, and aeronautics mishaps/incidents are attributed to human error. Although many existing incident report systems have been beneficial for identifying engineering failures, most of them are not designed around a theoretical framework of human error, thus failing to address core issues and causes of the mishaps. Therefore, it is imperative to develop a human error assessment framework to identify these causes. This research focused on identifying causes of human error and leading contributors to historical Launch Vehicle Ground Processing Operations mishaps based on past mishaps, near mishaps, and close calls. Three hypotheses were discussed. The first hypothesis addressed the impact Human Factors Analysis and Classification System (HFACS) contributing factors (unsafe acts of operators, preconditions for unsafe acts, unsafe supervision, and/or organizational influences) have on human error events (i.e., mishaps, close calls, incidents, or accidents) in NASA Ground Processing Operations. The second hypothesis focused on determining if the HFACS framework conceptual model could be proven to be a viable analysis and classification system to help classify both latent and active underlying contributors and causes of human error in ground processing operations. Lastly, the third hypothesis focused on determining if a model developed using the Human Error Assessment and Reduction Technique (HEART) could be used as a tool to help determine the probability of human error occurrence in ground processing operations. A model to analyze and classify contributing factors to mishaps or incidents and generate predicted Human Error Probabilities (HEPs) of future occurrence was developed using the HEART and HFACS tools. The research methodology was applied retrospectively to six Ground Processing Operations (GPO) scenarios and 30 years of launch-vehicle-related mishap data. Surveys were used to provide Subject Matter Experts' (SMEs) subjective assessments of the impact Error Producing Conditions (EPCs) had on specific tasks. In this research, a binary logistic regression model was generated that identified the four most significant contributing HFACS human error factors. This model provided predicted probabilities of future occurrence of mishaps when these contributing factors are present. The results showed that the HEART and HFACS methods, when modified, can be used as an analysis tool to identify contributing factors, their impact on human error events, and the potential probability of future human error occurrence. This methodology and framework were validated through consistency and comparison to other related research. This research also provides a contribution methodology for other space operations and similar complex operations to follow. Future research should involve broadening the scope to explore and identify other existing models of human error management systems to integrate into complex space systems beyond what was conducted in this research.
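HEART's core arithmetic multiplies a generic task's nominal error probability by an assessed factor for each Error Producing Condition present: HEP = nominal x product of ((max_effect - 1) x APOA + 1). The Python sketch below applies that standard formula with illustrative numbers; the task type, EPC multipliers, and assessed proportions of affect are hypothetical, not the dissertation's survey values.

```python
# HEART human error probability (HEP) calculation:
#   HEP = nominal_HEP * product over EPCs of ((max_effect - 1) * APOA + 1)
nominal_hep = 0.003  # e.g., a routine, well-practiced task (illustrative)

# (EPC description, maximum multiplier, assessed proportion of affect)
epcs = [
    ("time shortage", 11.0, 0.4),
    ("inexperience", 3.0, 0.2),
    ("poor feedback", 4.0, 0.1),
]

hep = nominal_hep
for name, max_effect, apoa in epcs:
    factor = (max_effect - 1.0) * apoa + 1.0
    hep *= factor
    print(f"{name:15s} assessed factor = {factor:.2f}")

print(f"predicted HEP = {hep:.4f}")
```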
- Date Issued
- 2016
- Identifier
- CFE0006829, ucf:51795
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006829
- Title
- Applied Advanced Error Control Coding for General Purpose Representation and Association Machine Systems.
- Creator
-
Dai, Bowen, Wei, Lei, Lin, Mingjie, Rahnavard, Nazanin, Turgut, Damla, Sun, Qiyu, University of Central Florida
- Abstract / Description
-
The General-Purpose Representation and Association Machine (GPRAM) is proposed with a focus on computations in terms of variation and flexibility, rather than precision and speed. The GPRAM system has a vague representation and has no predefined tasks. With several important lessons learned from error control coding, neuroscience, and the human visual system, we investigate several types of error control codes, including Hamming and Low-Density Parity-Check (LDPC) codes, and extend them in different directions. In error control codes, solely the XOR logic gate is used to connect different nodes. Inspired by bio-systems and Turbo codes, we suggest and study non-linear codes with expanded operations, such as codes including AND and OR gates, which raises the problem of prior-probability mismatching. Prior discussions about critical challenges in designing codes and iterative decoding for non-equiprobable symbols may pave the way for a more comprehensive understanding of bio-signal processing. The limitation of the XOR operation in iterative decoding with non-equiprobable symbols is described and can potentially be resolved by applying a quasi-XOR operation and an intermediate transformation layer. Codes constructed for non-equiprobable symbols with the former approach cannot perform satisfactorily with regard to error correction capability. Probabilistic messages for the sum-product algorithm using XOR, AND, and OR operations with non-equiprobable symbols are further computed. The primary motivation for constructing the codes is to establish the GPRAM system rather than to conduct error control coding per se. The GPRAM system is fundamentally developed by applying various operations with a substantially over-complete basis. This system is capable of continuously achieving better and simpler approximations for complex tasks. The approaches to decoding LDPC codes with non-equiprobable binary symbols are discussed due to the aforementioned prior-probability mismatching problem. The traditional Tanner graph should be modified because of the distinction between message passing to information bits and to parity check bits from check nodes. In other words, the messages passing along the two directions are identical in a conventional Tanner graph, while the messages along the forward direction and backward direction are different in our case. A method of optimizing the signal constellation is described, which is able to maximize the channel mutual information. A simple Image Processing Unit (IPU) structure is proposed for the GPRAM system, to which images are inputted. The IPU consists of a randomly constructed LDPC code, an iterative decoder, a switch, and a scaling and decision device. The quality of the input images has been severely deteriorated for the purpose of mimicking the visual information variability (VIV) experienced in human visual systems. The IPU is capable of (a) reliably recognizing digits from images whose quality is extremely inadequate; (b) achieving hyper-acuity performance similar to that of the human visual system; and (c) significantly improving the recognition rate by applying a randomly constructed LDPC code that is not specifically optimized for the tasks.
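For the equiprobable baseline case, the sum-product check-node update for an XOR (parity) constraint follows the tanh rule. The Python sketch below implements that standard update; handling non-equiprobable symbols and AND/OR nodes, the dissertation's actual subject, requires the modified messages it derives and is not shown here.

```python
import numpy as np

# Sum-product check-node update for an XOR (parity) constraint: the
# outgoing log-likelihood ratio on each edge combines all other incoming
# LLRs via the tanh rule, tanh(L_out/2) = prod_i tanh(L_i/2).
def check_node_llrs(incoming):
    incoming = np.asarray(incoming, dtype=float)
    out = np.empty_like(incoming)
    for k in range(incoming.size):
        others = np.delete(incoming, k)
        t = np.prod(np.tanh(others / 2.0))
        # Clip to keep arctanh finite when the product saturates.
        out[k] = 2.0 * np.arctanh(np.clip(t, -0.999999, 0.999999))
    return out

# Three variable nodes attached to one parity check (illustrative LLRs;
# positive means "bit is 0" is more likely).
print(np.round(check_node_llrs([1.2, -0.8, 2.5]), 3))
```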
- Date Issued
- 2016
- Identifier
- CFE0006449, ucf:51413
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006449
- Title
- Quality by Design Procedure for Continuous Pharmaceutical Manufacturing: An Integrated Flowsheet Model Approach.
- Creator
-
Vezina, Ashley, Elshennawy, Ahmad, Rabelo, Luis, Karwowski, Waldemar, University of Central Florida
- Abstract / Description
-
Pharmaceutical manufacturing is crucial to global healthcare and requires a higher, more consistent level of quality than any other industry. Yet traditional pharmaceutical batch manufacturing has remained largely unchanged in the last fifty years due to high R&D costs, shorter patent durations, and regulatory uncertainty. This has led regulatory bodies to promote modernization of manufacturing processes toward continuous pharmaceutical manufacturing (CPM) by introducing new methodologies including quality by design (QbD), design space, and process analytical technology (PAT). This represents a shift away from the traditional pharmaceutical manufacturing way of thinking toward a risk-based approach that promotes increased product and process knowledge through a data-rich environment. While both the literature and regulatory bodies acknowledge the need for modernization, manufacturers have been slow to modernize due to uncertainty and lack of confidence in the application of these methodologies. This paper describes the current applications of QbD principles and the current regulatory environment, leveraging regulatory guidelines and CPM literature to identify gaps in the literature. To help close the gap between QbD theory and QbD application, a QbD algorithm for CPM using an integrated flowsheet model is also developed and analyzed. This will help increase manufacturing confidence in CPM by providing answers to questions about the CPM business case, applications of QbD tools, process validation and sensitivity, and process and equipment characteristics. An integrated flowsheet model will aid in the decision-making process and process optimization, breaking away from the ex silico methods extensively covered in the literature.
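As a rough illustration of what an integrated flowsheet calculation involves, the sketch below runs a toy steady-state mass balance for a continuous direct-compression line and checks the blended API fraction against an assumed design-space window; the stream rates, compositions, and limits are hypothetical, not values from the dissertation.

```python
# Illustrative sketch only: toy flowsheet step for a continuous line
# (feeders -> blender), checking the blend against an assumed design space.

def blend(streams):
    """Steady-state mass balance over a continuous blender.
    streams: list of (mass_flow_kg_h, api_mass_fraction)."""
    total = sum(f for f, _ in streams)
    api = sum(f * x for f, x in streams)
    return total, api / total

feeders = [(18.0, 0.0),   # excipient feeder: 18 kg/h, no API (assumed)
           (2.0, 1.0)]    # API feeder: 2 kg/h, pure API (assumed)

flow, x_api = blend(feeders)
lo, hi = 0.095, 0.105     # hypothetical design-space window for API fraction
status = "inside" if lo <= x_api <= hi else "OUTSIDE"
print(f"Blender outlet: {flow:.1f} kg/h, API fraction {x_api:.3f} ({status} design space)")
```

A real flowsheet model chains many such unit-operation blocks with dynamics and disturbances; the point here is only the structure of propagating material attributes through connected units and testing them against design-space limits.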
- Date Issued
- 2017
- Identifier
- CFE0006923, ucf:51683
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006923
- Title
- Brain stethoscope: A non-invasive method for monitoring intracranial pressure.
- Creator
-
Azad, Md Khurshidul, Mansy, Hansen, Kassab, Alain, Bhattacharya, Samik, University of Central Florida
- Abstract / Description
-
Monitoring intracranial pressure (ICP) is important for patients with increased intracranial pressure. Invasive methods of ICP monitoring include lumbar puncture manometry, which requires high precision, is costly, and can lead to complications. Non-invasive monitoring of ICP using tympanic membrane pulse (TMp) measurement can provide an alternative that avoids such complications. In the current study, a piezo-based sensor was designed, constructed, and used to acquire TMp signals. The results showed that the tympanic membrane waveform changed in morphology and amplitude with increased ICP, which was induced by changing subject position on a tilt table. In addition, the results suggest that the TMp signal is affected by breathing, which has small effects on ICP. The newly developed piezo-based brain stethoscope may offer a way to monitor patients with increased intracranial pressure, avoiding invasive ICP monitoring and reducing the associated risk and cost.
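To make the signal-processing side concrete, here is a minimal sketch, on synthetic data, of band-pass filtering a TMp recording and comparing pulse amplitude across two tilt positions; the sampling rate, filter band, and waveforms are assumptions, and the dissertation's actual acquisition chain may differ.

```python
# Hedged sketch: band-pass filtering a tympanic membrane pulse (TMp)
# signal and comparing peak-to-peak amplitude between tilt positions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0  # assumed sampling rate, Hz

def bandpass(x, lo=0.5, hi=20.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass to isolate the pulse band."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def pulse_amplitude(x):
    """Peak-to-peak amplitude of the filtered TMp waveform."""
    return float(np.max(x) - np.min(x))

t = np.arange(0, 10, 1 / FS)
rng = np.random.default_rng(0)
# Synthetic stand-ins: a ~1.2 Hz cardiac pulse plus noise, with the
# head-down recording given a larger amplitude to mimic raised ICP.
supine    = 1.0 * np.sin(2 * np.pi * 1.2 * t) + 0.2 * rng.standard_normal(t.size)
head_down = 1.6 * np.sin(2 * np.pi * 1.2 * t) + 0.2 * rng.standard_normal(t.size)

for name, sig in [("supine", supine), ("head-down tilt", head_down)]:
    print(name, round(pulse_amplitude(bandpass(sig)), 2))
```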
- Date Issued
- 2018
- Identifier
- CFE0006972, ucf:51643
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006972
- Title
- Preschool Attendance: A Parental and Teacher Perspective of Barriers, Health Behaviors and Practices using Grounded Theory Research.
- Creator
-
Meoli, Anne, Chase, Susan, Anderson, Mindi, Quelly, Susan, Wink, Diane, Sheinberg, Nurit, University of Central Florida
- Abstract / Description
-
Background: Preschool children from single-parent households with lower socioeconomic status (SES) are absent from preschool at rates higher than any other group. Some children are chronically absent, missing more than 10% of the school year. The phenomenon of preschool attendance as it relates to behaviors, practices, and parental decision making associated with health and illness in lower-SES households has not previously been studied using grounded theory methodology.
Aim: The purpose of this study was to explore decision making related to supporting attendance in a South Florida preschool of 67 children (aged 3 to 4 years), working with primarily low-income single parents and preschool teachers. Examining the decision making process parents and teachers face every day, together with the environmental supports of preschool attendance, facilitated identification of factors encouraging or impeding attendance.
Results and Recommendations: Focus groups and interviews with teachers, parents, and administrators were conducted, and the school attendance process and health/attendance policies were examined through direct observation. Data analysis was concurrent with data collection to allow for theoretical sampling. The analysis revealed an underlying process of "communicating about health: benefitting children's attendance in a preschool environment." Supporting this theory were three themes: (a) empowerment: actions to support health, (b) trusting judgment regarding health, and (c) commitment of the organization and parents to health and attendance. Recommendations for practice implementation, policy changes, and opportunities for future research in this unique setting are discussed to improve attendance.
- Date Issued
- 2016
- Identifier
- CFE0006143, ucf:51186
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006143
- Title
- Defining a Stakeholder-Relative Model to Measure Academic Department Efficiency at Achieving Quality in Higher Education.
- Creator
-
Robinson, Federica, Sepulveda, Jose, Reilly, Charles, Nazzal, Dima, Armacost, Robert, Feldheim, Mary, University of Central Florida
- Abstract / Description
-
In a time of strained resources and dynamic environments, the importance of effective and efficient systems is critical. This dissertation addresses the need to use feedback from multiple stakeholder groups to define quality and to assess an entity's efficiency at achieving that quality. A decision support model with applicability to diverse domains is introduced to outline the approach. Three phases capture the essence of the process and delineate the tool applied at each step: (1) quality model development, (2) input-output selection, and (3) relative efficiency assessment. The decision support model was adapted in higher education to assess academic departmental efficiency at achieving stakeholder-relative quality. Phase 1 was accomplished through a three-round, Delphi-like study involving user group refinement. The results were compared against the criteria of an engineering accreditation body (ABET) to support the model's validity for capturing quality in the College of Engineering & Computer Science, its departments, and its programs. In Phase 2, the Analytic Hierarchy Process (AHP) was applied to the validated model to quantify the perspectives of students, administrators, faculty, and employers (SAFE). Using the composite preferences of the collective group (n=74), the model was limited to the top seven attributes, which accounted for about 55% of total preferences. Data corresponding to the resulting variables, referred to as key performance indicators, were collected from various information sources and used in the data envelopment analysis (DEA) methodology (Phase 3). This process revealed both efficient and inefficient departments while offering transparency about opportunities to maximize quality outputs. The findings validate the potential of the Delphi-like, analytic hierarchical, data envelopment analysis approach for administrative decision making in higher education. However, more meaningful metrics and data are required to adapt the model for decision-making purposes. Several recommendations are included to improve the usability of the decision support model, and future research opportunities are identified to extend the analyses and apply the model to other areas.
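As an illustration of the Phase 2 weighting step, the sketch below derives AHP priority weights and a consistency ratio from a small pairwise comparison matrix using the standard eigenvector method; the 3x3 matrix is hypothetical (the study elicited preferences over many more attributes from SAFE stakeholders).

```python
# Illustrative AHP sketch: priority weights from a pairwise comparison
# matrix via its principal eigenvector, plus Saaty's consistency ratio.
import numpy as np

# Hypothetical judgments: attribute 1 is moderately-to-strongly preferred.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)            # principal (Perron) eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                        # normalized priority weights

n = A.shape[0]
CI = (vals[k].real - n) / (n - 1)   # consistency index
CR = CI / 0.58                      # 0.58 = standard random index for n = 3
print("weights:", np.round(w, 3), "consistency ratio:", round(CR, 3))
```

The resulting weights would then feed Phase 3 as relative importances when choosing DEA inputs and outputs; judgments with a consistency ratio above about 0.10 are conventionally revisited before use.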
- Date Issued
- 2013
- Identifier
- CFE0004921, ucf:49636
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004921
- Title
- Analytical study of computer vision-based pavement crack quantification using machine learning techniques.
- Creator
-
Mokhtari, Soroush, Yun, Hae-Bum, Nam, Boo Hyun, Catbas, Necati, Shah, Mubarak, Xanthopoulos, Petros, University of Central Florida
- Abstract / Description
-
Image-based techniques are a promising non-destructive approach to road pavement condition evaluation. The main objective of this study is to extract, quantify, and evaluate important surface defects, such as cracks, using an automated computer vision-based system to provide a better understanding of the pavement deterioration process. To achieve this objective, automated crack-recognition software was developed, employing a series of image processing algorithms for crack extraction, crack grouping, and crack detection. A bottom-hat morphological technique was used to remove the random background of pavement images and to extract cracks selectively based on their shapes, sizes, and intensities, using a relatively small number of user-defined parameters. A technical challenge with crack extraction algorithms, including the bottom-hat transform, is that extracted crack pixels are usually fragmented along crack paths. To de-fragment these crack pixels, a novel crack-grouping algorithm, called MorphLink-C, is proposed as an image segmentation method. Statistical validation of this method using flexible pavement images indicated that MorphLink-C not only improves crack-detection accuracy but also reduces crack-detection time.
Crack characterization was performed by analyzing image features of the extracted crack components. A comprehensive statistical analysis was conducted using filter feature subset selection (FSS) methods, including Fisher score, Gini index, information gain, ReliefF, mRMR, and FCBF, to understand the statistical characteristics of cracks at different deterioration stages. The statistical significance of crack features was ranked based on their relevancy and redundancy. The statistical method used in this study can be employed to avoid subjective crack rating based on human visual inspection. Moreover, the statistical information can serve as fundamental data to justify rehabilitation policies in pavement maintenance.
Finally, the application of four classification algorithms, Artificial Neural Network (ANN), Decision Tree (DT), k-Nearest Neighbours (kNN), and Adaptive Neuro-Fuzzy Inference System (ANFIS), is investigated for the crack detection framework. The classifiers were evaluated on five criteria: 1) prediction performance; 2) computation time; 3) stability of results for highly imbalanced datasets, in which the number of crack objects is significantly smaller than the number of non-crack objects; 4) stability of classifier performance for pavements at different deterioration stages; and 5) interpretability of results and clarity of the procedure. The comparison results indicate the advantages of white-box classification methods for computer vision-based pavement evaluation. Although black-box methods such as ANN provide superior classification performance, white-box methods such as ANFIS provide useful information about the logic of classification and the effect of feature values on detection results. Such information can provide further insight for image-based pavement crack detection applications.
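A minimal sketch of the crack-extraction step is shown below, using OpenCV's black top-hat (bottom-hat) transform followed by thresholding; the kernel size and threshold are assumed tuning parameters rather than the dissertation's values, and the MorphLink-C grouping stage is not reproduced here.

```python
# Hedged sketch: bottom-hat (black top-hat) crack extraction. The
# MORPH_BLACKHAT transform highlights dark, thin structures such as
# cracks against a lighter pavement background.
import cv2
import numpy as np

def extract_cracks(gray, kernel_size=15, thresh=20):
    """Return a binary crack mask from a grayscale pavement image."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE,
                                       (kernel_size, kernel_size))
    bottom_hat = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT, kernel)
    _, mask = cv2.threshold(bottom_hat, thresh, 255, cv2.THRESH_BINARY)
    return mask

# Usage (path is a placeholder):
#   img = cv2.imread("pavement.jpg", cv2.IMREAD_GRAYSCALE)
#   mask = extract_cracks(img)
```

In a full pipeline the resulting mask would still be fragmented along crack paths, which is exactly the problem the grouping stage addresses before feature extraction and classification.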
- Date Issued
- 2015
- Identifier
- CFE0005671, ucf:50186
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005671
- Title
- A Multimedia Approach to Game-Based Training: Exploring the Effects of the Modality and Temporal Contiguity Principles on Learning in a Virtual Environment.
- Creator
-
Serge, Stephen, Mouloua, Mustapha, Bohil, Corey, Bowers, Clint, Priest Walker, Heather, University of Central Florida
- Abstract / Description
-
There is increasing interest in using video games to deliver training to individuals learning new skills or tasks. However, current research lacks a clear method for developing effective instructional material when games are used as training tools, and for explaining how gameplay may affect learning. The literature contains multiple approaches to training and game-based training (GBT) but generally lacks a foundational, theoretically grounded account of how people learn specifically from video games and how to design instructional guidance within these gaming environments. This study investigated instructional delivery within GBT. Video games are a form of multimedia, consisting of both imagery and sounds. The Cognitive Theory of Multimedia Learning (CTML; Mayer, 2005) explicitly describes how people learn from multimedia information consisting of a combination of narration (words) and animation (pictures). This study empirically examined the effects of the modality and temporal contiguity principles on learning in a game-based virtual environment. Based on these principles, it was hypothesized that receiving either voice or embedded training would result in better performance on learning measures, and that receiving a combination of voice and embedded training would lead to better performance on learning measures than all other instructional conditions.
A total of 128 participants received training on the role and procedures of the combat lifesaver, a non-medical soldier who receives additional training on combat-relevant lifesaving medical procedures. Training sessions involved an instructional presentation manipulated along the modality (voice or text) and temporal contiguity (embedded in the game or presented before gameplay) principles. Instructional delivery was manipulated in a 2x2 between-subjects design with four instructional conditions: Upfront-Voice, Upfront-Text, Embedded-Voice, and Embedded-Text. Results indicated that: (1) upfront instruction led to significantly better retention performance than embedded instruction, regardless of delivery modality; (2) voice-based instruction led to better transfer performance than text-based instruction, regardless of presentation timing; (3) no differences in performance were observed on the simple application test between any instructional conditions; and (4) a significant modality-by-temporal-contiguity interaction was obtained. Simple effects analysis indicated differing effects of modality within the embedded instruction group, with voice recipients performing better than text recipients (p = .012). Individual group comparisons revealed that the upfront-voice group performed better on retention than both embedded groups (p = .006), the embedded-voice group performed better on transfer than the upfront-text group (p = .002), and the embedded-voice group performed better on the complex application test than the embedded-text group (p = .012). The findings indicate partial support for applying the modality and temporal contiguity principles of CTML to interactive GBT. Combining gameplay (i.e., practice) with instructional presentation both helps and hinders working memory's ability to process information. The findings also suggest that extending CTML to game-based training may fundamentally change how a person processes information as a function of the specific type of knowledge being taught.
The results will drive future systematic research to test and determine the most effective means of designing instruction for interactive GBT. Further theoretical and practical implications are discussed.
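For readers unfamiliar with the 2x2 between-subjects analysis implied here, the sketch below runs a two-way ANOVA with the modality-by-temporal-contiguity interaction on synthetic retention scores; the cell means and effect sizes are invented purely so the example executes and do not reflect the study's data.

```python
# Illustrative sketch: 2x2 between-subjects ANOVA (modality x timing)
# on a synthetic retention score, mirroring the design described above.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(1)
rows = []
for modality in ("voice", "text"):
    for timing in ("upfront", "embedded"):
        # Invented cell means: small advantages for upfront and voice.
        base = 70 + (5 if timing == "upfront" else 0) \
                  + (3 if modality == "voice" else 0)
        for score in rng.normal(base, 8, size=32):   # 32 per cell, N = 128
            rows.append((modality, timing, score))
df = pd.DataFrame(rows, columns=["modality", "timing", "retention"])

# Two-way ANOVA including the modality-by-timing interaction term.
model = ols("retention ~ C(modality) * C(timing)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```

A significant interaction term in such a table is what licenses the simple-effects follow-ups reported in the abstract (e.g., voice versus text within the embedded group).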
- Date Issued
- 2014
- Identifier
- CFE0005548, ucf:50271
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005548