-
-
Title
-
The Diffusion of Digital Dashboards: An Examination of Dashboard Utilization and the Managerial Decision Environment.
-
Creator
-
Reinking, Jeffrey, Arnold, Vicky, Roberts, Robin, Sutton, Steven, Hampton, Clark, University of Central Florida
-
Abstract / Description
-
This dissertation consists of three related studies examining the diffusion of digital dashboard technology throughout today's organizations. Dashboards, once reserved for the executive level, are now available to managers at the lower levels of the organization. For these managers, dashboards have become an integral part of their work life to support their decision environment, to provide consistency in measures, to monitor performance, and to communicate information throughout the organization. Prior research in the practice literature has shown that dashboards improve managerial and organizational performance and communicate organizational goals and objectives; however, empirical research has not been conducted to confirm this anecdotal evidence. Using three theories, the phenomena surrounding the diffusion of dashboards to the lower levels of the organization are examined based on 1) dashboards as a source of interactive management control and strategy alignment, 2) the impact of dashboard quality on strategy alignment, the decision environment, and performance, and 3) the impacts on dashboard utilization of the antecedents of information content and task uncertainty and the consequences of user satisfaction and managerial performance. The first study investigates why dashboards have diffused to the lower levels of today's organizations. Its primary focus is to develop an understanding of the extent of dashboard utilization by decision-makers and the antecedents and consequences of utilization that are responsible for the widespread acceptance of this technology. The data for this study are collected and analyzed through an explanatory cross-sectional field study utilizing a semi-structured questionnaire.
Using data from interviews with 27 managers, a framework is developed indicating that strategy alignment and dashboards associated with interactive management control are the primary antecedents that drive dashboard diffusion. The dimensions of dashboard system quality and dashboard information quality mediate the relationship between an interactive dashboard and the extent of dashboard utilization, which leads to higher levels of managerial and organizational performance. This study contributes to the dashboard, strategy, and MCS literature by revealing that dashboards are not isolated technologies; rather, they play an important role in the execution of strategy at the operational levels of an organization. In addition, dashboards can also function as an interactive management control, which leads to high levels of diffusion of dashboards throughout organizations. Prior strategy literature has examined strategy alignment at the higher levels, and this study extends that research stream by investigating strategy alignment at the lower operational levels of the organization. The second study utilizes the IS Success Model to explore the impacts of the antecedents of dashboard system quality and dashboard information quality on the managerial decision environment, in addition to the resulting consequences, or 'net benefits', of managerial and organizational performance. A field survey is used to collect data from 391 dashboard-using managers to enable the analysis of the relationships predicted in the theoretical model, which is analyzed using partial least squares (PLS). The results show that two dimensions of dashboard quality, system flexibility and information currency, have a positive effect on the managerial decision environment. The model indicates support for the consequences of managerial and organizational performance resulting from higher levels of decision quality in the managerial decision environment.
The model also reveals that when the dashboard measures are strategy aligned, lower levels of dashboard system flexibility are associated with an improved managerial decision environment. Therefore, when organizations design their dashboard systems to support strategy alignment, managers should not be afforded high levels of system flexibility, so that their attention stays on the key performance indicators selected to align with strategy. This result is a primary contribution to the strategy literature, revealing that strategy-aligned dashboards are more effective in environments where dashboard flexibility is lower. Additionally, study two also extends the strategy literature by examining strategy alignment at the lower levels of the organization, since prior research has concentrated on higher-level strategic outcomes. As dashboards become highly diffused and more managers utilize the technology, the likelihood increases that dashboard designers cannot provide dashboard content that fits the tasks performed by managers. The third study investigates this fit between dashboard information content and task uncertainty to understand whether the fit between technology and task impacts the extent of dashboard utilization by managers, based on the theory of task-technology fit (TTF). TTF predicts that higher levels of utilization will increase user satisfaction and managerial performance. Data are collected from 391 managers who utilize dashboards in their weekly work life to analyze the relationships predicted in the theoretical model. PLS is utilized to analyze the theoretical model and indicates weak support for TTF impacting the extent of dashboard utilization. The model supports the hypotheses for the links between the extent of dashboard utilization and user satisfaction and managerial performance. Based on the weak findings from this theoretical model, a second model is developed and analyzed.
The second model measures TTF through the mediation of task uncertainty between dashboard information content and the extent of dashboard utilization, whereas the first model measured TTF through the interaction of task uncertainty and dashboard information content. The results of the second model show strong support that TTF, as measured through mediation, increases the extent of dashboard utilization. This study contributes to the literature by empirically showing that more extensive dashboard utilization is achieved through the antecedent of TTF, resulting in increased managerial satisfaction and managerial performance.
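The contrast between the two model specifications (an interaction/moderation term versus a mediated a-path/b-path effect) can be illustrated with a small synthetic example. The sketch below is not the dissertation's data or its PLS model; it uses ordinary least squares on simulated variables (the variable names are chosen for illustration) simply to show how a moderation coefficient and a mediated indirect effect are each estimated.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
content = rng.normal(size=n)                      # dashboard information content
# Mediation-style data: content reduces task uncertainty,
# and lower uncertainty raises dashboard utilization.
uncertainty = -0.6 * content + rng.normal(size=n)
utilization = -0.5 * uncertainty + rng.normal(size=n)

def ols(cols, y):
    """Least-squares fit with an intercept; returns coefficients."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Model 1 (moderation): utilization ~ content + uncertainty + content*uncertainty
b_mod = ols([content, uncertainty, content * uncertainty], utilization)
interaction_effect = b_mod[3]          # ~0 here: the data contain no moderation

# Model 2 (mediation): a-path (content -> uncertainty),
# b-path (uncertainty -> utilization, controlling for content)
a_path = ols([content], uncertainty)[1]
b_path = ols([content, uncertainty], utilization)[2]
indirect_effect = a_path * b_path      # ~(-0.6)*(-0.5) = 0.3
```

With data generated this way, the interaction model finds essentially nothing while the mediation model recovers a clear indirect effect, mirroring the pattern of weak interaction support versus strong mediation support reported in the abstract.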
-
Date Issued
-
2013
-
Identifier
-
CFE0005052, ucf:49969
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005052
-
-
Title
-
Injection-Locked Vertical Cavity Surface Emitting Lasers (VCSELs) for Optical Arbitrary Waveform Generation.
-
Creator
-
Bhooplapur, Sharad, Delfyett, Peter, Li, Guifang, Christodoulides, Demetrios, Malocha, Donald, University of Central Florida
-
Abstract / Description
-
Complex optical pulse shapes are typically generated from ultrashort laser pulses by manipulating the optical spectrum of the input pulses. This generates complex but periodic time-domain waveforms. Optical Arbitrary Waveform Generation (OAWG) builds on the techniques of ultrashort pulse-shaping, with the goal of making non-periodic, truly arbitrary optical waveforms. Some applications of OAWG are coherently controlling chemical reactions on a femtosecond time scale, improving the performance of LADAR systems, high-capacity optical telecommunications, and ultra-wideband signal processing. In this work, an array of Vertical Cavity Surface Emitting Lasers (VCSELs) is used as a modulator, by injection-locking each VCSEL to an individual combline from an optical frequency comb source. Injection-locking ensures that each VCSEL's emission is phase coherent with the input combline, and modulating its current modulates mainly the output optical phase. The multi-GHz modulation bandwidth of VCSELs updates the output optical pulse shape on a pulse-to-pulse time scale, which is an important step towards true OAWG. In comparison, it is about a million times faster than the liquid-crystal modulator arrays typically used for pulse shaping. Novel components and subsystems of OAWG are developed and demonstrated in this work. They include:
1. Modulators: An array of VCSELs is packaged and characterized for use as a modulator for rapid-update pulse-shaping at GHz rates. The amplitude and phase modulation characteristics of an injection-locked VCSEL are simultaneously measured at GHz modulation rates.
2. Optical frequency comb sources: An actively mode-locked semiconductor laser was assembled, with a 12.5 GHz repetition rate, ~200 individually resolvable comblines directly out of the laser, and high frequency stability. In addition, optical frequency comb sources are generated by modulation of a single-frequency laser.
3. High-resolution optical spectral demultiplexers: The demultiplexers are implemented using bulk optics and are used to spatially resolve individual optical comblines onto the modulator array.
4. Optical waveform measurement techniques: Several techniques are used to measure generated waveforms, especially for spectral phase measurements, including multi-heterodyne phase retrieval. In addition, an architecture for discriminating between ultrashort encoded optical pulses with record high sensitivity is demonstrated.
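As a rough back-of-the-envelope check (not taken from the dissertation), the comb parameters quoted above fix the total optical bandwidth as line spacing times line count; assuming a telecom-band center wavelength of 1550 nm (an assumption, not stated in the abstract), that bandwidth corresponds to roughly 20 nm of spectrum:

```python
# Order-of-magnitude check of the comb parameters quoted above.
# The 1550 nm center wavelength is an assumption (telecom band).
c = 299_792_458.0             # speed of light, m/s
line_spacing = 12.5e9         # comb repetition rate, Hz
n_lines = 200                 # individually resolvable comblines
span_hz = line_spacing * n_lines        # total optical bandwidth: 2.5 THz
lam = 1550e-9                           # assumed center wavelength, m
span_nm = span_hz * lam**2 / c * 1e9    # same bandwidth in wavelength terms
```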
-
Date Issued
-
2014
-
Identifier
-
CFE0005466, ucf:50402
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005466
-
-
Title
-
Development of the Taiwanese Mandarin Main Concept Analysis and Linguistic Communication Measure: Normative and Preliminary Aphasic Data.
-
Creator
-
Yeh, Chun-chih, Kong, Pak Hin, Whiteside, Janet, Ryalls, John, University of Central Florida
-
Abstract / Description
-
Aphasia is a language disorder resulting from damage to brain areas that control language expression and reception. Clinically, the narrative production of Persons with Aphasia (PWA) provides valuable information for the diagnosis of aphasia. There are several types of assessment procedures for the analysis of aphasic narrative production. One of them is to use quantification systems, such as the Cantonese Linguistic Communication Measure (CLCM; Kong & Law, 2004) or the Main Concept Analysis (MCA; Kong, 2009), for objective quantification of aphasic discourse. The purposes of this study are (1) to translate the MCA and CLCM into a Taiwanese Mandarin Main Concept Analysis (TM-MCA) and a Taiwanese Mandarin Linguistic Communication Measure (TM-LCM), respectively, and (2) to validate them based on normal speakers and PWA in Taiwan. In the pilot study, a total of sixteen participants, eight certified speech-language pathologists (SLPs) and eight normal speakers, were invited to establish the Taiwanese Mandarin main concepts related to the four sets of sequential pictures created by Kong in 2009. The language samples from the eight normal speakers were then used to determine the informative words (i-words) in the picture sets. In the main study, thirty-six normal speakers and ten PWA were recruited to perform the same picture description tasks. The elicited language samples were analyzed using both the TM-MCA and TM-LCM. The results suggested that both age and education affected oral discourse performance. Significant differences in the TM-MCA measures and TM-LCM indices were noted between the normal and aphasic groups. It was also found that overall aphasia severity affected the picture description performances of PWA. Finally, significant correlations between some of the TM-MCA measures and TM-LCM indices were noted. In conclusion, both the TM-MCA and TM-LCM are culturally appropriate for the Taiwanese Mandarin population. They can be used to supplement standardized aphasia tests to help clinicians make more informed decisions, not only about diagnosis but also about treatment planning for aphasia.
-
Date Issued
-
2014
-
Identifier
-
CFE0005281, ucf:50554
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005281
-
-
Title
-
PREDICTING SCIENCE LITERACY AND SCIENCE APPRECIATION.
-
Creator
-
Hellmuth, Robert, Negy, Charles, University of Central Florida
-
Abstract / Description
-
Research has shown that the benefits of having a populace literate in science are great. Even if citizens are not literate in basic science, it is important, for many reasons, that they still appreciate science and those with expertise in the field. Recent research suggests that the United States (U.S.) has lower levels of science literacy than it should. Evidence may also suggest that many U.S. citizens are not appreciative of science. Overall, little research has been conducted on what may predict science literacy and science appreciation, which is the aim of this research. Specifically, I have examined socio-personal variables, beliefs, thought paradigms, and various demographic variables that may be predictive of science literacy and science appreciation. Results indicated that scriptural literalism, religiosity, and magical ideation were predictive of low levels of science literacy. In addition, predictors of low levels of science appreciation included scriptural literalism and magical ideation. Implications of the findings are discussed.
-
Date Issued
-
2014
-
Identifier
-
CFH0004685, ucf:45240
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH0004685
-
-
Title
-
A Methodology on Weapon Combat Effectiveness Analytics using Big Data and Live, Virtual, or/and Constructive Simulations.
-
Creator
-
Jung, Won Il, Lee, Gene, Rabelo, Luis, Elshennawy, Ahmad, Ahmad, Ali, University of Central Florida
-
Abstract / Description
-
Weapon Combat Effectiveness (WCE) analytics is very expensive, time-consuming, and dangerous in the real world, because data must be created from real operations involving many people and weapons in the actual environment. Modeling and Simulation (M&S) techniques are used to overcome these limitations. Although the era of big data has emerged and achieved a great deal of success in a variety of fields, most WCE research using Defense Modeling and Simulation (DM&S) techniques has been conducted without the help of big data technologies and techniques. The existing research has not considered the various factors affecting WCE, because it has been restricted to constructive simulation, a single weapon system, and limited scenarios. Therefore, WCE analytics using existing methodologies has incorporated the same limitations and cannot help but produce biased results. To solve this problem, this dissertation first reviews and composes the basic knowledge for a new WCE analytics methodology using big data and DM&S, to serve as a stepping-stone for future research by interested researchers. It then presents a new methodology for WCE analytics using big data generated by Live, Virtual, or/and Constructive (LVC) simulations. This methodology can increase the fidelity of WCE analytics results by considering various factors. It can create opportunities for application to weapon acquisition, operations analytics and planning, and the development of objective levels for each training factor for weapon operators, according to the selection of Measures of Effectiveness (MOEs) and Measures of Performance (MOPs), or impact factors, based on the analytics goal.
-
Date Issued
-
2018
-
Identifier
-
CFE0007025, ucf:52870
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007025
-
-
Title
-
A laser radar employing linearly chirped pulses from a mode-locked laser for long range, unambiguous, sub-millimeter resolution ranging and velocimetry.
-
Creator
-
Piracha, Mohammad Umar, University of Central Florida
-
Abstract / Description
-
Light detection and ranging (lidar) is used for various applications such as remote sensing, altimetry, and imaging. In this work, a linearly chirped pulse source is introduced that generates wavelength-swept pulses exhibiting ~6 nm of optical bandwidth with > 20 km coherence length. The chirped pulses are used in an interferometric lidar setup to perform distance measurements with sub-millimeter resolution (using pulses that are a few meters long), at target distances > 10 km, with at least 25 dB signal-to-noise ratio at the receiver. A pulse repetition rate of 20 MHz provides fast update rates, while chirped pulse amplification allows easy amplification of optical signals to the high power levels required for long-range operation. A pulse tagging scheme based on phase modulation is used to demonstrate unambiguous, long-range measurements. In addition, simultaneous measurement of target range and Doppler velocity is performed using a target moving at a speed of over 330 km/h (205 mph) inside the laboratory. Furthermore, spectral phase modulation of the chirped pulses is demonstrated to compensate for the undesirable ripple in the group delay of the chirped pulses. Moreover, spectral amplitude modulation is used to generate pulses with Gaussian temporal intensity profiles, and a two-fold increase in the lidar range resolution (284 µm) is observed.
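The quoted sub-millimeter resolution is consistent with the transform limit for a ~6 nm chirp bandwidth. The sketch below is an illustrative check only; the 1550 nm center wavelength is an assumption (typical for fiber-based systems), not a figure stated in the abstract.

```python
# Transform-limited range resolution for a chirped-pulse lidar:
# delta_R ~ c / (2 * B), where B is the chirp (optical) bandwidth.
c = 299_792_458.0                   # speed of light, m/s
lam = 1550e-9                       # assumed center wavelength, m
dlam = 6e-9                         # ~6 nm optical bandwidth from the abstract
bandwidth_hz = c * dlam / lam**2    # ~750 GHz in frequency terms
resolution_m = c / (2 * bandwidth_hz)   # ~0.2 mm: sub-millimeter, as claimed
```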
-
Date Issued
-
2012
-
Identifier
-
CFE0004423, ucf:49409
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004423
-
-
Title
-
FACTORS INFLUENCING EFFECTIVENESS OF INTERORGANIZATIONAL NETWORKS AMONG CRISIS MANAGEMENT ORGANIZATIONS: A COMPARATIVE PERSPECTIVE.
-
Creator
-
Sahin, Bahadir, Wan, Thomas, University of Central Florida
-
Abstract / Description
-
Crisis management has become one of the most important public policy areas in recent decades, with greater numbers of manmade and natural disasters. History has shown that well-implemented crisis management policies can save lives and reduce costs in a disaster. The literature has offered various suggestions for more effective crisis management policies, with different techniques utilizing different theoretical frameworks. Informal relationships among crisis management employees have been suggested to have a positive impact on crisis management effectiveness, yet it had not been demonstrated with advanced statistical tools whether such a relationship exists. This study considers the crisis management effort as a network effort and employs complex adaptive systems theory in order to understand the factors influencing the effectiveness of crisis management networks. Complex adaptive systems theory holds that more open communication lines in a given network or organization would increase its effectiveness, since the inner processes of the network or organization would obtain more information from the chaotic environment. The quality of informal relationships (casual relationships, social capital, etc.) was hypothesized to be a tool for opening more communication lines within an agency, which would eventually increase the effectiveness of the network constructed by the organization. Based on the theoretical framework, the adaptiveness capacity of the agencies was also tested in order to understand the correlation between adaptation and the effectiveness of crisis management networks. A multiple case-study method was employed to identify incidents that can represent crisis management in full perspective. The terrorist attacks carried out by the same terrorist network that hit New York in 2001, Istanbul in 2003, Madrid in 2004, and London in 2005 were selected. The first response phase of crisis management and policy changes before and after the attacks were discussed.
Public administration processes and other socio-economic conditions of the countries were examined in terms of crisis management structure. The key agencies of the selected crisis management systems were identified with a social network analysis tool, UCINET. Six key agencies per incident were targeted for surveys. The surveys included a nine-item quality of informal relationships scale, a four-item adaptiveness capability scale, and a ten-item perceived effectiveness of crisis management networks scale. Respondents were asked to fill in online surveys in which they could refer to their colleagues in the same incidents. A target of 230 respondents was set, and 246 survey responses were obtained. The surveys formed a structural equation model representing 23 observed factors and 2 latent constructs. Confirmatory factor analysis was utilized to validate the hypothesis-driven conceptual models. The quality of informal relationships was found to have a significant positive impact on perceived crisis management network effectiveness (standardized regression coefficient = .39). Two of the adaptiveness variables, openness to change and intra-organizational training, were also positively correlated with the dependent variable of the study (standardized regression coefficients = .40 and .26, respectively). Differences between the Turkish and American groups suggested a socio-economic difference between the societies. The majority of the respondents were managers of some type, which made it possible to generalize the results to all phases of crisis management. The discussion suggests that improved informal relationships among crisis management employees provide better crisis management during an extreme event. Collaborative social events are offered as a way to improve crisis management effectiveness. The finding on an agency's openness to change suggests that a crisis management organization should be flexible in rules and structure to gain more efficacy.
The other adaptiveness variable, intra-organizational training efforts, was also proposed to have a certain influence on the effectiveness of a crisis management network. The factors building the latent construct of perceived crisis management effectiveness were also found to be important for crisis management, some of which are the ability to carry out generic crisis management functions, mobilize personnel and resources efficiently, process information adequately, blend emergent and established entities, and provide appropriate reports for the news media. The study contributes to complex adaptive systems theory, since the fundamentals of the theory were tested with an advanced quantitative method. Non-linear relationships within a system were tested in order to reveal the correlation the theory suggests, and the results were convincingly positive. The effectiveness of crisis management networks was demonstrated to be successfully measured by a ten-item scale. Future research might utilize more disaster cases, both natural and manmade, search for the impact of different communication tools within a system, and look at the relationships among members of crisis management networks instead of looking within an organization.
-
Date Issued
-
2009
-
Identifier
-
CFE0002709, ucf:48173
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0002709
-
-
Title
-
TEACHING AND ASSESSING CRITICAL THINKING IN RADIOLOGIC TECHNOLOGY STUDENTS.
-
Creator
-
Gosnell, Susan, Biraimah, Karen, University of Central Florida
-
Abstract / Description
-
The purpose of this study was primarily to explore radiography program directors' conceptualization of critical thinking development in radiologic science students. Seven research questions framed three overriding themes: 1) the perceived definition of, and skills associated with, critical thinking; 2) the effectiveness and utilization of teaching strategies for the development of critical thinking; and 3) the appropriateness and utilization of specific assessment measures for documenting critical thinking development. The population for this study included the program directors of all JRCERT-accredited radiography programs in the United States. Questionnaires were distributed via Survey Monkey, a commercial online survey tool, to 620 programs. A forty-seven percent (n = 295) response rate was achieved, with good representation from each of the three recognized program levels (AS, BS, and certificate). Statistical analyses performed on the collected data included descriptive analyses (median, mean, and standard deviation) to ascertain overall perceptions of the definition of critical thinking, levels of agreement regarding the effectiveness of the listed teaching strategies and assessment measures, and the degree of utilization of the same teaching strategies and assessment measures. Chi-squared analyses were conducted to identify differences within each of these themes between program levels and/or between program directors with various levels of educational preparation, as defined by the highest degree earned. Results showed that program directors had a broad and somewhat ambiguous perception of the definition of critical thinking, which included many related cognitive processes that were not always classified as attributes of critical thinking according to the literature but were consistent with definitions and attributes identified as critical thinking by other allied health professions. These common attributes included creative thinking, decision making, problem solving, and clinical reasoning, as well as other higher-order thinking activities such as reflection, judging, and reasoning deductively and inductively. Statistically significant differences were identified for some items based on program level and for one item based on the program director's highest degree. There was general agreement regarding the appropriateness of specific teaching strategies also supported by the literature, with the exception of online discussions and portfolios. The most highly used teaching strategies reported were not completely congruent with the literature and included traditional lectures with in-class discussions and higher-order multiple-choice test items. Significant differences between program levels were identified for only two items. The most highly used assessment measures included clinical competency results, employer surveys, image critique performance, specific course assignments, student surveys, and ARRT exam results. Only one variable showed significant differences between programs at various academic levels.
-
Date Issued
-
2010
-
Identifier
-
CFE0003261, ucf:48518
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003261
-
-
Title
-
Nano-pipette as nanoparticle analyzer and capillary gated ion transistor.
-
Creator
-
Rudzevich, Yauheni, Chow, Lee, Heinrich, Helge, Schulte, Alfons, Yuan, Jiann-Shiun, University of Central Florida
-
Abstract / Description
-
The ability to precisely count inorganic and organic nanoparticles and to measure their size distribution plays a major role in various applications such as drug delivery, nanoparticle counting, and many others. In this work I present a simple resistive pulse method that allows the detection of translocations and the counting and measurement of the size and velocity distributions of silica nanoparticles and liposomes with diameters from 50 nm to 250 nm. This technique is based on the Coulter counter technique, but with nanometer-sized pores. It was found that the ionic current drops when nanoparticles enter the nanopore of a pulled micropipette, producing a clear translocation signal. Pulled borosilicate micropipettes with openings of 50-350 nm were used as the detecting instrument. This method provides a direct, fast, and cost-effective way to characterize inorganic and organic nanoparticles in a solution. In this work I also introduce a newly developed Capillary Ionic Transistor (CIT), a nanodevice that provides control of ionic transport through a nanochannel by a gate voltage. The CIT is an ionic transistor that employs a pulled capillary as the nanochannel, with a tip diameter smaller than 100 nm. We observed that the gate voltage applied to a gate electrode, deposited on the outer wall of the capillary, affects the conductance of the nanochannel, due to a change of surface charge at the solution/capillary interface. A negative gate voltage corresponds to lower conductivity, and a positive gate voltage increases the conductance of the channel. This effect strongly depends on the size of the channel; in general, at least one dimension of the channel has to be small enough for the electrical double layers to overlap.
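For readers unfamiliar with resistive-pulse sizing, the expected blockade depth can be estimated with the textbook small-particle (Maxwell) approximation, dR/R ~ d^3 / (D^2 * L), for a sphere of diameter d in a cylindrical pore of diameter D and effective length L. The geometry below is illustrative only and is not taken from the dissertation:

```python
# Small-particle (Maxwell) estimate of the relative resistive-pulse
# amplitude for a sphere in a cylindrical pore: dR/R ~ d**3 / (D**2 * L).
# Illustrative geometry, not the dissertation's measured values.
d = 100e-9    # particle diameter, m
D = 300e-9    # pore (pipette opening) diameter, m
L = 500e-9    # effective sensing-zone length, m
relative_pulse = d**3 / (D**2 * L)   # a few percent current blockade
```

The cubic dependence on particle diameter is what makes this scheme sensitive to size: doubling d gives roughly an eight-fold deeper pulse.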
Show less
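As a rough illustration of the resistive-pulse (Coulter) principle described above: for a particle of diameter d much smaller than a cylindrical pore of diameter D and effective length L, the relative resistance change is approximately dR/R ~ d^3 / (L D^2), so a pulse amplitude can be inverted to estimate particle size. The sketch below is a hypothetical first-order calculation under that approximation, not the analysis used in the dissertation; all numbers are made up.

```python
# First-order Coulter-counter sizing: for d << D, dR/R ~ d^3 / (L * D^2)
# (a rough small-particle approximation; illustration only).

def pulse_amplitude(d_nm, pore_d_nm, pore_len_nm):
    """Relative resistance change dR/R for a sphere of diameter d_nm."""
    return d_nm ** 3 / (pore_len_nm * pore_d_nm ** 2)

def diameter_from_pulse(dr_over_r, pore_d_nm, pore_len_nm):
    """Invert the approximation to estimate particle diameter in nm."""
    return (dr_over_r * pore_len_nm * pore_d_nm ** 2) ** (1.0 / 3.0)

if __name__ == "__main__":
    # A 100 nm particle in a (hypothetical) 350 nm pore, 1000 nm long:
    amp = pulse_amplitude(100, 350, 1000)
    print(round(diameter_from_pulse(amp, 350, 1000), 1))  # recovers 100.0
```

In practice the effective pore geometry of a pulled micropipette is conical rather than cylindrical, so real analyses calibrate against particles of known size.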
-
Date Issued
-
2014
-
Identifier
-
CFE0005880, ucf:50882
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005880
-
-
Title
-
Assessing the Impact of Economies of Scale and Uncontrollable Factors on the Performance of U.S. Cities.
-
Creator
-
Allaf, Mamoon, Martin, Lawrence, Wan, Thomas, Kapucu, Naim, Baker, Paul, University of Central Florida
-
Abstract / Description
-
Despite the increased interest among local governments in collecting performance-measurement data, empirical evidence is still limited regarding the extent to which these data are used to assess the impact of economies of scale and uncontrollable factors on efficiency. Data envelopment analysis (DEA) is a linear programming method designed to estimate the relative efficiency of decision-making units. In addition to assessing relative efficiency, DEA can estimate scale efficiency and incorporate the impact of uncontrollable factors. Using data from the International City/County Management Association (ICMA), this study applied DEA to evaluate the impact of economies of scale and uncontrollable factors on the relative efficiency of municipal service delivery in the United States. The findings of this doctoral dissertation show that uncontrollable variables such as population density, unemployment, and household income suppress the relative efficiency of local governments. Moreover, the findings imply that the prevalence of economies of scale in city governments depends on the types of services these governments provide.
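In the simplest case of one input and one output, the relative efficiency score that DEA computes reduces to each unit's output-to-input ratio normalized by the best observed ratio, so the frontier unit scores 1.0. The toy sketch below illustrates only that special case with hypothetical units and figures; the dissertation's multi-input, multi-output analysis requires solving a linear program per decision-making unit.

```python
# DEA relative efficiency, single-input/single-output special case:
# efficiency(j) = (y_j / x_j) / max_k (y_k / x_k).

def dea_single(units):
    """units: dict name -> (input, output). Returns dict name -> efficiency."""
    ratios = {name: y / x for name, (x, y) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

if __name__ == "__main__":
    # Hypothetical cities: (spending, service output)
    cities = {"A": (2.0, 2.0), "B": (4.0, 4.0), "C": (4.0, 2.0)}
    print(dea_single(cities))  # A and B are efficient (1.0); C scores 0.5
```

Note that A and B tie at 1.0 despite different sizes, which is exactly the constant-returns-to-scale assumption that scale-efficiency analysis relaxes.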
-
Date Issued
-
2012
-
Identifier
-
CFE0004201, ucf:49002
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004201
-
-
Title
-
Pedestrian Safety Analysis through Effective Exposure Measures and Examination of Injury Severity.
-
Creator
-
Shah, Md Imran, Abdel-Aty, Mohamed, Eluru, Naveen, Lee, JaeYoung, University of Central Florida
-
Abstract / Description
-
Pedestrians are considered the most vulnerable road users, as they are directly exposed to traffic crashes. In 2014, 4,884 pedestrians were killed and 65,000 injured in the United States. Pedestrian safety is a growing concern in the development of a sustainable transportation system, yet safety analysis often suffers from a lack of accurate pedestrian trip information. In such cases, determining effective exposure measures is the most appropriate safety analysis approach. It is also very important to clearly understand the relationship between pedestrian injury severity and the factors contributing to higher injury severity. Accurate safety analysis can play a vital role in the development of appropriate safety countermeasures and policies for pedestrians.
Since pedestrian volume data is the most important information in safety analysis but is rarely available, the first part of the study aims at identifying surrogate measures for pedestrian exposure at intersections. A two-step process is implemented: in the first step, Tobit and generalized linear models are developed for predicting pedestrian trips (i.e., exposure models); in the second step, negative binomial and zero-inflated negative binomial crash models are developed using the predicted pedestrian trips. The results indicate that, among the exposure models, the Tobit model performs best in describing pedestrian exposure. The identified exposure-relevant factors are the presence of schools, car ownership, pavement condition, sidewalk width, bus ridership, intersection control type, and the presence of sidewalk barriers. The t-test and Wilcoxon signed-rank test results show no significant difference between the observed and predicted pedestrian trips. The process implemented can help in estimating reliable safety performance functions even when pedestrian trip data is unavailable.
The second part of the study focuses on analyzing pedestrian injury severity for nine counties in Central Florida. The study region covers the Orlando area, which has the second-worst pedestrian death rate in the country. Since the dependent variable, injury severity, is ordinal, an ordered logit model was developed to identify the factors of pedestrian injury severity. The explanatory variables can be classified as pedestrian/driver characteristics (e.g., age, gender), roadway traffic and geometric conditions (e.g., shoulder presence, roadway speed), and crash environment characteristics (e.g., light, road surface, work zone). The results show that drug/alcohol involvement, pedestrians in a hurry, a roadway speed limit of 40 mph or more, dark conditions (lighted and unlighted), and elderly pedestrians are the primary contributing factors to severe pedestrian crashes in Central Florida. Crashes at intersections and on local roads result in lower injury severity. The area under the ROC (receiver operating characteristic) curve is 0.75, indicating that the model performs reasonably well. Finally, the study validated the model using the k-fold cross-validation method. The results could be useful for transportation officials in conducting further pedestrian safety analysis and taking appropriate safety interventions. Walking is cost-effective, environmentally friendly, and has significant health benefits; realizing these benefits requires, above all, ensuring safer roads for pedestrians.
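The ROC summary reported above (area under the curve of 0.75) has a standard rank-based interpretation: the probability that a randomly chosen severe case receives a higher predicted score than a randomly chosen non-severe case, with ties counting one half. The snippet below is a generic illustration of that computation on made-up scores, not the dissertation's data.

```python
# AUC as the Wilcoxon-Mann-Whitney statistic: the fraction of
# (positive, negative) pairs in which the positive case scores higher
# (ties count 0.5).

def roc_auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

if __name__ == "__main__":
    y = [1, 1, 0, 0]            # 1 = severe injury, 0 = non-severe
    s = [0.9, 0.4, 0.6, 0.2]    # hypothetical model scores
    print(roc_auc(y, s))        # 0.75: 3 of 4 pairs ranked correctly
```

A value of 0.5 corresponds to chance-level ranking, which is why 0.75 is read as "reasonably good" discrimination.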
-
Date Issued
-
2017
-
Identifier
-
CFE0006656, ucf:51224
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006656
-
-
Title
-
An investigation of physiological measures in a marketing decision task.
-
Creator
-
Lerma, Nelson, Karwowski, Waldemar, Elshennawy, Ahmad, Xanthopoulos, Petros, Reinerman, Lauren, University of Central Florida
-
Abstract / Description
-
The objective of the present study was to understand the use of physiological measures as an alternative to traditional market research tools, such as self-report measures and focus groups. For decades, corporations and researchers have relied almost exclusively on traditional measures to gain insight into consumer behavior. Traditional methods have often failed to accurately predict consumer demand, and this has prompted corporations to explore alternative methods that will accurately forecast future sales. One of the most promising alternatives currently being investigated is the use of physiological measures as an indication of consumer preference. This field, also referred to as neuromarketing, blends the principles of psychology, neuroscience, and market research to explore consumer behavior from a physiological perspective. The goal of neuromarketing is to capture consumer behavior through the use of physiological sensors. This study investigated the extent to which physiological measures were correlated with consumer preferences by utilizing five physiological sensors: two neurological sensors (EEG and ECG), two hemodynamic sensors (TCD and fNIR), and one optic sensor (eye tracking). All five sensors were used simultaneously to capture and record physiological changes during four distinct marketing tasks. The results showed that only one physiological sensor, EEG, was indicative of concept type and intent to purchase; the remaining four showed no significant differences for concept type or intent to purchase.
Furthermore, machine learning algorithms (MLAs) were used to determine the extent to which MLAs (Naïve Bayes, Multilayer Perceptron, K-Nearest Neighbor, and Logistic Regression) could classify physiological responses against self-report measures obtained during a marketing task. The results demonstrated that the Multilayer Perceptron, on average, performed better than the other MLAs for intent to purchase and concept type. It was also evident that the models fared best with the most popular concept when categorizing the data based on intent to purchase or final selection. Overall, the four models performed well at categorizing the most popular concept and gave some indication of the extent to which physiological measures are capable of capturing intent to purchase. The study was intended to help better understand the possibilities and limitations of physiological measures in the field of market research. Based on the results obtained, this study demonstrated that certain physiological sensors are capable of capturing emotional changes, but only when the emotional response between two concepts is significantly different. Overall, physiological measures hold great promise for the study of consumer behavior, providing insight into the relationship between emotions and intentions in market research.
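One of the classifiers compared above, K-Nearest Neighbor, can be sketched in a few lines: each physiological response vector is assigned the majority label of its k closest training vectors. The snippet is a generic illustration with toy two-dimensional feature vectors and hypothetical labels, not the study's actual feature set or data.

```python
# Majority-vote k-nearest-neighbor classification (Euclidean distance).
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs; returns predicted label."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    # Hypothetical normalized physiological features -> self-reported intent
    data = [((0.1, 0.2), "no-buy"), ((0.2, 0.1), "no-buy"),
            ((0.9, 0.8), "buy"), ((0.8, 0.9), "buy"), ((0.7, 0.7), "buy")]
    print(knn_predict(data, (0.85, 0.8)))  # "buy"
```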
-
Date Issued
-
2015
-
Identifier
-
CFE0006345, ucf:51563
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006345
-
-
Title
-
DIGITIZATION PROTOCOLS AND APPLICATIONS FOR LASER SCANNING HUMAN BONE IN FORENSIC ANTHROPOLOGY.
-
Creator
-
Filiault, Matthew, Schultz, John, University of Central Florida
-
Abstract / Description
-
In medico-legal investigations involving unidentified skeletal remains, forensic anthropologists commonly assist law enforcement and medical examiners in their analysis and identification. The traditional documentation techniques employed by the forensic anthropologist during their analysis include notes, photographs, measurements and radiographic images. However, relevant visual information of the skeleton can be lacking in morphological details in 2D images. By creating a 3D representation of individual bones using a laser scanner, it would be possible to overcome this limitation. Now that laser scanners have become increasingly affordable, this technology should be incorporated in the documentation methodologies of forensic anthropology laboratories. Unfortunately, this equipment is rarely used in forensic anthropology casework. The goal of this project is to investigate the possible visualization applications that can be created from digitized surface models of bone for use in medico-legal investigations. This research will be achieved in two phases. First, examples of human bone as well as replicas of bone will be scanned using a NextEngine™ laser scanner. In conjunction with this will be the exploration and documentation of protocols for scanning different bone types and processing the scan data for creating a 3D model. The second phase will investigate how the resulting 3D model can be used in lieu of the actual remains to achieve improved documentation methodologies through the use of several commercial computer graphics programs. The results demonstrate that an array of visual applications can be easily created from a 3D file of bone, including virtual curation, measurement, illustration and the virtual reconstruction of fragmented bone. Based on the findings of this project, the implementation of laser scanning technology is recommended for forensic anthropology labs to enhance documentation, analysis and presentation of human bone.
-
Date Issued
-
2012
-
Identifier
-
CFH0004287, ucf:44907
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH0004287
-
-
Title
-
Experimental confirmation of ballistic nanofriction and quasiparticle interference in Dirac materials.
-
Creator
-
Lodge, Michael, Ishigami, Masahiro, Kaden, William, Schelling, Patrick, Del Barco, Enrique, Roy, Tania, University of Central Florida
-
Abstract / Description
-
This dissertation is broadly divided into two parts. The first part details the development and use of an experimental apparatus to measure dry nanofriction for a well-defined interface at high sliding speeds. I leverage the sensitivity of a quartz crystal microbalance (QCM) to determine the drag coefficient of an ensemble of gold nanocrystals sliding on graphene at speeds up to 11 cm/s. I discuss theories of velocity-dependent friction, especially at high sliding speeds, and QCM modeling. I also discuss our synthesis protocols for graphene and molybdenum disulfide, as well as our protocol for fabricating a clean, graphene-laminated QCM device and nanocrystal ensemble. The design and fabrication of our QCM oscillator circuit are presented in detail. The quantitatively measured drag coefficient is compared against molecular dynamics simulations at both low and high sliding speeds. We show evidence of a predicted ultra-low-friction regime and find that the interaction energy between gold nanocrystals and graphene is lower than previously assumed.
In the second part of this dissertation, I detail the band structure measurement of a novel semimetal using scanning tunneling microscopy. In particular, I measured the energy dependence of quasiparticle interference patterns at the surface of zirconium silicon sulfide (ZrSiS), a topological nodal-line semimetal whose charge-carrier quasiparticles possess a pseudospin degree of freedom. The aims of this study were to (1) discover the shape of the band structure above the Fermi level along a high-symmetry direction, (2) discover the energetic location of the line node in the same high-symmetry direction, and (3) discover the selection rules for k transitions. This study confirms the predicted linearity in E(k) of the band structure above the Fermi level. Additionally, we observe an energy-dependent mechanism for pseudospin scattering. This study also provides the first experimentally derived estimate of the line-node position in E(k).
-
Date Issued
-
2018
-
Identifier
-
CFE0007218, ucf:52222
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007218
-
-
Title
-
A Framework of Critical Success Factors for Business Organizations that Lead to Performance Excellence Based on a Financial and Quality Systems Assessment.
-
Creator
-
Francisco, Melissa, Elshennawy, Ahmad, Karwowski, Waldemar, Rabelo, Luis, Xanthopoulos, Petros, Weheba, Gamal, University of Central Florida
-
Abstract / Description
-
One of the most important tasks that business leaders undertake in order to achieve a superior market position is strategic planning. Beyond this obligation, business owners desire to maximize profit and maintain steady growth; to do this, resources must be invested as efficiently as possible in pursuit of performance excellence. Adjusting business operations quickly, however, especially in times of economic uncertainty, is extremely difficult. Business leaders therefore need insight into which elements of organizational improvement are most effective, so that they can strategically invest their resources to achieve superior performance in the most efficient way possible.
This research examines the results of companies with a demonstrated ability to achieve performance excellence as defined by the National Institute of Standards and Technology's Malcolm Baldrige Criteria for Performance Excellence. It examined award-winning applications to determine common input factors, compared the business results of a subset of those award winners with the overall market over an 11-year time frame, and then investigated the profitability, liquidity, debt management, asset management, and per-share performance ratios of award winners compared with their industry peers over the same 11 years.
The main focus of this research is to determine whether performance excellence best practices have created value for shareholders and business owners. This objective is achieved through analysis of the performance results of award-winning companies. This research demonstrates that the integration of efforts associated with performance excellence is in fact advantageous.
-
Date Issued
-
2014
-
Identifier
-
CFE0005331, ucf:50503
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005331
-
-
Title
-
Engineering and Application of Ultrafast Laser Pulses and Filamentation in Air.
-
Creator
-
Barbieri, Nicholas, Richardson, Martin, University of Central Florida
-
Abstract / Description
-
Continuing advances in laser and photonic technology have seen the development of lasers with increasing power and increasingly short pulse widths, which have become available over an increasing range of wavelengths. As the availability of laser sources grows, so do their applications. To make better use of this improving technology, understanding and controlling laser propagation in free space is critical, as is understanding the interaction between laser light and matter.
The need to better control the light obtained from increasingly advanced laser sources leads to the emergence of beam engineering: the systematic understanding and control of light through refractive media and free space. Beam engineering enables control over the beam shape, energy, and spectral composition during propagation, which can be achieved through a variety of means. In this dissertation, several methods of beam engineering are investigated; these methods enable improved control over the shape and propagation of laser light. Laser-matter interaction is also investigated, as it provides both a means to control the propagation of pulsed laser light through the atmosphere and a means to generate remote sources of radiation.
-
Date Issued
-
2013
-
Identifier
-
CFE0004650, ucf:49881
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004650
-
-
Title
-
Uncovering The Sub-Text: Presidents' Emotional Expressions and Major Uses of Force.
-
Creator
-
Assaf, Elias, Houghton, David, Kim, Myunghee, Dolan, Thomas, University of Central Florida
-
Abstract / Description
-
The global context of decision making continues to adapt in response to international threats, and political psychologists have therefore considered decision-making processes regarding major uses of force a key area of interest. Although presidential personality has been widely studied as a mitigating factor in the decision-making patterns leading to uses of force, traditional theories have not accounted for the emotions of individuals as they affect political actions and are used to frame public perception of the use of force. This thesis therefore measures expressed emotion and cognitive expressions, in the form of expressed aggression, passivity, blame, praise, certainty, realism, and optimism, as predictors of subsequent major uses of force. Since aggression and blame are precipitated by anger and perceived vulnerability, they are theorized to foreshadow increased uses of force (Gardner and Moore 2008). Conversely, passivity and praise are indicative of empathy and joy, respectively, and are not expected to precede aggressive behavior conducted to maintain emotional regulation (Roberton, Daffer, and Bucks 2012). Additionally, the three cognitive variables of interest expand on the existing literature on beliefs and decision making by such authors as Walker (2010), Winter (2003), and Hermann (2003). DICTION 6.0 is used to analyze the text of presidential news conferences, candidate debates, and State of the Union speeches given between 1945 and 2000, as stored by The American Presidency Project (Hart and Carroll 2012). Howell and Pevehouse's (2005) quantitative assessment of quarterly U.S. uses of force between 1945 and 2000 is employed to quantify instances of major uses of force. Results show systematic differences among the traits expressed by presidents, with most expressions staying consistent across spontaneous speech contexts. Additionally, State of the Union speeches consistently yielded the highest scores across the expressed traits measured, supporting the theory that prepared speech is used to emotionally frame situations and set up emotional interpretations of events for the public. Time-sensitive regression analyses indicate that expressed aggression in State of the Union addresses is the only significant predictor of major uses of force by the administration. That said, other studies may use the comparative findings presented herein to further establish a robust model of personality that accounts for individual dispositions toward emotional expression as a means of framing audiences' emotional interpretation of events.
-
Date Issued
-
2014
-
Identifier
-
CFE0005300, ucf:50513
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005300
-
-
Title
-
REFRACTIVE INDICES OF LIQUID CRYSTALS AND THEIR APPLICATIONS IN DISPLAY AND PHOTONIC DEVICES.
-
Creator
-
Li, Jun, Wu, Shin-Tson, University of Central Florida
-
Abstract / Description
-
Liquid crystals (LCs) are important materials for flat panel display and photonic devices. Most LC devices use electrical field-, magnetic field-, or temperature-induced refractive index change to modulate the incident light. Molecular constituents, wavelength, and temperature are the three primary factors determining the liquid crystal refractive indices: ne and no for the extraordinary and ordinary rays, respectively. In this dissertation, we derive several physical models for describing...
Show moreLiquid crystals (LCs) are important materials for flat panel display and photonic devices. Most LC devices use electrical field-, magnetic field-, or temperature-induced refractive index change to modulate the incident light. Molecular constituents, wavelength, and temperature are the three primary factors determining the liquid crystal refractive indices: ne and no for the extraordinary and ordinary rays, respectively. In this dissertation, we derive several physical models for describing the wavelength and temperature effects on liquid crystal refractive indices, average refractive index, and birefringence. Based on these models, we develop some high temperature gradient refractive index LC mixtures for photonic applications, such as thermal tunable liquid crystal photonic crystal fibers and thermal solitons. Liquid crystal refractive indices decrease as the wavelength increase. Both ne and no saturate in the infrared region. Wavelength effect on LC refractive indices is important for the design of direct-view displays. In Chapter 2, we derive the extended Cauchy models for describing the wavelength effect on liquid crystal refractive indices in the visible and infrared spectral regions based on the three-band model. The three-coefficient Cauchy model could be used for describing the refractive indices of liquid crystals with low, medium, and high birefringence, whereas the two-coefficient Cauchy model is more suitable for low birefringence liquid crystals. The critical value of the birefringence is deltan~0.12. Temperature is another important factor affecting the LC refractive indices. The thermal effect originated from the lamp of projection display would affect the performance of the employed liquid crystal. In Chapter 3, we derive the four-parameter and three-parameter parabolic models for describing the temperature effect on the LC refractive indices based on Vuks model and Haller equation. We validate the empirical Haller equation quantitatively. 
We also validate that the average refractive index of liquid crystal decreases linearly as the temperature increases. Liquid crystals exhibit a large thermal nonlinearity which is attractive for new photonic applications using photonic crystal fibers. We derive the physical models for describing the temperature gradient of the LC refractive indices, ne and no, based on the four-parameter model. We find that LC exhibits a crossover temperature To at which dno/dT is equal to zero. The physical models of the temperature gradient indicate that ne, the extraordinary refractive index, always decreases as the temperature increases since dne/dT is always negative, whereas no, the ordinary refractive index, decreases as the temperature increases when the temperature is lower than the crossover temperature (dno/dT<0 when the temperature is lower than To) and increases as the temperature increases when the temperature is higher than the crossover temperature (dno/dT>0 when the temperature is higher than To ). Measurements of LC refractive indices play an important role for validating the physical models and the device design. Liquid crystal is anisotropic and the incident linearly polarized light encounters two different refractive indices when the polarization is parallel or perpendicular to the optic axis. The measurement is more complicated than that for an isotropic medium. In Chapter 4, we use a multi-wavelength Abbe refractometer to measure the LC refractive indices in the visible light region. We measured the LC refractive indices at six wavelengths, lamda=450, 486, 546, 589, 633 and 656 nm by changing the filters. We use a circulating constant temperature bath to control the temperature of the sample. The temperature range is from 10 to 55 oC. 
The refractive index data measured include five low-birefringence liquid crystals, MLC-9200-000, MLC-9200-100, MLC-6608 (delta_epsilon=-4.2), MLC-6241-000, and UCF-280 (delta_epsilon=-4); four middle-birefringence liquid crystals, 5CB, 5PCH, E7, E48 and BL003; four high-birefringence liquid crystals, BL006, BL038, E44 and UCF-35, and two liquid crystals with high dno/dT at room temperature, UCF-1 and UCF-2. The refractive indices of E7 at two infrared wavelengths lamda=1.55 and 10.6 um are measured by the wedged-cell refractometer method. The UV absorption spectra of several liquid crystals, MLC-9200-000, MLC-9200-100, MLC-6608 and TL-216 are measured, too. In section 6.5, we also measure the refractive index of cured optical films of NOA65 and NOA81 using the multi-wavelength Abbe refractometer. In Chapter 5, we use the experimental data measured in Chapter 4 to validate the physical models we derived, the extended three-coefficient and two-coefficient Cauchy models, the four-parameter and three-parameter parabolic models. For the first time, we validate the Vuks model using the experimental data of liquid crystals directly. We also validate the empirical Haller equation for the LC birefringence delta_n and the linear equation for the LC average refractive index . The study of the LC refractive indices explores several new photonic applications for liquid crystals such as high temperature gradient liquid crystals, high thermal tunable liquid crystal photonic crystal fibers, the laser induced 2D+1 thermal solitons in nematic crystals, determination for the infrared refractive indices of liquid crystals, comparative study for refractive index between liquid crystals and photopolymers for polymer dispersed liquid crystal (PDLC) applications, and so on. In Chapter 6, we introduce these applications one by one. First, we formulate two novel liquid crystals, UCF-1 and UCF-2, with high dno/dT at room temperature. 
The dno/dT of UCF-1 is about 4X higher than that of 5CB at room temperature. Second, we infiltrate UCF-1 into the micro holes around the silica core of a section of three-rod core PCF and set up a highly thermal tunable liquid crystal photonic crystal fiber. The guided mode has an effective area of 440 Ým2 with an insertion loss of less than 0.5dB. The loss is mainly attributed to coupling losses between the index-guided section and the bandgap-guided section. The thermal tuning sensitivity of the spectral position of the bandgap was measured to be 27 nm/degree around room temperature, which is 4.6 times higher than that using the commercial E7 LC mixture operated at a temperature above 50 degree C. Third, the novel liquid crystals UCF-1 and UCF-2 are preferred to trigger the laser-induced thermal solitons in nematic liquid crystal confined in a capillary because of the high positive temperature gradient at room temperature. Fourth, we extrapolate the refractive index data measured at the visible light region to the near and far infrared region basing on the extended Cauchy model and four-parameter model. The extrapolation method is validated by the experimental data measured at the visible light and infrared light regions. Knowing the LC refractive indices at the infrared region is important for some photonic devices operated in this light region. Finally, we make a completely comparative study for refractive index between two photocurable polymers (NOA65 and NOA81) and two series of Merck liquid crystals, E-series (E44, E48, and E7) and BL-series (BL038, BL003 and BL006) in order to optimize the performance of polymer dispersed liquid crystals (PDLC). Among the LC materials we studied, BL038 and E48 are good candidates for making PDLC system incorporating NOA65. The BL038 PDLC cell shows a higher contrast ratio than the E48 cell because BL038 has a better matched ordinary refractive index, higher birefringence, and similar miscibility as compared to E48. 
Liquid crystals having good miscibility with the polymer, a matched ordinary refractive index, and higher birefringence help to improve the PDLC contrast ratio for display applications. In Chapter 7, we give a general summary of the dissertation.
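The extended Cauchy model and the Haller equation mentioned in the abstract are standard closed-form expressions, and the wavelength extrapolation it describes can be sketched in a few lines. The coefficient values below are illustrative placeholders, not the dissertation's fitted parameters.

```python
# Sketch of two models referenced above (coefficients are hypothetical,
# chosen only to illustrate the functional forms).

def cauchy_index(wavelength_um, A, B, C):
    """Extended three-coefficient Cauchy model:
    n(lambda) = A + B/lambda**2 + C/lambda**4, lambda in micrometers."""
    return A + B / wavelength_um**2 + C / wavelength_um**4

def haller_birefringence(T, Tc, dn0, beta=0.25):
    """Haller equation: delta_n(T) = dn0 * (1 - T/Tc)**beta,
    with T and the clearing point Tc in kelvin."""
    return dn0 * (1.0 - T / Tc) ** beta

# Extrapolation idea: fit A, B, C at visible wavelengths, then evaluate
# the same expression at an infrared wavelength such as 1.55 um.
A, B, C = 1.68, 0.0078, 0.0028        # assumed values for an E7-like LC
n_visible = cauchy_index(0.589, A, B, C)
n_infrared = cauchy_index(1.55, A, B, C)
assert n_infrared < n_visible          # normal dispersion: n falls with lambda
```

Because the 1/λ² and 1/λ⁴ terms vanish at long wavelengths, the extrapolated index approaches the constant A in the infrared, which is why visible-region fits can predict infrared indices.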
-
Date Issued
-
2005
-
Identifier
-
CFE0000808, ucf:46677
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000808
-
-
Title
-
Investigating the universality and comprehensive ability of measures to assess the state of workload.
-
Creator
-
Abich, Julian, Reinerman, Lauren, Lackey, Stephanie, Szalma, James, Taylor, Grant, University of Central Florida
-
Abstract / Description
-
Measures of workload have been developed on the basis of various definitions: some are designed to capture the multi-dimensional aspects of a unitary resource pool (Kahneman, 1973), while others are developed on the basis of multiple resource theory (Wickens, 2002). Although many theory-based workload measures exist, others have often been constructed to serve the purpose of specific experimental tasks. As a result, it is likely that not every workload measure is reliable and valid for all tasks, much less for every domain. To date, no single measure that has been systematically tested across experimental tasks, domains, and other measures is considered a universal measure of workload. Most researchers would argue that multiple measures from various categories should be applied to a given task to comprehensively assess workload. The goal of Study 1, to establish task load manipulations for two theoretically different tasks that induce distinct levels of workload as assessed by both subjective and performance measures, was achieved. The results of the subjective responses support standardization and validation of the tasks, and of the demands of each task, for investigating workload. After investigating the use of subjective and objective measures of workload to identify a universal and comprehensive measure or set of measures, it can only be concluded from Study 2 that no such measure or set of measures exists. Arguably, this is not to say that one will never be conceived and developed, but at this time, one does not reside in the psychometric catalog. Instead, it appears that a more suitable approach is to customize a set of workload measures based on the task. The novel approach of assessing the sensitivity and comprehensive ability of conjointly utilizing subjective, performance, and physiological workload measures for theoretically different tasks within the same domain contributes to theory by laying the foundation for improved methodology for researching workload.
The applied contribution of this project is a stepping-stone toward developing complex profiles of workload for use in closed-loop systems, such as human-robot team interaction. Identifying the best combination of workload measures enables human factors practitioners, trainers, and task designers to improve the methodology and evaluation of system designs, training requirements, and personnel selection.
-
Date Issued
-
2013
-
Identifier
-
CFE0005119, ucf:50675
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005119
-
-
Title
-
SPRAY COOLING FOR LAND, SEA, AIR AND SPACE BASED APPLICATIONS,A FLUID MANAGEMENT SYSTEM FOR MULTIPLE NOZZLE SPRAY COOLING AND A GUIDE TO HIGH HEAT FLUX HEATER DESIGN.
-
Creator
-
Glassman, Brian, Chow, Louis, University of Central Florida
-
Abstract / Description
-
This thesis is divided into four distinct chapters, all linked by the topic of spray cooling. Chapter one gives a detailed categorization of current and future spray cooling applications and reviews the major advantages and disadvantages that spray cooling has over other high heat flux cooling techniques. Chapter two outlines the developmental goals of spray cooling, which are to increase the output of a current system and to make new technologies technically feasible. Furthermore, this chapter outlines in detail the impact that land, air, sea, and space environments have on the cooling system and what technologies could be enabled in each environment with the aid of spray cooling. In particular, the heat exchanger, condenser, and radiator are analyzed in their corresponding environments. Chapter three presents an experimental investigation of a fluid management system for a large-area, multiple-nozzle spray cooler. A fluid management (suction) system was used to control the liquid film thickness needed for effective heat transfer. An array of sixteen pressure-atomized spray nozzles with an embedded fluid suction system was constructed. Two surfaces were spray tested: a clear grooved Plexiglas plate used for visualization, and a bottom-heated grooved 4.5 × 4.5 cm² copper plate used to determine the heat flux. The suction system utilized an array of thin copper tubes to extract excess liquid from the cooled surface. Pure water was ejected from two spray nozzle configurations at flow rates of 0.7 L/min to 1 L/min per nozzle. It was found that the fluid management system provided fluid removal efficiencies of 98% with a 4-nozzle array, and 90% with the full 16-nozzle array, for the downward-spraying orientation. The corresponding heat fluxes for the 16-nozzle configuration were found with and without the aid of the fluid management system.
It was found that the fluid management system increased heat fluxes by an average of 30 W/cm² at similar values of superheat. Unfortunately, the effectiveness of this array at removing heat at full suction is approximately 50% and 40% of that of a single nozzle at superheats of 10 °C and 15 °C, respectively. The heat transfer data more closely resembled convective pool boiling. Thus, it was concluded that the poor heat transfer was due to flooding, which made the heat transfer mechanism mainly forced convective boiling rather than spray cooling. Finally, Chapter four gives a detailed guide for the design and construction of a high heat flux heater for experimental uses where accurate measurements of surface temperatures and heat fluxes are extremely important. The heater designs presented allow for different testing applications; however, an emphasis is placed on heaters designed for use with spray cooling.
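The removal-efficiency figures in the abstract follow from simple flow bookkeeping, sketched below. The efficiency definition (extracted flow divided by total sprayed flow) and the helper names are assumptions for illustration, not taken from the thesis.

```python
# Back-of-the-envelope check of the fluid-removal figures reported above.
# Efficiency is assumed to mean: liquid extracted by the suction tubes
# divided by total liquid sprayed onto the surface.

def sprayed_flow_lpm(n_nozzles, flow_per_nozzle_lpm):
    """Total liquid delivered to the surface, in L/min."""
    return n_nozzles * flow_per_nozzle_lpm

def removal_efficiency(extracted_lpm, sprayed_lpm):
    """Fraction of the sprayed liquid carried away by suction."""
    return extracted_lpm / sprayed_lpm

# 16 nozzles at 0.7 L/min each deliver 11.2 L/min in total, so a 90%
# removal efficiency implies the suction system carries away ~10.1 L/min.
total = sprayed_flow_lpm(16, 0.7)
required_suction = 0.90 * total
```

The excess liquid not removed (here roughly 1.1 L/min) is what accumulates as a film and, per the conclusion above, shifts the mechanism toward forced convective boiling.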
-
Date Issued
-
2005
-
Identifier
-
CFE0000473, ucf:46351
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000473