Search results for: processing
- Title
- Improved Interpolation in SPH in Cases of Less Smooth Flow.
- Creator
-
Brun, Oddny, Wiegand, Rudolf, Pensky, Marianna, University of Central Florida
- Abstract / Description
-
We introduced a method presented in Information Field Theory (IFT) [Abramovich et al., 2007] to improve interpolation in Smoothed Particle Hydrodynamics (SPH) in cases of less smooth flow. The method makes use of wavelet theory combined with B-splines for interpolation. The idea is to identify any jumps a function may have and then reconstruct the smoother segments between the jumps. Compared to a particularly challenging SPH application, our method demonstrated superior capability to conserve jumps and to interpolate the smoother segments of the function more accurately. Our results also demonstrated increased computational efficiency with limited loss in accuracy, as the number of multiplications and the execution time were reduced. Similar benefits were observed for functions with spikes analyzed by the same method. Lesser, but similar, effects were also demonstrated for real-life data sets of a less smooth nature. SPH is widely used in modeling and simulation of the flow of matter. SPH presents advantages over grid-based methods in terms of both computational efficiency and accuracy, in particular when dealing with less smooth flow. The results we achieved through our research improve the model in cases of less smooth flow, in particular flow with jumps and spikes. Until now, such improvements have been sought through modifications to the models' physical equations and/or kernel functions, and have only partially been able to address the issue. By introducing wavelet theory and IFT to a field of science that, to our knowledge, does not currently utilize these methods, this research lays the groundwork for future research ideas to benefit SPH. Among those ideas are further development of criteria for wavelet selection, the use of smoothing splines for SPH interpolation, and the incorporation of Bayesian field theory. Improving the method's accuracy, stability, and efficiency under more challenging conditions, such as flow with jumps and spikes, will benefit applications across a wide area of science. In medicine alone, such improvements will further increase real-time diagnostics, treatment, and training opportunities, because jumps and spikes are often the characteristics of significant physiological and anatomical conditions such as pulsatile blood flow, peristaltic intestine contractions, and the appearance of organs' edges in imaging.
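The jump-detect-then-reconstruct idea described in this abstract can be sketched in a few lines. The following Python toy is illustrative only, not the dissertation's actual algorithm: it flags jumps with a simple Haar-style first-difference detector (a stand-in for the full wavelet machinery) and fits an independent B-spline to each smooth segment. All function names, the test signal, and the threshold are hypothetical.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

def detect_jumps(y, thresh):
    # Haar-style detail coefficients: a large first difference marks a jump.
    d = np.abs(np.diff(y))
    return np.where(d > thresh)[0] + 1      # index where a new segment starts

def fit_segments(x, y, jumps, k=3):
    # Fit an independent B-spline on each smooth segment between jumps.
    bounds = [0, *jumps, len(x)]
    return [(x[a], make_interp_spline(x[a:b], y[a:b], k=min(k, b - a - 1)))
            for a, b in zip(bounds[:-1], bounds[1:])]

def evaluate(segments, xq):
    # Each query point uses the segment that starts at or before it,
    # so the jump itself is never smoothed over.
    starts = np.array([lo for lo, _ in segments])
    idx = np.searchsorted(starts, xq, side="right") - 1
    out = np.empty_like(xq, dtype=float)
    for i, (_, spl) in enumerate(segments):
        out[idx == i] = spl(xq[idx == i])
    return out

# A smooth sine with one jump of height 2 at x = 0.5
x = np.linspace(0.0, 1.0, 41)
y = np.sin(2 * np.pi * x) + (x >= 0.5) * 2.0
segments = fit_segments(x, y, detect_jumps(y, thresh=1.0))
yq = evaluate(segments, np.linspace(0.0, 1.0, 201))
```

Because each segment is fit separately, the interpolant stays sharp at the discontinuity instead of ringing across it, which is the behavior a single global spline would exhibit.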
- Date Issued
- 2016
- Identifier
- CFE0006446, ucf:51451
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006446
- Title
- Computerized Evaluation of Microsurgery Skills Training.
- Creator
-
Jotwani, Payal, Foroosh, Hassan, Hughes, Charles, Hua, Kien, University of Central Florida
- Abstract / Description
-
The style of imparting medical training has evolved over the years. The traditional methods of teaching and practicing basic surgical skills under the apprenticeship model no longer occupy the first place in modern, technically demanding advanced surgical disciplines like neurosurgery. Furthermore, legal and ethical concerns for patient safety, as well as cost-effectiveness, have forced neurosurgeons to master the necessary microsurgical techniques to accomplish desired results. This has led to increased emphasis on the assessment of the clinical and surgical techniques of neurosurgeons. However, subjective assessment of microsurgical techniques like micro-suturing under the apprenticeship model cannot be completely unbiased. A few initiatives using computer-based techniques have been made to introduce objective evaluation of surgical skills. This thesis presents a novel approach involving computerized evaluation of different components of micro-suturing techniques, to eliminate the bias of subjective assessment. The work involved acquisition of cine clips of micro-suturing activity on synthetic material. Image processing and computer vision techniques were then applied to these videos to assess different characteristics of micro-suturing, viz. speed, dexterity, and effectualness. In parallel, subjective grading was done by a senior neurosurgeon. A correlation and comparative study of both assessments was then carried out to analyze the efficacy of objective versus subjective evaluation.
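As a hedged illustration of the kind of objective motion metrics such a system might compute from tracked instrument-tip positions (the abstract does not specify its features at this level of detail, so everything below is a hypothetical sketch, not the thesis's method):

```python
import numpy as np

def motion_metrics(pts, fps=30.0):
    """Speed and path-efficiency metrics from tracked instrument-tip positions.

    pts: (T, 2) pixel coordinates of the tip in successive video frames.
    """
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)   # per-frame displacement
    path_len = steps.sum()
    direct = np.linalg.norm(pts[-1] - pts[0])
    return {
        "mean_speed": steps.mean() * fps,                   # pixels per second
        "path_efficiency": direct / max(path_len, 1e-9),    # 1.0 = perfectly direct
    }

# Compare a straight move with a wavering move between the same endpoints
t = np.linspace(0.0, 1.0, 31)
straight = np.c_[100 * t, np.zeros_like(t)]
wavering = np.c_[100 * t, 10 * np.sin(8 * np.pi * t)]
m_straight = motion_metrics(straight)
m_wavering = motion_metrics(wavering)
```

A steadier hand yields a higher path efficiency for the same start and end points, which is one plausible proxy for the "dexterity" component described above.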
- Date Issued
- 2015
- Identifier
- CFE0006221, ucf:51056
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006221
- Title
- The Hermeneutics of the Hard Drive: Using Narratology, Natural Language Processing, and Knowledge Management to Improve the Effectiveness of the Digital Forensic Process.
- Creator
-
Pollitt, Mark, Applen, John, Bowdon, Melody, Dombrowski, Paul, Craiger, John, University of Central Florida
- Abstract / Description
-
In order to protect the safety of our citizens and to ensure a civil society, we ask our law enforcement, judiciary, and intelligence agencies, under the rule of law, to seek probative information which can be acted upon for the common good. This information may be used in court to prosecute criminals, or it can be used to conduct offensive or defensive operations to protect our national security. As the citizens of the world store more and more information in digital form, and as they live an ever-greater portion of their lives online, law enforcement, the judiciary, and the Intelligence Community will continue to struggle with finding, extracting, and understanding the data stored on computers. But this trend also affords greater opportunity for law enforcement. This dissertation describes how several disparate approaches (knowledge management, content analysis, narratology, and natural language processing) can be combined in an interdisciplinary way to address the growing difficulty of developing useful, actionable intelligence from the ever-increasing corpus of digital evidence. After exploring how these techniques might apply to the digital forensic process, I will suggest two new theoretical constructs, the Hermeneutic Theory of Digital Forensics and the Narrative Theory of Digital Forensics, linking existing theories of forensic science, knowledge management, content analysis, narratology, and natural language processing together in order to identify and extract narratives from digital evidence. An experimental approach will be described and prototyped. The results of these experiments demonstrate the potential of natural language processing techniques in digital forensics.
- Date Issued
- 2013
- Identifier
- CFE0005112, ucf:50749
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005112
- Title
- An Adhesive Vinyl-Acrylic Electrolyte and Electrode Binder for Lithium Batteries.
- Creator
-
Tran, Binh, Zhai, Lei, Zou, Shengli, Kuebler, Stephen, Hernandez, Florencio, Gesquiere, Andre, University of Central Florida
- Abstract / Description
-
This dissertation describes a new vinyl-acrylic copolymer that displays great potential for applications in lithium-ion batteries by enabling novel, faster, safer, and cost-effective processes. Understanding the chemistry of the materials and processes related to battery manufacturing allows the design of techniques and methods that can ultimately improve the performance of existing batteries while reducing cost. The first and second parts of this dissertation focus on the free radical polymerization of poly(ethylene glycol) methyl ether methacrylate (PEGMA), methyl methacrylate (MMA), and isobutyl vinyl ether (IBVE) monomers to afford a vinyl-acrylic poly(PEGMA-co-MMA-co-IBVE) random copolymer, and on the investigation of its properties as a soluble, amorphous, and adhesive electrolyte that is able to permanently hold 800 times its own weight. These material properties point toward a printable battery manufacturing procedure, since existing electrolytes lack adhesion at the single-macromolecule level. An electrolyte can also be used as an electrode binder so long as it has structural integrity and allows ion transfer to and from the active electrode material during insertion/extraction processes. In the third section, the use of this electrolyte as a water-soluble binder for the aqueous fabrication of LiCoO2 cathodes is presented. The results of this study demonstrated the first aqueous-process fabrication of thick, flexible, and fully compressed lithium-ion battery electrodes, using commercial nickel foam as a supporting current collector. These properties are far superior to those of other aqueous binders in terms of material loading per electrode, specific area capacity, durability, and cell resistance. Finally, the fourth section expands on this concept by using the poly(PEGMA-co-MMA-co-IBVE) copolymer for the aqueous fabrication of a low-voltage Li4Ti5O12 anode-type electrode. Altogether, the results demonstrate as a proof of concept that switching the current toxic manufacturing of lithium-ion batteries to an aqueous process is highly feasible. Furthermore, new electrode manufacturing techniques are also deemed possible.
- Date Issued
- 2013
- Identifier
- CFE0004761, ucf:49780
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004761
- Title
- Ultrafast Laser Material Processing For Photonic Applications.
- Creator
-
Ramme, Mark, Richardson, Martin, Fathpour, Sasan, Sundaram, Kalpathy, Kar, Aravinda, University of Central Florida
- Abstract / Description
-
Femtosecond Laser Direct Writing (FLDW) is a viable technique for producing photonic devices in bulk materials. This novel manufacturing technique is versatile due to its full 3D fabrication capability. Typically, the only requirement for this process is that the base material must be transparent at the laser wavelength. The modification process itself is based on non-linear absorption of laser energy within the focal volume of the incident beam. This thesis addresses the feasibility of this technique for introducing photonic structures into novel dielectric materials. Additionally, this work provides a deeper understanding of the light-matter interaction mechanisms occurring at high pulse repetition rates. A novel structure on the sample surface, in the form of nano-fibers, was observed when the bulk material was irradiated with high-repetition-rate pulse trains. To exploit the advantages of the FLDW technique even further, a transfer of the technology from dielectric to semiconductor materials is investigated. However, this demands detailed insight into the absorption and modification processes themselves. Experiments and their results suggest that non-linear absorption, specifically avalanche ionization, is the limiting factor inhibiting the application of FLDW to bulk semiconductors with today's laser sources.
- Date Issued
- 2013
- Identifier
- CFE0004914, ucf:49626
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004914
- Title
- Mode-Division Multiplexed Transmission in Few-mode Fibers.
- Creator
-
Bai, Neng, Li, Guifang, Christodoulides, Demetrios, Schulzgen, Axel, Abouraddy, Ayman, Phillips, Ronald, Ip, Ezra, University of Central Florida
- Abstract / Description
-
As a promising candidate to break the single-mode fiber capacity limit, mode-division multiplexing (MDM) exploits the spatial dimension to increase transmission capacity in fiber-optic communication. Two linear impairments, namely loss and multimode interference, present fundamental challenges to implementing MDM. In this dissertation, techniques to resolve these two issues are presented. To de-multiplex signals subject to multimode interference in MDM, Multiple-Input-Multiple-Output (MIMO) processing using adaptive frequency-domain equalization (FDE) is proposed and investigated. Both simulations and experiments validate that FDE can reduce algorithmic complexity significantly in comparison with conventional time-domain equalization (TDE), while achieving similar performance. To further improve the performance of FDE, two modifications of the traditional FDE algorithm are demonstrated: i) normalized adaptive FDE is applied to increase the convergence speed by a factor of 5; ii) master-slave carrier recovery is proposed to reduce the algorithmic complexity of phase estimation by a factor equal to the number of modes. Although FDE can reduce the computational complexity of MIMO processing, due to the large mode group delay (MGD) of the few-mode fiber (FMF) link and the block processing involved, the algorithm still requires substantial memory and high hardware complexity. In order to reduce the required tap length (RTL) of the equalizer, differential mode group delay compensated (DMGDC) fiber has been proposed. In this dissertation, an analytical expression for the RTL is derived for DMGDC systems under the weak mode coupling assumption. Instead of depending on the overall MGD of the link, as in DMGD-uncompensated (DMGDUC) systems, the RTL of DMGDC systems depends on the MGD of a single DMGDC fiber section. The theoretical and numerical results suggest that by using a small compensation step size, the RTL of a DMGDC link can be reduced by two orders of magnitude compared to a DMGDUC link. To compensate for the loss of different modes, multimode EDFAs with re-configurable multimode pumps are presented. By tuning the mode content of the multimode pump, mode-dependent gain (MDG) can be controlled and equalized. A prototype FM-EDFA that supports 2 LP modes was constructed. The experimental results show that by using higher-order mode pumps, the modal gain difference can be reduced. By applying both the multimode EDFA and the equalization techniques, 26.4 Tb/s MDM-WDM transmission was successfully demonstrated. A brief summary and several possible future research directions conclude this dissertation.
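The complexity advantage of frequency-domain equalization comes from replacing a long time-domain convolution with one small matrix problem per FFT bin. The toy numpy sketch below illustrates that per-bin structure with a zero-forcing equalizer for a known, noiseless 2x2 circular channel; the dissertation's adaptive FDE instead learns its taps from data, and the channel model here is purely illustrative.

```python
import numpy as np

def mimo_fde_zero_forcing(rx, H_f):
    """Zero-forcing FDE: invert one small MIMO matrix per frequency bin.

    rx:  (N, L) received time-domain block for N modes (circular channel assumed)
    H_f: (L, N, N) channel frequency response, one N x N matrix per FFT bin
    """
    R_f = np.fft.fft(rx, axis=1).T[:, :, None]   # (L, N, 1) per-bin receive vectors
    X_f = np.linalg.solve(H_f, R_f)[:, :, 0].T   # batched N x N solve, one per bin
    return np.fft.ifft(X_f, axis=1)

rng = np.random.default_rng(0)
L, N = 64, 2                                     # block length, number of modes
tx = rng.standard_normal((N, L))

# Toy well-conditioned 4-tap MIMO impulse response: strong direct path, weak coupling
h = np.zeros((L, N, N))
h[0] = 2.0 * np.eye(N)
for t in (1, 2, 3):
    h[t] = 0.1 * rng.standard_normal((N, N))
H_f = np.fft.fft(h, axis=0)

# Apply the circular channel in the frequency domain, then equalize
rx = np.fft.ifft(np.einsum('fij,jf->if', H_f, np.fft.fft(tx, axis=1)), axis=1)
eq = mimo_fde_zero_forcing(rx, H_f)
```

Per block, the equalizer costs one FFT pair plus L small N x N solves, rather than a convolution whose length grows with the link's overall mode group delay, which is the scaling argument behind FDE's complexity savings.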
- Date Issued
- 2013
- Identifier
- CFE0004811, ucf:49751
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004811
- Title
- Distribution of Laser Induced Heating in Multi-Component Chalcogenide Glass and its Associated Effects.
- Creator
-
Sisken, Laura, Richardson, Kathleen, Richardson, Martin, Shah, Lawrence, University of Central Florida
- Abstract / Description
-
Chalcogenide glasses are well known to have good transparency into the infrared spectrum. These glasses, though, tend to have lower thresholds than oxide glasses for photo-induced and thermally-induced changes. Material modifications such as photo-induced darkening, bleaching, refractive index change, densification or expansion, ablation, and crystallization have been demonstrated, and are typically induced by a furnace-based heat treatment, an optical source such as a laser, or a combination of photo-thermal interactions. Laser-based heating alone has an advantage over a furnace, since it can spatially modify the material's properties with much greater precision by moving either the beam or the sample. The main properties of chalcogenide (ChG) glasses investigated in this study were the light-induced and thermally-induced modifications of the glass, examined through visible microscopy, white light interferometry, and Raman spectroscopy. Additionally, computational models were developed to help determine the temperature rise expected under the conditions used in the experiments. It was seen that ablation, photo-expansion, crystallization, and melting could occur for some of the irradiation conditions used. The above-bandgap-energy simulations appeared to overestimate the maximum temperature that should have been reached in the sample, while the below-bandgap-energy simulations appeared to underestimate it. Ultimately, this work lays the groundwork for predicting and controlling dose, and therefore heating, to induce localized crystallization and phase change.
- Date Issued
- 2014
- Identifier
- CFE0005261, ucf:50606
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005261
- Title
- Phonological Working Memory Deficits in ADHD Revisited: The Role of Lower-Level Information Processing Deficits in Impaired Working Memory Performance.
- Creator
-
Raiker, Joseph, Rapport, Mark, Beidel, Deborah, Mouloua, Mustapha, Vasquez, Eleazar, University of Central Florida
- Abstract / Description
-
Working memory deficits in children with ADHD are well established; however, insufficient evidence exists concerning the degree to which lower-level cognitive processes contribute to these deficits. The current study dissociates lower-level information processing abilities (i.e., visual registration, orthographic conversion, and response output) in children with ADHD and typically developing children, and examines the unique contribution of these processes to their phonological working memory performance. Thirty-four boys between 8 and 12 years of age (20 with ADHD, 14 typically developing) were administered novel information processing and phonological working memory tasks. Between-group differences were examined, and bootstrap mediation analysis was used to evaluate the mediating effect of information processing deficits on phonological working memory performance. Results revealed moderate- to large-magnitude deficits in visual registration and encoding, orthographic-to-phonological conversion, and phonological working memory in children with ADHD. Subsequent mediation analyses, however, revealed that visual registration/encoding alone mediated the relationship between diagnostic group status and phonological working memory, accounting for approximately 32% of the variance in children's phonological working memory performance. Diagnostic and treatment implications for understanding the complex interplay among multiple cognitive deficits in children with ADHD are discussed.
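The bootstrap mediation analysis mentioned above has a simple generic form: the indirect effect is the product of the x-to-mediator path (a) and the mediator-to-outcome path controlling for x (b), and its confidence interval comes from resampling. The Python sketch below uses simulated data and hypothetical variable names and effect sizes; it is not the study's actual model or data.

```python
import numpy as np

def bootstrap_indirect_effect(x, m, y, n_boot=2000, seed=0):
    """Estimate the indirect effect a*b of x on y through mediator m,
    with a 95% percentile bootstrap confidence interval."""
    rng = np.random.default_rng(seed)
    n = len(x)

    def ab(idx):
        xs, ms, ys = x[idx], m[idx], y[idx]
        a = np.polyfit(xs, ms, 1)[0]                   # path a: x -> m
        X = np.c_[xs, ms, np.ones(len(xs))]
        b = np.linalg.lstsq(X, ys, rcond=None)[0][1]   # path b: m -> y, controlling for x
        return a * b

    boots = np.array([ab(rng.integers(0, n, n)) for _ in range(n_boot)])
    return ab(np.arange(n)), np.percentile(boots, [2.5, 97.5])

# Simulated data: group status -> processing speed (mediator) -> working memory score
rng = np.random.default_rng(1)
group = rng.integers(0, 2, 200).astype(float)
speed = 1.5 * group + rng.standard_normal(200)
memory = 2.0 * speed + 0.2 * group + rng.standard_normal(200)
est, ci = bootstrap_indirect_effect(group, speed, memory)
```

Mediation is inferred when the bootstrap confidence interval for a*b excludes zero, which is the logic behind the study's conclusion that visual registration/encoding mediates the group-to-working-memory relationship.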
- Date Issued
- 2014
- Identifier
- CFE0005694, ucf:50141
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005694
- Title
- Batch and Online Implicit Weighted Gaussian Processes for Robust Novelty Detection.
- Creator
-
Ramirez Padron, Ruben, Gonzalez, Avelino, Georgiopoulos, Michael, Stanley, Kenneth, Mederos, Boris, Wang, Chung-Ching, University of Central Florida
- Abstract / Description
-
This dissertation aims mainly at obtaining robust variants of Gaussian processes (GPs) that do not require non-Gaussian likelihoods to compensate for outliers in the training data. Bayesian kernel methods, and in particular GPs, have been used to solve a variety of machine learning problems, equating or exceeding the performance of other successful techniques. That is the case for a recently proposed approach to GP-based novelty detection that uses standard GPs (i.e., GPs employing Gaussian likelihoods). However, standard GPs are sensitive to outliers in training data, and this limitation carries over to GP-based novelty detection. This limitation has typically been addressed by using robust non-Gaussian likelihoods. However, non-Gaussian likelihoods lead to analytically intractable inferences, which require approximation techniques that are typically complex and computationally expensive. Inspired by the use of weights in quasi-robust statistics, this work introduces a particular type of weight function, called here a data weigher, in order to obtain robust GPs that do not require approximation techniques and retain the simplicity of standard GPs. This work proposes implicit weighted variants of batch GP, online GP, and sparse online GP (SOGP) that employ weighted Gaussian likelihoods. Mathematical expressions for calculating the posterior implicit weighted GPs are derived in this work. In our experiments, novelty detection based on our weighted batch GPs consistently and significantly outperformed standard batch GP-based novelty detection whenever the data was contaminated with outliers. Additionally, our experiments show that novelty detection based on online GPs can perform similarly to batch GP-based novelty detection. Membership scores previously introduced by other authors are also compared in our experiments.
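A minimal sketch of the weighted-Gaussian-likelihood idea (not the dissertation's exact formulation): giving training point i a weight w_i is equivalent to inflating its noise variance to noise/w_i in the standard GP posterior, so a near-zero weight effectively removes an outlier while keeping the closed-form Gaussian machinery. The kernel, names, and data below are illustrative.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel between two 1-D point sets
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def weighted_gp_mean(x, y, xq, w, noise=0.1):
    # Weighted Gaussian likelihood: point i gets noise variance noise / w_i,
    # so w_i -> 0 pushes its influence on the posterior toward zero.
    K = rbf(x, x) + np.diag(noise / w)
    alpha = np.linalg.solve(K, y)
    return rbf(xq, x) @ alpha

x = np.linspace(0.0, 5.0, 25)
y = np.sin(x)
y[10] += 5.0                        # inject a single outlier
w_robust = np.ones_like(x)
w_robust[10] = 1e-3                 # the "data weigher" downweights it
xq = np.array([2.0])
pred_robust = weighted_gp_mean(x, y, xq, w_robust)
pred_standard = weighted_gp_mean(x, y, xq, np.ones_like(x))
```

The robust prediction stays close to the clean underlying function near the outlier, while the unweighted standard GP is pulled toward it, which is the sensitivity the dissertation targets.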
- Date Issued
- 2015
- Identifier
- CFE0005869, ucf:50858
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005869
- Title
- HAS THE PENDULUM SWUNG TOO FAR?: A LEGAL EVALUATION OF FLORIDA'S CHILD ABUSE AND NEGLECT REGISTRY.
- Creator
-
Debler, Julianna, Naccarato-Fromang, Gina, University of Central Florida
- Abstract / Description
-
Over the past several years, increasing public emphasis on preventing child maltreatment has resulted in substantial changes to Florida's child abuse and neglect central registry. Many of these recent changes, aimed at preventing child maltreatment, have resulted in over one million false, unsubstantiated, and inconclusive reports of child abuse and neglect within the last decade. While the information held in reports may be useful for identifying and preventing potential child abuse or neglect, due process concerns have been raised with regard to the practice of placing a person's name in a report without providing a hearing for challenging or removing inaccurate information. Focusing on Florida law, this research concentrates on: 1) the child maltreatment reporting process, 2) the procedures for maintaining reports, and 3) the accessibility of these reports, in order to determine whether due process constitutional rights are protected under Florida's child abuse and neglect reporting laws. The intent of this thesis is to analyze the occurrence of unsubstantiated cases of child maltreatment, incidences of false reporting, and the legal remedies available to those wrongfully accused of abusing or neglecting a child. Through analysis of case law, federal and state statutes, available statistics, child abuse resources, and personal interviews with members of the Florida Legislature, the evidence shows that due process constitutional rights are not protected under Florida's child abuse and neglect reporting laws. By raising awareness of the areas of child protection that require legal re-evaluation, this thesis aims to find the balance between protecting children from harm and protecting adults from the severe ramifications of false and improper allegations of child abuse and neglect.
- Date Issued
- 2012
- Identifier
- CFH0004267, ucf:44944
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004267
- Title
- AN ART TEACHER'S GUIDE TO A COGNITIVE TEACHING PROCESS: PROMPTING STUDENT'S CREATIVE THOUGHT.
- Creator
-
Warskow, Kristen, Brewer, Thomas, University of Central Florida
- Abstract / Description
-
This paper seeks to further explore the stages an artist moves through, stages that can be applied to teaching art and to helping students understand how to access their creativity. This project involved observation and an auto-ethnographic approach in order to best determine the stages artists naturally move through when creating art. In order to most effectively suggest a teachable creative process for secondary art students, this paper further explores cognitive and disciplinary categories in art education by applying principles and stages to a curricular guide (or lesson plans) for secondary art educators. Topics and studies of design thinking, creative inquiry, studio habits, creative processes, the National Assessment of Educational Progress (NAEP, 2008), and the National Core Arts Standards are reviewed and expanded upon in this paper. Using these inputs, a series of four recursive, creative stages was observed and applied to teaching art at the secondary (6th-12th grade) levels.
- Date Issued
- 2014
- Identifier
- CFH0004692, ucf:45238
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004692
- Title
- Simulation, Analysis, and Optimization of Heterogeneous CPU-GPU Systems.
- Creator
-
Giles, Christopher, Heinrich, Mark, Ewetz, Rickard, Lin, Mingjie, Pattanaik, Sumanta, Flitsiyan, Elena, University of Central Florida
- Abstract / Description
-
Show moreWith the computing industry's recent adoption of the Heterogeneous System Architecture (HSA) standard, we have seen a rapid change in heterogeneous CPU-GPU processor designs. State-of-the-art heterogeneous CPU-GPU processors tightly integrate multicore CPUs and multi-compute unit GPUs together on a single die. This brings the MIMD processing capabilities of the CPU and the SIMD processing capabilities of the GPU together into a single cohesive package with new HSA features comprising better programmability, coherency between the CPU and GPU, shared Last Level Cache (LLC), and shared virtual memory address spaces. These advancements can potentially bring marked gains in heterogeneous processor performance and have piqued the interest of researchers who wish to unlock these potential performance gains. Therefore, in this dissertation I explore the heterogeneous CPU-GPU processor and application design space with the goal of answering interesting research questions, such as, (1) what are the architectural design trade-offs in heterogeneous CPU-GPU processors and (2) how do we best maximize heterogeneous CPU-GPU application performance on a given system. To enable my exploration of the heterogeneous CPU-GPU design space, I introduce a novel discrete event-driven simulation library called KnightSim and a novel computer architectural simulator called M2S-CGM. M2S-CGM includes all of the simulation elements necessary to simulate coherent execution between a CPU and GPU with shared LLC and shared virtual memory address spaces. I then utilize M2S-CGM for the conduct of three architectural studies. First, I study the architectural effects of shared LLC and CPU-GPU coherence on the overall performance of non-collaborative GPU-only applications. Second, I profile and analyze a set of collaborative CPU-GPU applications to determine how to best optimize them for maximum collaborative performance. 
Third, I study the impact of four key architectural parameters on collaborative CPU-GPU performance by varying GPU compute unit coalesce size, GPU-to-memory-controller bandwidth, GPU frequency, and system-wide switching fabric latency.
- Date Issued
- 2019
- Identifier
- CFE0007807, ucf:52346
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007807
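The abstract above describes KnightSim as a discrete event-driven simulation library. A minimal sketch of the core of such an engine — a timestamp-ordered event queue driving callbacks — is shown below. This is purely illustrative; the class and method names are assumptions and do not reflect KnightSim's actual API.

```python
import heapq

# Minimal discrete event-driven simulation core (illustrative sketch only;
# not the real KnightSim interface).
class Engine:
    def __init__(self):
        self.now = 0
        self._queue = []   # min-heap of (time, seq, callback)
        self._seq = 0      # tie-breaker keeps equal-time events FIFO

    def schedule(self, delay, callback):
        # Schedule a callback to fire `delay` time units from the current time.
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self):
        # Pop events in timestamp order, advancing simulated time as we go.
        while self._queue:
            self.now, _, cb = heapq.heappop(self._queue)
            cb()

log = []
eng = Engine()
eng.schedule(5, lambda: log.append(("mem_access_done", eng.now)))
eng.schedule(1, lambda: log.append(("cpu_issue", eng.now)))
eng.run()
print(log)  # events fire in timestamp order, not insertion order
```

The heap ordering is what lets an architectural simulator interleave CPU and GPU events on one simulated clock regardless of the order they were scheduled in.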
- Title
- On the security of NoSQL cloud database services.
- Creator
-
Ahmadian, Mohammad, Marinescu, Dan, Wocjan, Pawel, Heinrich, Mark, Brennan, Joseph, University of Central Florida
- Abstract / Description
-
Processing the vast volume of data generated by web, mobile, and Internet-enabled devices necessitates a scalable and flexible data management system. Database-as-a-Service (DBaaS) is a new cloud computing paradigm promising cost-effective, scalable, fully-managed database functionality that meets the requirements of online data processing. Although DBaaS offers many benefits, it also introduces new threats and vulnerabilities. While many traditional data processing threats remain, DBaaS introduces new challenges such as confidentiality violation and information leakage in the presence of privileged malicious insiders, and adds a new dimension to data security. We address the problem of building a secure DBaaS on a public cloud infrastructure where the Cloud Service Provider (CSP) is not completely trusted by the data owner. We present a high-level description of several architectures combining modern cryptographic primitives to achieve this goal. A novel searchable security scheme is proposed to support secure query processing in the presence of a malicious cloud insider without disclosing sensitive information. A holistic database security scheme, comprising data confidentiality and information leakage prevention, is proposed in this dissertation. The main contributions of our work are: (i) a searchable security scheme for the non-relational databases of the cloud DBaaS; and (ii) leakage minimization in the untrusted cloud. The analysis of experiments that employ a set of established cryptographic techniques to protect databases and minimize information leakage shows that the performance of the proposed solution is bounded by communication cost rather than by cryptographic computational effort.
- Date Issued
- 2017
- Identifier
- CFE0006848, ucf:51777
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006848
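To make the idea of searchable security concrete: one common building block is a keyword index over deterministic HMAC tokens, so the server can match an authorized query token without ever seeing plaintext keywords. The sketch below illustrates that general technique only — the key, names, and scheme are assumptions, not the dissertation's actual construction.

```python
import hashlib
import hmac

# Hypothetical key held by the data owner, never shared with the cloud provider.
SECRET_KEY = b"data-owner-secret"

def token(keyword: str) -> str:
    # Deterministic keyword token: equal keywords map to equal tokens,
    # so the server can match queries without learning the keyword itself.
    return hmac.new(SECRET_KEY, keyword.encode(), hashlib.sha256).hexdigest()

# Owner-side indexing: document id -> set of keyword tokens (built before upload).
index = {
    "doc1": {token("cloud"), token("security")},
    "doc2": {token("database")},
}

def search(query_token: str):
    # Server-side matching over tokens only; a curious insider sees
    # opaque hex strings, not keywords.
    return [doc for doc, toks in index.items() if query_token in toks]

print(search(token("cloud")))    # matches doc1
print(search(token("missing")))  # no match
```

Deterministic tokens do leak access and equality patterns, which is exactly the kind of residual leakage the dissertation's minimization contribution targets.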
- Title
- Mahalanobis kernel-based support vector data description for detection of large shifts in mean vector.
- Creator
-
Nguyen, Vu, Maboudou, Edgard, Nickerson, David, Schott, James, University of Central Florida
- Abstract / Description
-
Statistical process control (SPC) applies the science of statistics to process control in order to provide higher-quality products and better services. The K chart is one of the many important tools that SPC offers. The K chart is based on Support Vector Data Description (SVDD), a popular data classification method inspired by the Support Vector Machine (SVM). Like any method associated with SVM, SVDD benefits from a wide variety of kernel choices, which determine the effectiveness of the whole model. Among the most popular choices is the Euclidean distance-based Gaussian kernel, which enables SVDD to obtain a flexible data description and thus enhances its overall predictive capability. This thesis explores an even more robust approach by incorporating the Mahalanobis distance-based kernel (hereinafter referred to as the Mahalanobis kernel) into SVDD and compares it with SVDD using the traditional Gaussian kernel. The method's sensitivity is benchmarked by Average Run Lengths obtained from multiple Monte Carlo simulations. Data for these simulations are generated from multivariate normal, multivariate Student's t, and multivariate gamma populations using R, a popular software environment for statistical computing. One case study is also discussed, using a real data set received from the Halberg Chronobiology Center. Compared to the Gaussian kernel, the Mahalanobis kernel makes SVDD, and thus the K chart, significantly more sensitive to shifts in the mean vector and also in the covariance matrix.
- Date Issued
- 2015
- Identifier
- CFE0005676, ucf:50170
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005676
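The difference between the two kernels in the abstract above comes down to the distance inside the exponential: the Gaussian kernel uses squared Euclidean distance, while the Mahalanobis kernel replaces it with (x − y)ᵀ S⁻¹ (x − y), down-weighting directions of high variance. A minimal sketch, assuming a 2-D example with a hand-picked covariance matrix (illustrative only — not the thesis's implementation):

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    # Standard Euclidean Gaussian (RBF) kernel: exp(-||x - y||^2 / (2 sigma^2)).
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2 * sigma ** 2))

def mahalanobis_kernel(x, y, cov_inv, sigma=1.0):
    # Same form, but with squared Mahalanobis distance (x-y)^T S^{-1} (x-y),
    # so correlated / high-variance dimensions contribute less.
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    d2 = sum(d[i] * cov_inv[i][j] * d[j] for i in range(n) for j in range(n))
    return math.exp(-d2 / (2 * sigma ** 2))

# Inverse of the (assumed) covariance [[2, 0], [0, 0.5]] is [[0.5, 0], [0, 2]].
cov_inv = [[0.5, 0.0], [0.0, 2.0]]
x, y = (0.0, 0.0), (1.0, 1.0)
print(gaussian_kernel(x, y))              # squared distance 2.0 -> exp(-1)
print(mahalanobis_kernel(x, y, cov_inv))  # squared distance 2.5 -> exp(-1.25)
```

Because the second axis has low variance, the same Euclidean step counts as a larger Mahalanobis step there, which is precisely why the Mahalanobis kernel reacts more sharply to mean-vector shifts.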
- Title
- AN EXPLORATION OF SONG AS A STRATEGY TO ENGAGE ELEMENTARY STUDENTS DURING SOCIAL STUDIES LESSONS.
- Creator
-
Rome, Morgan, Jennings-Towle, Kelly, University of Central Florida
- Abstract / Description
-
The purpose of this thesis is to explore how curriculum-related songs provide an engaging atmosphere for elementary students learning social studies concepts. The investigation conducted for this thesis examines the resources available to teachers in terms of songs to be used for pedagogical engagement in social studies lessons. Through research and video analyses, it can be concluded that students are, overall, intrigued by the use of songs in their social studies lessons. During the social studies lessons observed in the video analyses, the elementary students are focused, exhibit positive body language, participate, and have fun. Since engagement is documented within the analyzed videos and supported by others' research as beneficial for students, this thesis researched and found a place for songs in elementary social studies lessons. Since there is a lack of current social studies resources that contain a musical element, eight social studies lesson plans were produced specifically for this thesis to demonstrate how songs can be implemented into the elementary curriculum to engage students.
- Date Issued
- 2018
- Identifier
- CFH2000302, ucf:45792
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH2000302
- Title
- THE INFLUENCE OF STUDENTS'COGNITIVE STYLE ON A STANDARDIZED READING TEST ADMINISTERED IN THREE DIFFERENT FORMATS.
- Creator
-
Blanton, Elizabeth Lynn, Kysilka, Marcella L., University of Central Florida
- Abstract / Description
-
The purpose of this study was to examine the mean scores on three forms of a standardized reading comprehension test taken by community college students in developmental reading classes. The three forms of the test were administered as a timed multiple-choice test, a constructed response test, and an un-timed multiple-choice test. Scores on the Group Embedded Figures Test (GEFT) were used to classify the students who participated in the study as having field dependent (LOW GEFT), mid-field dependent/independent (MID GEFT), or field independent (HIGH GEFT) tendencies. The paired samples test was used to analyze the scores among the students classified as LOW GEFT, MID GEFT, and HIGH GEFT for mean differences in scores on the three test formats. The data revealed that for LOW GEFT students, the format of the test impacted their scores, with the mean of the scores on the un-timed multiple-choice test being significantly higher than those on the timed multiple-choice test and the constructed response format. The data also showed that for the MID GEFT students, the mean of the scores for the un-timed multiple-choice test was significantly higher than the means for the timed multiple-choice and constructed response test scores. However, no significant mean difference was found between the timed multiple-choice test scores and the constructed response test scores. For the HIGH GEFT students, a significant mean difference existed only between the un-timed multiple-choice and the timed multiple-choice scores.
The means of the reading comprehension test scores on the three formats between the LOW GEFT, MID GEFT, and HIGH GEFT students indicated a significant mean difference between the timed multiple-choice test scores but not between the means of the scores for the constructed response and the un-timed multiple-choice tests. Demographically, when the means of the reading test scores were analyzed with ethnicity as the controlling variable, the Hispanic students had a significantly higher mean on the scores for the constructed response test format. No other significant mean differences were found between the scores of the African American, Caucasian, Hispanic, or Native American students. When the means of the reading test scores were analyzed with gender as the controlling variable, no significant mean difference was found between the reading comprehension scores of the men and women. This study indicated that cognitive style had more impact on students' performance on a standardized test of reading comprehension than did ethnicity or gender. The un-timed multiple-choice format also had an equalizing effect on the means of the scores for these students.
- Date Issued
- 2004
- Identifier
- CFE0000055, ucf:46085
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000055
- Title
- A FRAMEWORK ROADMAP FOR IMPLEMENTING LEAN SIX SIGMA IN LOCAL GOVERNMENTAL ENTITIES.
- Creator
-
Furterer, Sandra L., Elshennawy, Ahmad K., University of Central Florida
- Abstract / Description
-
Lean Six Sigma is an approach focused on improving quality, reducing variation, and eliminating waste in an organization. The concept of combining the principles and tools of Lean Enterprise and Six Sigma has appeared in the literature over the last several years. The majority of Lean Six Sigma applications have been in private industry, focusing mostly on manufacturing. The literature has not provided a framework for implementing Lean Six Sigma programs applied to local government. This research provides a framework roadmap for implementing Lean Six Sigma in local government. The Service Improvement for Transaction-based Entities Lean Six Sigma Framework Roadmap (SITE MAP) identifies the activities, principles, tools, and important component factors needed to implement Lean Six Sigma. The framework provides a synergistic approach to integrating the concepts and tools of Lean Enterprise and Six Sigma using the DMAIC (Define-Measure-Analyze-Improve-Control) problem-solving approach. A case study was used to validate the framework. Lean Six Sigma was successfully applied in a 7,000-citizen municipality to reduce the cycle time of the financial administrative processes in the Finance Department of the city government.
- Date Issued
- 2004
- Identifier
- CFE0000021, ucf:46067
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000021
- Title
- QUANTITATIVE ASSESSMENT OF SOFTWARE DEVELOPMENT PROJECT MANAGEMENT ISSUES USING PROCESS SIMULATION WITH SYSTEM DYNAMICS ELEMENTS.
- Creator
-
Mizell, Carolyn, Malone, Linda, University of Central Florida
- Abstract / Description
-
The complexity of software development projects makes estimation and management very difficult. There is a need for improved cost estimation methods and for models of lifecycle processes other than the common waterfall process. This work developed a new simulation model of the spiral development lifecycle as well as an approach for using simulation for cost and schedule estimation. The goal is to provide a tool that can analyze the effects of a spiral development process, as well as a tool that illustrates the difficulties management faces in forecasting budgets at the beginning of a project, which may encourage more realistic approaches to budgetary planning. A new discrete event process model of the incremental spiral development lifecycle approach was developed in order to analyze the effects this development approach has on the estimation process as well as on a project's cost and schedule. The input data for the key variables of size, productivity, and defect injection rates in the model were based on analysis of Software Engineering Laboratory data and provided for analysis of the effects of uncertainty in early project estimates. The benefits of combining a separate system dynamics model with a discrete event process model were demonstrated, as were the effects of turnover on the cost and schedule of a project. This work includes a major case study of a cancelled NASA software development project that experienced cost and schedule problems throughout its history. Analysis was performed using stochastic simulation with derived probability distributions for key software development factors. A system dynamics model of human resource issues was also combined with the process model to more thoroughly analyze the effects of turnover on a project.
This research has demonstrated the benefits of using a simulation model when estimating, allowing for more realistic budget and schedule determination, including an interval estimate to help focus on the uncertainty of early estimates.
- Date Issued
- 2006
- Identifier
- CFE0001209, ucf:46939
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001209
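The abstract above pairs stochastic simulation with derived probability distributions to produce interval (rather than point) cost estimates. The sketch below illustrates that general idea with made-up triangular distributions and placeholder cost figures — none of these parameters come from the dissertation's Software Engineering Laboratory data.

```python
import random

random.seed(7)  # reproducible illustration

def one_project_cost():
    # All distributions and constants here are assumed placeholders.
    size_kloc = random.triangular(40, 120, 70)       # project size in KLOC
    productivity = random.triangular(1.5, 3.5, 2.5)  # KLOC per person-month
    cost_per_pm = 15_000                             # dollars per person-month
    return size_kloc / productivity * cost_per_pm

def interval_estimate(n=10_000, lo_q=0.10, hi_q=0.90):
    # Run n Monte Carlo trials and report an 80% interval, making the
    # uncertainty of an early estimate explicit instead of hiding it
    # behind a single number.
    costs = sorted(one_project_cost() for _ in range(n))
    return costs[int(lo_q * n)], costs[int(hi_q * n)]

lo, hi = interval_estimate()
print(f"80% cost interval: ${lo:,.0f} - ${hi:,.0f}")
```

Reporting the interval rather than its midpoint is the point of the exercise: it shows management how wide the plausible budget range is at the start of a project.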
- Title
- THE BRAGGART SOLDIER: AN ARCHETYPAL CHARACTER FOUND IN "SUNDAY IN THE PARK WITH GEORGE".
- Creator
-
Gebb, Paul, Weaver, Earl, University of Central Florida
- Abstract / Description
-
In preparation for performance, an actor must develop an understanding of the character they portray. A character must be thoroughly researched to adequately enrich the performance of the actor. In preparation for the role of the "Soldier" in the production Sunday in the Park with George, it is important to examine the evolution of the "Braggart Soldier" archetypal character throughout the historical literary canon. It is of equal importance to study an author's canon of literature to acknowledge the recurring use of similar archetypal characters in order to successfully interpret the intentions of the author. This thesis paper will be divided into four main sections. First, research on the evolution of the "Braggart Soldier" archetypal character from Greek Theater to Contemporary Theater will help to define the character type. Second, historical production research associated with the musical's creation will provide deeper insight into the musical's inception. Sunday in the Park with George was based on the painting A Sunday on the Island of La Grande Jatte. Furthermore, a specific focus will be placed on the painting's creation, the background of the Soldier's inclusion in the painting, the musical's collaborative process, and critical responses to the original production. Third, research on four other Stephen Sondheim shows in which similar archetypal characters appear will demonstrate the author's utilization of the character type. The characters referenced from Sondheim's shows will be: Miles Gloriosus from A Funny Thing Happened on the Way to the Forum; Carl Magnus from A Little Night Music; the Princes from Into the Woods; and John Wilkes Booth from Assassins. By studying the scripts and scores of each of these shows, a pattern of character traits will be revealed to enlighten the actor's preparation for the role of the "Soldier" in Sunday in the Park with George.
Lastly, an understanding of the musical's overall structure and themes helps to further define the characterization revealed from script and score analysis. This thesis project will contribute to the pre-existing canon of musical theatre research but will also provide insight to non-musical actors who are researching similar archetypal characters. Musical theatre performers who are preparing for Stephen Sondheim shows can apply this research to help understand the role of this archetypal character in the context of each show.
- Date Issued
- 2007
- Identifier
- CFE0001598, ucf:47158
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001598
- Title
- DOUBLE DUTY: PROCESSING AND EXHIBITING THE CHILDREN'S HOME SOCIETY OF FLORIDA COLLECTION AS AN ARCHIVIST AND PUBLIC HISTORIAN.
- Creator
-
Anderson, April, White, Vibert, University of Central Florida
- Abstract / Description
-
The Children's Home Society of Florida, often referred to as "Florida's Greatest Charity", is the state's oldest nonprofit welfare agency. Founded in 1902, the society was instrumental in creating and reforming child welfare laws as well as helping countless children in the state of Florida find loving homes. This paper focuses on the archival processing of the Children's Home Society of Florida Collection papers and the creation of a subsequent web exhibit. The roles of archivist and public historian are examined to see how each profession works toward a common goal.
- Date Issued
- 2007
- Identifier
- CFE0001613, ucf:47181
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001613