- Title
- Digital Image Processing Using NTEC Facilities.
- Creator
-
Roesch, James F., Simons, Jr., Fred O., Engineering
- Abstract / Description
-
University of Central Florida College of Engineering Thesis; Digital image enhancement refers to the improvement of a given image for human interpretation. Digital image processing facilities are those in which hardware and software computing elements are combined in such a way as to enable the processing of digital images. This report describes the use of the Naval Training Equipment Center (NTEC) Computer Systems Laboratory computing facilities to enhance digital images. Described are two major hardware systems, the IKONAS RDS-3000 raster display graphics system and the VAX-11/780, and the digital image processing program (DIMPRP) written by the author. Digital image enhancement theory and practice are addressed through a discussion of the DIMPRP software. Finally, enhancements to the NTEC digital image processing facility such as improvements in hardware reliability, documentation, and increased speed of program execution are discussed.
- Date Issued
- 1984
- Identifier
- CFR0008160, ucf:53072
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0008160
- Title
- WITHOUT A CAMERA.
- Creator
-
Kulbaba, Brian, Robinson, E. Brady, University of Central Florida
- Abstract / Description
-
The method for creating my art is a matter of experimental process, manipulation of photographic elements, and time spent. I am a photographer in a digital age that does not use a camera. My moment of creativity occurs without the snap of a shutter, but relies on my understanding and control of the chemical components of photography. My work deconstructs the notion of duplication commonly found in photography. The procedure can be repeated but the results are variable. The process of creating my work often results in a multitude of prints, but the pieces that I select as art capture a number of instinctive characteristics which convey an emotion or message to me. When I present my photographs I offer the viewer an experience--an opportunity to see the work through my mind's eye as it makes sense to me. It is within this open dialogue that the work is complete: part process, part intuitive participation.
- Date Issued
- 2008
- Identifier
- CFE0002100, ucf:47554
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002100
- Title
- NONPARAMETRIC MULTIVARIATE STATISTICAL PROCESS CONTROL USING PRINCIPAL COMPONENT ANALYSIS AND SIMPLICIAL DEPTH.
- Creator
-
Beltran, Luis, Malone, Linda, University of Central Florida
- Abstract / Description
-
Although there has been progress in the area of Multivariate Statistical Process Control (MSPC), there are numerous limitations as well as unanswered questions with the current techniques. MSPC charts plotting Hotelling's T2 require the normality assumption for the joint distribution among the process variables, which is not feasible in many industrial settings, hence the motivation to investigate nonparametric techniques for multivariate data in quality control. In this research, the goal will be to create a systematic distribution-free approach by extending current developments and focusing on the dimensionality reduction using Principal Component Analysis. The proposed technique is different from current approaches given that it creates a nonparametric control chart using robust simplicial depth ranks of the first and last set of principal components to improve signal detection in multivariate quality control with no distributional assumptions. The proposed technique has the advantages of ease of use and robustness in MSPC for monitoring variability and correlation shifts. By making the approach simple to use in an industrial setting, the probability of adoption is enhanced. Improved MSPC can result in cost savings and improved quality.
- Date Issued
- 2006
- Identifier
- CFE0001065, ucf:46792
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001065
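The chart statistic described in the abstract above (simplicial depth computed on principal-component scores) can be sketched compactly. The following is an illustrative reconstruction under stated assumptions, not the author's implementation: `pca_scores` and `simplicial_depth` are hypothetical helper names, the depth is computed in two dimensions only, and NumPy is assumed available.

```python
import itertools
import numpy as np

def pca_scores(X, n_components=2):
    """Project observations onto their first principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def _side(a, b, p):
    # Sign of the cross product (b - a) x (p - a): which side of ab p lies on.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def simplicial_depth(point, sample):
    """Fraction of triangles formed by sample points (in R^2) that contain
    `point` -- a distribution-free measure of how central the point is."""
    def contains(a, b, c, p):
        s1, s2, s3 = _side(a, b, p), _side(b, c, p), _side(c, a, p)
        neg = s1 < 0 or s2 < 0 or s3 < 0
        pos = s1 > 0 or s2 > 0 or s3 > 0
        return not (neg and pos)
    tris = list(itertools.combinations(sample, 3))
    return sum(contains(a, b, c, point) for a, b, c in tris) / len(tris)

# In-control reference data: depth near the center of the score cloud is
# high, while a far-away (out-of-control) observation has depth near zero.
rng = np.random.default_rng(7)
reference = pca_scores(rng.normal(size=(40, 5)))
center_depth = simplicial_depth(reference.mean(axis=0), reference)
outlier_depth = simplicial_depth(np.array([8.0, 8.0]), reference)
```

A control chart built on this idea would plot the depth (or its rank among the reference depths) for each new observation and signal when it falls below a chosen quantile.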
- Title
- TEN STEP MANUFACTURING PROBLEM SOLVING PROCESS.
- Creator
-
Panahi, Afsoun, Gonzalez, Fernando, University of Central Florida
- Abstract / Description
-
The ten step problem solving process was created to capture and resolve issues that arise in designing, developing, manufacturing, and delivering a new vehicle product. These steps provide a common process, which effectively defines and resolves concerns and prevents their recurrence.
Step 1: Prepare for the process
Step 2: Establish team
Step 3: Describe the problem
Step 4: Develop short term containment action
Step 5: Define and verify root cause and escape point
Step 6: Choose and verify permanent corrective actions
Step 7: Implement and validate permanent corrective actions
Step 8: Prevent recurrence
Step 9: Recognize team and individual contributions
Step 10: Benchmarking
The ten step problem solving process is an enhancement to the 6-Sigma process currently used by many manufacturers. Consumer Driven 6-Sigma is a tool that significantly improves customer satisfaction and shareholder value by reducing variability in every aspect of the business. It builds on existing processes, provides additional tools, and offers a disciplined approach to focus on meeting customer expectations. 6-Sigma helps find where the variability is in a process, and then provides the tools to reduce variability and improve the process.
- Date Issued
- 2006
- Identifier
- CFE0000966, ucf:46707
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000966
- Title
- The generation of synthetic speech sounds by digital coding.
- Creator
-
Steinberger, Eddy Alan, Engineering
- Abstract / Description
-
Florida Technological University College of Engineering Thesis; The feasibility of representing human speech by serial digital codes was investigated by exercising specially constructed digital logic coupled with standard audio output equipment. The theories being tested represent a radical departure from previous efforts in the field of speech research. Therefore, this initial investigation was limited in scope to a study of unconnected English language speech sounds at the phoneme level.
- Date Issued
- 1975
- Identifier
- CFR0002823, ucf:52917
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0002823
- Title
- A Survey of Medical Diagnostic Software.
- Creator
-
Stein, Steven, Engineering
- Abstract / Description
-
Florida Technological University College of Engineering Thesis; The field of medical diagnostic software is reviewed to define its status in the medical profession. This is accomplished in a two-step procedure. The first step is a cross-section of literature on the topic, and the second step is a survey of physicians in a sample area. The cross-section of literature presents some of the more advanced studies which have been conducted on medical diagnostic software. Also presented is an explanation of the logic used in diagnostic software and the results of several test cases. Physicians in the Orlando, Florida area were surveyed to define the actual application of medical diagnostic software; the survey presented a sample of physicians' feelings concerning the present use of medical diagnostic software. From these steps, the present status of medical diagnostic software is defined and projections concerning its future are made.
- Date Issued
- 1973
- Identifier
- CFR0008127, ucf:52956
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0008127
- Title
- Complex-valued adaptive digital signal enhancement for applications in wireless communication systems.
- Creator
-
Liu, Ying, Mikhael, Wasfy, Batarseh, Issa, Yang, Thomas, Hunter, Matthew, Haralambous, Michael, Myers, Brent, University of Central Florida
- Abstract / Description
-
In recent decades, the wireless communication industry has attracted a great deal of research efforts to satisfy rigorous performance requirements and preserve high spectral efficiency. Along with this trend, I/Q modulation is frequently applied in modern wireless communications to develop high performance and high data rate systems. This has necessitated the need for applying efficient complex-valued signal processing techniques to highly-integrated, multi-standard receiver devices. In this dissertation, novel techniques for complex-valued digital signal enhancement are presented and analyzed for various applications in wireless communications. The first technique is a unified block processing approach to generate the complex-valued conjugate gradient Least Mean Square (LMS) techniques with optimal adaptations. The proposed algorithms exploit the concept of the complex conjugate gradients to find the orthogonal directions for updating the adaptive filter coefficients at each iteration. Along each orthogonal direction, the presented algorithms employ the complex Taylor series expansion to calculate time-varying convergence factors tailored for the adaptive filter coefficients. The performance of the developed technique is tested in the applications of channel estimation, channel equalization, and adaptive array beamforming. Compared with state-of-the-art methods, the proposed techniques demonstrate improved performance and exhibit desirable characteristics for practical use. The second complex-valued signal processing technique is a novel Optimal Block Adaptive algorithm based on Circularity, OBA-C. The proposed OBA-C method compensates for a complex imbalanced signal by restoring its circularity. In addition, by utilizing the complex Taylor series expansion, the OBA-C method optimally updates the adaptive filter coefficients at each iteration. This algorithm can be applied to mitigate the frequency-dependent I/Q mismatch effects in the analog front-end. Simulation results indicate that, compared with existing methods, OBA-C exhibits superior convergence speed while maintaining excellent accuracy. The third technique concerns interference rejection in communication systems. The research on both LMS and Independent Component Analysis (ICA) based techniques continues to receive significant attention in the area of interference cancellation. The performance of the LMS and ICA based approaches is studied for signals with different probabilistic distributions. Our research indicates that the ICA-based approach works better for super-Gaussian signals, while the LMS-based method is preferable for sub-Gaussian signals. Therefore, an appropriate choice of interference suppression algorithms can be made to satisfy the ever-increasing demand for better performance in modern receiver design.
- Date Issued
- 2012
- Identifier
- CFE0004572, ucf:49192
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004572
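The algorithms in the dissertation above build on the standard complex-valued LMS recursion. A minimal sketch of that baseline (not the author's conjugate-gradient or OBA-C variants) applied to channel estimation might look like this; the channel taps and signal lengths are invented for illustration, and NumPy is assumed.

```python
import numpy as np

def complex_lms(x, d, n_taps=4, mu=0.1):
    """Baseline complex LMS: adapt taps w so that w^H x_k tracks d_k."""
    w = np.zeros(n_taps, dtype=complex)
    err = np.zeros(len(d), dtype=complex)
    for k in range(n_taps - 1, len(d)):
        xk = x[k - n_taps + 1:k + 1][::-1]   # [x_k, x_{k-1}, ...]
        y = np.vdot(w, xk)                   # filter output w^H x_k
        err[k] = d[k] - y
        w += mu * np.conj(err[k]) * xk       # complex LMS tap update
    return w, err

# Identify an invented 4-tap channel from its noiseless input/output.
rng = np.random.default_rng(0)
h = np.array([0.8 + 0.2j, 0.4 - 0.1j, 0.2 + 0.05j, 0.1j])
x = (rng.normal(size=3000) + 1j * rng.normal(size=3000)) / np.sqrt(2)
d = np.convolve(x, h)[:len(x)]
w, err = complex_lms(x, d)
# np.conj(w) converges toward h; the residual error shrinks toward zero.
```

With `y = w^H x_k`, the Wiener solution satisfies `conj(w) = h`, which is why the conjugate of the converged taps is compared against the channel.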
- Title
- CAN GUIDED INQUIRY BASED LABS IMPROVE PERFORMANCE IN DATA ANALYSIS AND CONCLUSION SYNTHESIS IN SIXTH GRADE LIFE SCIENCE?.
- Creator
-
Moore, Melonie, Everett, Robert, University of Central Florida
- Abstract / Description
-
Desiring to examine the performance of science process skills such as data analysis and conclusion synthesis in sixth grade Life Science students, I used an inquiry strategy called "guided inquiry" in a series of six laboratory assignments during the normal county-mandated order of instruction for Life Science. I based my analysis upon these laboratory exercises, a survey of student attitudes towards science done before the study began and after the study was completed, an assessment of inquiry understanding done before and after the study was finished, routine material tests, and a science final class evaluation done after the study was finished. Emphasis was placed upon examining the content of the laboratory reports which required students to analyze their experiments and draw a conclusion based upon their findings. The study found that while most students did grasp the desired scientific principles the labs were designed to teach, they had difficulty in formulating a structured and detailed account of their experiences without guidance. The study helped to further the understanding of student performance and learning in science process skills such as data analysis and conclusion synthesis.
- Date Issued
- 2009
- Identifier
- CFE0002807, ucf:48123
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002807
- Title
- ANALYSIS OF KOLMOGOROV'S SUPERPOSITION THEOREM AND ITS IMPLEMENTATION IN APPLICATIONS WITH LOW AND HIGH DIMENSIONAL DATA.
- Creator
-
Bryant, Donald, Li, Xin, University of Central Florida
- Abstract / Description
-
In this dissertation, we analyze Kolmogorov's superposition theorem for high dimensions. Our main goal is to explore and demonstrate the feasibility of an accurate implementation of Kolmogorov's theorem. First, based on Lorentz's ideas, we provide a thorough discussion of the proof of the theorem and its numerical implementation in dimension two. We present computational experiments which prove the feasibility of the theorem in applications of low dimensions (namely, dimensions two and three). Next, we present high dimensional extensions with complete and detailed proofs and provide the implementation that aims at applications with high dimensionality. The amalgamation of these ideas is evidenced by applications in image (two dimensional) and video (three dimensional) representations, content-based image retrieval, video retrieval, de-noising and in-painting, and Bayesian prior estimation of high dimensional data from the fields of computer vision and image processing.
- Date Issued
- 2008
- Identifier
- CFE0002236, ucf:47909
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002236
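For reference, the superposition theorem analyzed in the dissertation above states that any continuous function on the unit cube admits an exact representation by single-variable functions. In the standard formulation:

```latex
f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \psi_{q,p}(x_p) \right)
```

where the inner functions \(\psi_{q,p}\) are continuous and independent of \(f\), and only the outer functions \(\Phi_q\) depend on \(f\); refinements due to Lorentz and Sprecher reduce the construction to a single outer function and shifted copies of one inner function.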
- Title
- Digital Image Processing by the Two-Dimensional Discrete Fourier Transform Method.
- Creator
-
Joels, Lyman F., Engineering
- Abstract / Description
-
Florida Technological University College of Engineering Thesis; The present study was conducted to ascertain undergraduate views about the effectiveness of International Teaching Assistants (ITAs) in the American classroom. The study was administered to a stratified cluster sampling by college of the target population, undergraduate students at the University of Central Florida, in Orlando, Florida. The instrument used, Questionnaire of Undergraduates about International Teaching Assistants (QUITA) as developed by Wanda Fox (1990), is composed of a total of 40 items regarding personal and academic background, cultural exposure to and views about non-native speakers of English, and ITA-classroom effectiveness and problem-solving strategies. On the basis of data from the Fall 1998 semester, approximately 15% of the total number of ITA-taught course sections per college were surveyed. The subjects responded anonymously using computerized answer sheets. Upon completion of the data collection phase, all surveys were analyzed for response frequencies. In addition, background and demographic information regarding the participants and information regarding undergraduate exposure to ITAs and ITA instruction were also summarized. The Likert-type items were combined to reveal an overall ATITA (Attitude toward International Teaching Assistants) score. The results of the ATITA portion of the study indicate that undergraduate student views toward ITAs and ITA instruction are between neutral and mildly positive. Furthermore, survey responses indicated that undergraduates resolve conflicts involving ITAs through personal means. The closing recommendations suggest maintaining open lines of communication between undergraduates, ITAs, and administrators alike.
- Date Issued
- 1973
- Identifier
- CFR0004782, ucf:52960
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0004782
- Title
- Speech Synthesis Utilizing Microcomputer Control.
- Creator
-
Uzel, Joseph N., Patz, Benjamin W., Engineering
- Abstract / Description
-
Florida Technological University College of Engineering Thesis; This report explores the subject of speech synthesis. Information given includes a brief explanation of speech production in man, an historical view of speech synthesis, and four types of electronic synthesizers in use today. Also included is a brief presentation on phonetics, the study of speech sounds. An understanding of this subject is necessary to see how a synthesizer must produce certain sounds, and how these sounds are put together to create words. Finally, a description of a limited text speech synthesizer is presented. This system allows the user to enter English text via a keyboard and have it output in spoken form. The future of speech synthesis appears to be very bright. This report also gives some possible applications of verbal computer communication.
- Date Issued
- 1978
- Identifier
- CFR0004781, ucf:52972
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0004781
- Title
- El efecto de la variación dialectal en el procesamiento.
- Creator
-
Schanze, Kirsten, Villegas, Alvaro, Nalbone, Lisa, Fernandez-Rubiera, Francisco, University of Central Florida
- Abstract / Description
-
Lexical variation, or the existence of multiple lexemes that can be used to denote a particular concept, is a phenomenon characteristic of most of the world's language systems. Oftentimes the source of this variation is difficult to determine, with a variety of inter- and intra-linguistic factors at play. This thesis was conducted with three main goals: 1) to delineate lexical items typical to specific dialects of Spanish and generate country-specific word lists that focus on salient contrasts between the different varieties of the language; 2) to determine whether speakers of particular varieties of Spanish, namely Puerto Rican and Venezuelan Spanish, were able to recognize lexical items that are supposedly characteristic of their dialect in particular; 3) to examine how dialectal variation can affect linguistic processing. The first part of this investigation examined the relative frequency of use of 1,903 dialectal words in the 22 countries contained within the Corpus de Referencia del Español Actual, or CREA (REAL ACADEMIA ESPAÑOLA, 2008). Of these 1,903 words, a total of 320 were found to be characteristic of a particular variety of Spanish. The lexical items that demonstrated significant correlation with Puerto Rican and Venezuelan Spanish were then used to develop a picture naming task in which participants were asked to designate whether a particular lexical item constituted an appropriate label for the image depicted. The results from this study suggest that speakers of these two dialects were unable to distinguish words as pertaining to their variety in particular, regardless of the supposed high frequency of use within their dialect. The present study thus theorizes that the processing of these dialectal lexical items is closer to monolingual models than to bilingual models, as bilingual-like behaviors were not observed.
- Date Issued
- 2016
- Identifier
- CFE0006176, ucf:51131
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006176
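The first goal in the thesis above (flagging words whose usage concentrates in one country) can be sketched as a toy frequency analysis. The corpus counts, words, and the `characteristic_words` helper below are invented for illustration and are not CREA data or the author's procedure.

```python
from collections import Counter

# Toy corpus counts per country -- invented numbers, not CREA frequencies.
corpus = {
    "PR": Counter({"guagua": 40, "china": 25, "autobús": 3}),
    "VE": Counter({"guagua": 2, "china": 1, "autobús": 30, "cambur": 35}),
    "ES": Counter({"autobús": 50, "guagua": 1}),
}

def characteristic_words(corpus, threshold=0.8):
    """Flag words whose relative frequency concentrates in one country."""
    totals = {c: sum(cnt.values()) for c, cnt in corpus.items()}
    vocab = set().union(*corpus.values())
    flagged = {}
    for w in vocab:
        # Relative frequency of w within each country's corpus.
        rel = {c: corpus[c][w] / totals[c] for c in corpus}
        top = max(rel, key=rel.get)
        share = rel[top] / sum(rel.values())
        if share >= threshold:           # usage concentrated in one country
            flagged[w] = top
    return flagged

result = characteristic_words(corpus)
```

Here "cambur" (used only in the VE counts) and "guagua" (dominant in the PR counts) are flagged as dialect-characteristic, while "autobús" is spread across countries and is not.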
- Title
- AN ANALYSIS OF ACCREDITATION PROCESSES, QUALITY CONTROL CRITERIA, HISTORICAL EVENTS, AND STUDENT PERFORMANCE.
- Creator
-
Burris, Robert, Murray, Barbara, University of Central Florida
- Abstract / Description
-
The purpose of this study was to determine to what extent student performance has been influenced by historical events, legislative mandates, and accreditation processes. This study consists of comparing the Southern Association of Colleges and Schools accreditation processes with those of the Association of Christian Schools International. In completing this qualitative study, the following procedures were implemented: Related research was used to provide a background of the role that historical events, legislation, and accreditation processes have on student performance; data were collected to establish timeline shifts in a historical perspective. The data collected included assessment, accountability, high school dropout rates, high school graduation rates, academic readiness for higher education, standardized testing, grade inflation, acceleration of dual enrollment and advanced placement courses, and national SAT and ACT averages. Data were also collected from historical records of accreditation processes, which included standards, teacher certification requirements, committee responsibilities, visiting team responsibilities, and self-study materials. As a result of content analysis, the researcher decided to focus on three key areas that were integral to the study. The three categories identified in the review of literature were used to analyze the content of these events and processes. The categories were: (a) Student Performance, (b) Historical Events, and (c) SACS and ACSI Accreditation Processes. The following results were obtained from this research. Findings indicated that a criterion-based accreditation process potentially results in more consistent student performance outcomes than an open-ended process.
- Date Issued
- 2008
- Identifier
- CFE0002052, ucf:47569
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002052
- Title
- FORENSIC ANALYSIS OF C-4 AND COMMERCIAL BLASTING AGENTS FOR POSSIBLE DISCRIMINATION.
- Creator
-
Steele, Katie, Sigman, Michael, University of Central Florida
- Abstract / Description
-
The criminal use of explosives has increased in recent years. Political instability and the widespread access to the internet, filled with "homemade recipes," are two conjectures for the increase. C-4 is a plastic bonded explosive (PBX) comprised of 91% of the high explosive RDX, 1.6% processing oils, 5.3% plasticizer, and 2.1% polyisobutylene (PIB). C-4 is most commonly used for military purposes, but has found use in commercial industry as well. Current methods for the forensic analysis of C-4 are limited to identification of the explosive; however, recent publications have suggested the plausibility of discrimination between C-4 samples based upon the processing oils and stable isotope ratios. This research focuses on the discrimination of C-4 samples based on ratios of RDX to HMX, a common impurity resulting from RDX synthesis. The relative amounts of HMX are a function of the RDX synthetic route and conditions. RDX was extracted from different C-4 samples and was analyzed by ESI-MS-SIM as the chloride adduct, EI-GC-MS-SIM, and NICI-GC-MS. Ratios (RDX/HMX) were calculated for each method. An analysis of variance (ANOVA) followed by a Tukey HSD allowed for an overall discriminating power to be assessed for each analytical method. The C-4 processing oils were also extracted, and analyzed by direct exposure probe mass spectrometry (DEP-MS) with electron ionization, a technique that requires less than two minutes for analysis. The overall discriminating power of the processing oils was calculated by conducting a series of t tests. Lastly, a set of heterogeneous commercial blasting agents were analyzed by laser induced breakdown spectroscopy (LIBS). The data were analyzed by principal components analysis (PCA), and the possibility of creating a searchable library was explored.
- Date Issued
- 2007
- Identifier
- CFE0001805, ucf:47358
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001805
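The statistical step in the abstract above (one-way ANOVA across RDX/HMX ratios from different sources) can be sketched with a hand-rolled F statistic. The ratio values below are synthetic numbers for illustration only, not measured data, and the pairwise Tukey HSD follow-up is only described, not implemented.

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group vs within-group variance."""
    all_obs = np.concatenate(groups)
    grand_mean = all_obs.mean()
    k, n = len(groups), len(all_obs)
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented RDX/HMX peak-area ratios for three hypothetical C-4 sources.
rng = np.random.default_rng(1)
source_a = rng.normal(12.0, 0.3, size=8)
source_b = rng.normal(15.5, 0.3, size=8)
source_c = rng.normal(12.2, 0.3, size=8)
f_stat = one_way_anova_f(source_a, source_b, source_c)
# A large F (far above the critical value for df = (2, 21)) indicates that
# at least one source differs; a Tukey HSD then locates which pairs differ.
```

In practice `scipy.stats.f_oneway` computes the same statistic with a p-value, and `scipy.stats.tukey_hsd` performs the pairwise follow-up.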
- Title
- PROCESS AND MIND: EXPLORING THE RELATIONSHIP BETWEEN PROCESS PHILOSOPHY AND THE NONLINEAR DYNAMICAL SYSTEMS SCIENCE OF COGNITION.
- Creator
-
Moralez, Larry A, Favela, Luis H., University of Central Florida
- Abstract / Description
-
This work examines the relationship between Alfred North Whitehead's process philosophy and the nonlinear dynamical systems framework for studying cognition. I argue that the nonlinear dynamical systems approach to cognitive science presupposes many key elements of his process philosophy. The process philosophical interpretation of nature posits events and the dynamic relations between events as the fundamental substrate of reality, as opposed to static physical substances. I present a brief history of the development of substance thought before describing Whitehead's characterization of nature as a process. In what follows, I will examine both the computational and nonlinear dynamical systems frameworks for investigating cognition. I will show that the computational paradigm is subject to many of the same criticisms as substance. Conversely, I will show that nonlinear dynamical cognitive science avoids these criticisms and is congenial to Whitehead's philosophy insofar as it is suitable for describing emergent processes. To conclude, I suggest that nonlinear dynamical cognitive science confirms and validates Whitehead's philosophy. Furthermore, I argue that process philosophy is an appropriate characterization of nature for guiding inquiry in cognitive science.
- Date Issued
- 2016
- Identifier
- CFH2000091, ucf:45553
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH2000091
- Title
- SEMANTIC BIAS AS AN APPLICATION OF THE UNIVERSAL GRAMMAR MODEL IN THE RUSSIAN LANGUAGE.
- Creator
-
Gural, Iryna, Modianos, Doan T., Villegas, Alvaro, University of Central Florida
- Abstract / Description
-
The theory of Universal Grammar developed by Chomsky has been known for many years. The main idea behind the theory is that the processing of language does not depend on culture but is universal among all languages. Further psycholinguistic studies developed ideas about the schematic comprehension of language, giving rise to the idea of the "garden path effect". Research focused on the processing of ambiguous sentences and found a tendency for readers to prefer interpretations of specific sentence areas as objects. The current study summarizes the ideas of psycholinguistic research and incorporates a novel language structure to study readers' syntactic preferences. In addition, conducting the study in the Russian language complements previous research in other languages and would argue in favor of the Universal Grammar model if the hypothesis were supported. It was hypothesized that readers would prefer the comparison of two direct objects over subjects, which would be reflected in faster reading times. A self-paced reading task was administered to the participants in order to measure their reading times. The analysis found no significant differences in the reading times of the critical area; thus, the hypothesis was not supported. Possible explanations, limitations, and further directions are discussed.
- Date Issued
- 2019
- Identifier
- CFH2000513, ucf:45697
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH2000513
- Title
- Mathematical Foundations of Adaptive Quantum Processing.
- Creator
-
Bonior, Daniel, Mucciolo, Eduardo, Martin, Keye, Argenti, Luca, Shivamoggi, Bhimsen, Marinescu, Dan, University of Central Florida
- Abstract / Description
-
Quantum information has the potential to revolutionize the way we store, process, transfer and acquire information [1,14,15,21,37]. In particular, quantum information offers exciting new approaches to secure communication, computation and sensing. However, in order to realize such technologies, we must first understand the effect that environmental noise has on a quantum system. This dissertation builds upon recent studies that have explored the underlying structure of quantum information and the effects of qubit channels in quantum communication protocols. This work is divided into five main chapters, with Chapter 1 being a brief introduction to quantum information. We then begin Chapter 2 by defining the error function for our qubit communication protocols. From there we explore the properties of our error functions and the topological space that they form. In Chapter 3 we consider the newly patented process Adaptive Quantum Information Processing, patent number US9838141 B2, originally outlined by Martin in [23]. We restate the adaptive scheme and exemplify its application through the Prepare and Send Protocol and Quantum Key Distribution. Applying our results from Chapter 2, we obtain an expression for the adaptability of unital channels in these two protocols and classify the channels that admit the most improvement. We dedicate Chapter 4 to the derivation of gravitational noise, and show that in certain circumstances gravity results in a channel that can be maximally improved in Adaptive QKD [3,14,16]. Lastly, we study the set of error functions through the lens of domain theory. Domain theory is a subset of mathematics that was developed in order to rigorously formalize computations. The first four chapters are all consequences of past discoveries in the mathematical structure of quantum channels. In Chapter 5 we characterize the set of error functions through domain theory, extending the mathematical foundations of quantum information [12,18,20,22,23,25].
- Date Issued
- 2018
- Identifier
- CFE0007313, ucf:52124
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007313
- Title
- Tessellation for computer image generation.
- Creator
-
Panzitta, Michael James, Bauer, Christian S., Engineering
- Abstract / Description
-
University of Central Florida College of Engineering Thesis; Of the vast number of algorithms used in modern computer image generation, most rely upon data bases comprised of polygons. This becomes a severe impediment when curved objects must be modeled and displayed with an acceptable level of speed and accuracy. A technique is needed to provide a means of modeling curved surfaces, storing them in a data base, and displaying them using existing algorithms. Tessellation is one method of achieving such goals.
- Date Issued
- 1987
- Identifier
- CFR0001375, ucf:52922
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFR0001375
- Title
- Detection of DDH in Infants and Children Using Audible Acoustics.
- Creator
-
Hassan, Tanvir, Mansy, Hansen, Song, Sang-Eun, Kassab, Alain, University of Central Florida
- Abstract / Description
-
Detection of developmental dysplasia of the hip (DDH) in infants and children is important, as untreated DDH can lead to permanent hip instability. Current methods for detecting DDH, such as ultrasound and x-rays, are relatively expensive and require qualified medical personnel to administer the test. Furthermore, x-ray ionizing radiation can have potentially harmful effects. In the current study, a simple, non-invasive acoustic approach was investigated for the detection of DDH. Simplified benchtop models and pig models were constructed and tested. Models were stimulated with band-limited white acoustic noise (10-2500 Hz) and the response of the models was measured. The power spectral density, transfer function, and coherence were determined for different levels of hip dysplasia and for normal cases. Results showed that the power spectral density, transfer function, and coherence were all affected by the occurrence of dysplasia. The effects appear larger for more severely dysplastic hips. This suggests that the proposed approach may have potential for DDH detection.
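The analysis pipeline summarized in this abstract (band-limited white-noise excitation, then power spectral density, transfer function, and coherence estimates) can be sketched with standard signal-processing routines. This is an illustrative sketch only: the sampling rate, the resonant filter standing in for the model dynamics, and the noise level are assumptions, not parameters from the study.

```python
import numpy as np
from scipy import signal

fs = 8000  # Hz; assumed sampling rate, not taken from the study
rng = np.random.default_rng(0)

# Band-limited white-noise excitation (10-2500 Hz), as described in the abstract
raw = rng.standard_normal(4 * fs)
sos = signal.butter(4, [10, 2500], btype="bandpass", fs=fs, output="sos")
excitation = signal.sosfilt(sos, raw)

# Hypothetical model response: a single resonance (stand-in for the benchtop
# model dynamics) plus a small amount of measurement noise
b, a = signal.iirpeak(900, Q=10, fs=fs)
response = signal.lfilter(b, a, excitation) + 0.01 * rng.standard_normal(excitation.size)

# Welch-based estimates of PSD, transfer function (H1 = Pxy/Pxx), and coherence
nper = 1024
f, Pxx = signal.welch(excitation, fs=fs, nperseg=nper)   # excitation PSD
_, Pyy = signal.welch(response, fs=fs, nperseg=nper)     # response PSD
_, Pxy = signal.csd(excitation, response, fs=fs, nperseg=nper)
H = Pxy / Pxx                                            # transfer function estimate
_, Cxy = signal.coherence(excitation, response, fs=fs, nperseg=nper)
```

In a setup like this, coherence stays near 1 at frequencies where the response is dominated by the excitation path, and dropping coherence or a shifted transfer function would flag a change in the underlying dynamics, which is the kind of contrast the study reports between normal and dysplastic cases.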
- Date Issued
- 2019
- Identifier
- CFE0007816, ucf:52350
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007816
- Title
- Integrating Multiobjective Optimization with the Six Sigma Methodology for Online Process Control.
- Creator
-
Abualsauod, Emad, Geiger, Christopher, Elshennawy, Ahmad, Thompson, William, Moore, Karla, University of Central Florida
- Abstract / Description
-
Over the past two decades, the Define-Measure-Analyze-Improve-Control (DMAIC) framework of the Six Sigma methodology and a host of statistical tools have been brought to bear on process improvement efforts in today's businesses. However, a major challenge of implementing the Six Sigma methodology is maintaining the process improvements and providing real-time performance feedback and control after solutions are implemented, especially in the presence of multiple process performance objectives. The consideration of a multiplicity of objectives in business and process improvement is commonplace and, quite frankly, necessary. However, balancing the collection of objectives is challenging, as the objectives are inextricably linked and oftentimes in conflict. Previous studies have reported varied success in enhancing the Six Sigma methodology by integrating optimization methods in order to reduce variability. These studies focus their enhancements primarily within the Improve phase of the Six Sigma methodology, optimizing a single objective. The current research and practice of using the Six Sigma methodology and optimization methods do little to address real-time feedback and control for online process control in the case of multiple objectives. This research proposes an innovative integrated Six Sigma multiobjective optimization (SSMO) approach for online process control. It integrates the Six Sigma DMAIC framework with a nature-inspired optimization procedure that iteratively perturbs a set of decision variables providing feedback to the online process, eventually converging to a set of tradeoff process configurations that improves and maintains process stability. For proof of concept, the approach is applied to a general business process model (a well-known inventory management model) that is formally defined and specifies various process costs as objective functions. The proposed SSMO approach and the business process model are programmed and incorporated into a software platform. Computational experiments are performed using both three sigma (3σ)-based and six sigma (6σ)-based process control, and the results reveal that the proposed SSMO approach performs far better than the traditional approaches in improving the stability of the process. This research investigation shows that the benefits of enhancing the Six Sigma method for multiobjective optimization and for online process control are immense.
- Date Issued
- 2013
- Identifier
- CFE0004968, ucf:49561
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004968