-
-
Title
-
MODELING AND DESIGN OF A PHOTONIC CRYSTAL CHIP HOSTING A QUANTUM NETWORK MADE OF SINGLE SPINS IN QUANTUM DOTS THAT INTERACT VIA SINGLE PHOTONS.
-
Creator
-
Seigneur, Hubert, Schoenfeld, Winston, University of Central Florida
-
Abstract / Description
-
In this dissertation, the prospect of a quantum technology based on a photonic crystal chip hosting a quantum network made of quantum dot spins interacting via single photons is investigated. The mathematical procedure for dealing with the Liouville-von Neumann equation, which describes the time evolution of the density matrix, was derived for an arbitrary system, giving general equations. Using this theoretical groundwork, a numerical model was then developed to study the spatiotemporal dynamics of entanglement between various qubits produced in a controlled way over the entire quantum network. As a result, an efficient quantum interface was engineered allowing storage qubits and traveling qubits to exchange information coherently while demonstrating little error and loss in the process; such an interface is indispensable for the realization of a functional quantum network. Furthermore, carefully orchestrated dynamic control over the propagation of the flying qubit showed high-efficiency capability for on-chip single-photon transfer. Using the optimized dispersion properties obtained quantum mechanically as design parameters, a possible physical structure for the photonic crystal chip was constructed using the Plane Wave Expansion and Finite-Difference Time-Domain numerical techniques, exhibiting almost identical transfer efficiencies in terms of normalized energy densities of the classical electromagnetic field. These promising results bring us one step closer to the physical realization of an integrated quantum technology combining both semiconductor quantum dots and sub-wavelength photonic structures.
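For reference, the Liouville-von Neumann equation named in the abstract takes the standard textbook form (with H the system Hamiltonian and ρ the density matrix; this is general background, not the dissertation's derivation):

```latex
i\hbar \frac{\partial \rho}{\partial t} = [H, \rho] = H\rho - \rho H
```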
-
Date Issued
-
2010
-
Identifier
-
CFE0003433, ucf:48391
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003433
-
-
Title
-
SOURCE REPRESENTATION AND FRAMING IN CHILDHOOD IMMUNIZATION COMMUNICATION.
-
Creator
-
Raneri, April, Matusitz, Jonathan, University of Central Florida
-
Abstract / Description
-
Research has indicated a strong interest in knowing who is being represented, and how information is being represented, in communication about childhood immunization. This study uses a two-part analysis to look at source representation and framing in childhood immunization communication. In a quantitative analysis, articles from the New York Times and USA Today were examined for their source representation, their use of fear appeals (through the Extended Parallel Processing Model, EPPM), and their use of frames (through the application of Prospect Theory). A qualitative semiotic analysis was conducted on 36 images that appeared on www.yahoo.com and www.google.com to find common themes in who is being represented and how information is being portrayed through the images. Results found a high prevalence of representation from the Centers for Disease Control and Prevention, other governmental agencies, and the views of health/medical professionals in both the articles and the images.
-
Date Issued
-
2010
-
Identifier
-
CFE0003016, ucf:48343
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003016
-
-
Title
-
MODELING CARBON ALLOCATION, GROWTH AND RECOVERY IN SCRUB OAKS EXPERIENCING ABOVEGROUND DISTURBANCE.
-
Creator
-
Seiler, Troy, Weishampel, John, University of Central Florida
-
Abstract / Description
-
Allocation of assimilated carbon among plant metabolic processes and tissues is important to understanding ecosystem carbon cycles. Due to the range of spatio-temporal scales and complex process interactions involved, direct measurements of allocation in natural environments are logistically difficult. Modeling approaches provide tools to examine these patterns by integrating finer-scale process measurements. One such method is root:shoot balance, in which plant growth is limited by either shoot activity (i.e., photosynthesis) or root activity (i.e., water and nutrient uptake). This method shows promise for application to frequently disturbed systems, where disturbances perturb aboveground biomass and thus create imbalances between root and shoot activities. In this study, root:shoot balance, allometric relationships, and phenological patterns were used to model carbon allocation and growth in Florida scrub oaks. The model was tested using ecosystem gas exchange (i.e., eddy covariance) and meteorological data from two independent sites at Merritt Island National Wildlife Refuge, FL, which experienced two different types of disturbance events: a prescribed burn in 2006 and wind damage from Hurricane Frances in 2004. The effects of the two disturbance events, which differed greatly in magnitude and impact, were compared to identify similarities and differences in plant allocation response. Model results and process-based sensitivity analysis demonstrated the strong influence of autotrophic respiration on plant growth and allocation processes. Also, fine-root dynamics were found to dominate partitioning trends of carbon allocated to growth. Overall, model results aligned well with observed biomass trends, with some discrepancies suggesting fine-root turnover to be more dynamic than currently parameterized in the model. This modeling approach can be extended through integration with more robust process models, for example, mechanistic photosynthesis, nitrogen uptake, and/or dynamic root turnover models.
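The root:shoot balance idea described in the abstract (growth limited by whichever of shoot or root activity is scarcer, with new growth invested in the limiting compartment) can be illustrated with a toy allocation step. This is a hypothetical sketch; the function name, coefficients, and allocation rule are illustrative and not the dissertation's parameterization:

```python
def growth_step(shoot_biomass, root_biomass, photosynthesis_rate, uptake_rate):
    """Toy root:shoot balance step: growth is limited by the scarcer activity,
    and the new carbon is allocated to the limiting compartment."""
    shoot_activity = shoot_biomass * photosynthesis_rate  # carbon supply
    root_activity = root_biomass * uptake_rate            # water/nutrient supply
    growth = min(shoot_activity, root_activity)           # limiting factor
    if shoot_activity < root_activity:
        shoot_biomass += growth   # shoot-limited: invest in shoots
    else:
        root_biomass += growth    # root-limited: invest in roots
    return shoot_biomass, root_biomass
```

After a disturbance that removes aboveground biomass (as in the prescribed burn or hurricane damage above), repeated steps of such a model shift allocation toward shoots until the two activities rebalance.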
-
Date Issued
-
2011
-
Identifier
-
CFE0003664, ucf:48819
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003664
-
-
Title
-
From Dude to Dad: A Study on Prenatal Fatherhood and its Representation in Theatre.
-
Creator
-
Nilsson, Michael, Thomas, Aaron, Horn, Elizabeth, Reed, David, Niess, Christopher, University of Central Florida
-
Abstract / Description
-
A man in the preparatory phase of parenthood with his first child can go through extreme emotional highs and lows, depending upon his economic, relationship, and physical status, as well as community pressures and support. In preparation to portray an array of prenatal fathers in a showcase of scenes, I have read an assortment of plays and scholarly sources. In reading a large sample of prenatal plays, I have extracted several possible emotional changes within a man's psyche. I also analyzed the social rationale behind these changes through the writings of sociologists and other scholarly sources. In addition to this research, I was going through my own journey toward parenthood at the initiation of this project, as my child was born halfway into it. Pairing the exploration of theatrical literature and sociological research with my personal experience of the prenatal phase, I have documented the changes a man may experience in his emotional growth. This time is full of differing anxieties that spring from the anticipation of change while a man is preparing for parenthood. Through the medium of a showcase of theatrical scenes that represent the prenatal father, I explore the emotional journeys of several of these men and document my findings. As actors in theatre, we use the emotional life of characters to enlighten our choices in actions and tactics. These tactics are in service to the selfish goals we have as characters. The emotions a character has may act as either a driving force or an obstacle in obtaining our goals. When exploring the emotions of a pre-paternal man, one must consider all the variables in the creation of these emotions. In this project, I extract the emotions that a prenatal father may be vulnerable to and document them for personal use as an actor presenting pre-paternal characters.
-
Date Issued
-
2017
-
Identifier
-
CFE0006771, ucf:51835
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006771
-
-
Title
-
Automatically Acquiring a Semantic Network of Related Concepts.
-
Creator
-
Szumlanski, Sean, Gomez, Fernando, Wu, Annie, Hughes, Charles, Sims, Valerie, University of Central Florida
-
Abstract / Description
-
We describe the automatic acquisition of a semantic network in which over 7,500 of the most frequently occurring nouns in the English language are linked to their semantically related concepts in the WordNet noun ontology. Relatedness between nouns is discovered automatically from lexical co-occurrence in Wikipedia texts using a novel adaptation of an information-theoretically inspired measure. Our algorithm then capitalizes on salient sense clustering among these semantic associates to automatically disambiguate them to their corresponding WordNet noun senses (i.e., concepts). The resultant concept-to-concept associations, stemming from 7,593 target nouns with 17,104 distinct senses among them, constitute a large-scale semantic network with 208,832 undirected edges between related concepts. Our work can thus be conceived of as augmenting the WordNet noun ontology with RelatedTo links. The network, which we refer to as the Szumlanski-Gomez Network (SGN), has been subjected to a variety of evaluative measures, including manual inspection by human judges and quantitative comparison to gold-standard data for semantic relatedness measurements. We have also evaluated the network's performance in an applied setting on a word sense disambiguation (WSD) task in which the network served as a knowledge source for established graph-based spreading activation algorithms, and have shown: a) the network is competitive with WordNet when used as a stand-alone knowledge source for WSD; b) combining our network with WordNet achieves disambiguation results that exceed the performance of either resource individually; and c) our network outperforms a similar resource, WordNet++ (Ponzetto & Navigli, 2010), that has been automatically derived from annotations in the Wikipedia corpus. Finally, we present a study on human perceptions of relatedness. In our study, we elicited quantitative evaluations of semantic relatedness from human subjects using a variation of the classical methodology that Rubenstein and Goodenough (1965) employed to investigate human perceptions of semantic similarity. Judgments from individual subjects in our study exhibit high average correlation to the elicited relatedness means using leave-one-out sampling (r = 0.77, σ = 0.09, N = 73), although not as high as average human correlation in previous studies of similarity judgments, for which Resnik (1995) established an upper bound of r = 0.90 (σ = 0.07, N = 10). These results suggest that human perceptions of relatedness are less strictly constrained than evaluations of similarity, and establish a clearer expectation for what constitutes human-like performance by a computational measure of semantic relatedness. We also contrast the performance of a variety of similarity and relatedness measures on our dataset to their performance on similarity norms, and introduce our own dataset as a supplementary evaluative standard for relatedness measures.
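The leave-one-out procedure described in the abstract (correlating each subject's judgments against the mean ratings of the remaining subjects) can be sketched as follows. This is an illustrative reconstruction, not code from the dissertation; the function names and the toy data are hypothetical:

```python
import statistics

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def leave_one_out_correlations(ratings):
    """For each rater, correlate their judgments with the mean of the others.

    `ratings` is a list of lists: ratings[i][k] is rater i's score for item k.
    """
    n_raters, n_items = len(ratings), len(ratings[0])
    result = []
    for i in range(n_raters):
        others_mean = [
            statistics.mean(ratings[j][k] for j in range(n_raters) if j != i)
            for k in range(n_items)
        ]
        result.append(pearson(ratings[i], others_mean))
    return result
```

Averaging the returned per-rater correlations yields the kind of summary statistic reported above (a mean r with its standard deviation across N subjects).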
-
Date Issued
-
2013
-
Identifier
-
CFE0004759, ucf:49767
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004759
-
-
Title
-
A New Paradigm Integrating Business Process Modeling and Use Case Modeling.
-
Creator
-
Brown, Barclay, Karwowski, Waldemar, Thompson, William, Lee, Gene, O'Neal, Thomas, University of Central Florida
-
Abstract / Description
-
The goal of this research is to develop a new paradigm integrating the practices of business process modeling and use case modeling. These two modeling approaches describe the behavior of organizations and systems, and their interactions, but rest on different paradigms and serve different needs. The base of knowledge and information required for each approach is largely common, however, so an integrated approach has advantages in efficiency, consistency, and completeness of the overall behavioral model. Both modeling methods are familiar and widely used. Business process modeling is often employed as a precursor to the development of a system to be used in a business organization. Business process modeling teams and stakeholders may spend months or years developing detailed business process models, expecting that these models will provide a useful base of information for system designers. Unfortunately, as the business process model is analyzed by the system designers, it is often found that information needed to specify the functionality of the system does not exist in the business process model. System designers may then employ use case modeling to specify the needed system functionality, again spending significant time with stakeholders to gather the needed input. Stakeholders find this two-pass process redundant and wasteful of time and money, since the input they provide to both modeling teams is largely identical, with each team capturing only the aspects relevant to its form of modeling. Developing a new paradigm and modeling approach that achieves the objectives of both business process modeling and use case modeling in an integrated form, in one analysis pass, results in time savings, increased accuracy, and improved communication among all participants in the systems development process. Analysis of several case studies shows that inefficiency, wasted time, and overuse of stakeholder resource time result from the separate application of business process modeling and use case modeling. A review of existing literature on the subject shows that while the problem of modeling both business process and use case information in a coordinated fashion has been recognized before, few if any approaches have been proposed to reconcile and integrate the two methods. Based on both the literature review and good modeling practices, a list of goals for the new paradigm and modeling approach forms the basis for the paradigm to be created. A grounded theory study is then conducted to analyze existing modeling approaches for both business processes and use cases and to provide an underlying theory on which to base the new paradigm. The two main innovations developed for the new paradigm are the usage process and the timebox. Usage processes allow system usages (use cases) to be identified as the business process model is developed, and the two to be shown in a combined process flow. Timeboxes allow processes to be positioned in time relation to each other without the need to combine processes into higher-level processes using causal relations that may not exist. The combination of usage processes and timeboxes allows any level of complex behavior to be modeled in one pass, without the redundancy and waste of separate business process and use case modeling work. Several pilot projects are conducted to test the new modeling paradigm in differing modeling situations, with participants and subject matter experts asked to compare the traditional models with the new paradigm formulations.
-
Date Issued
-
2015
-
Identifier
-
CFE0005583, ucf:50270
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005583
-
-
Title
-
An Analysis of Undergraduate Creative Writing Students' Writing Processes: Gauging the Workshop Models' Effectiveness Through the Lens of Genre Theories.
-
Creator
-
Chrisman, John, Marinara, Martha, Roozen, Kevin, Scott, Blake, University of Central Florida
-
Abstract / Description
-
Current approaches to teaching creative writers consist largely of workshop-style classes. While workshops often vary from class to class in style, generally a workshop consists of a group of writers, led by a mentor/instructor, who exchange drafts and provide reader- and writer-focused feedback to the author. Yet because the workshop approach has not been the subject of close empirical study, it is unclear whether it is an effective pedagogy. This thesis serves two purposes. First, it presents an argument for new research into creative writing pedagogy and creative writers' processes and suggests that any future research should take an empirical turn. However, because creative writing has developed few theories or methods useful for its own empirical study, I suggest adopting theories and methods from the field of rhetoric and composition. The second part of this thesis is an empirical study of three undergraduate students in an introductory creative writing course over one semester. This study uses qualitative methods (semi-structured retrospective interviews, close textual analysis, and in-class observations) to understand how creative writers are enculturated into the creative writing community, using Christine Tardy's theories of acquiring genre expertise as a framework for analysis. Based on this research, this study concludes that while creative writers enculturate in different ways, depending on several factors, all develop greater awareness of genre complexity, authorial identity, and intermodal influences on their writing. Furthermore, this study recommends further case studies into creative writers' writing processes and the effectiveness of various workshop models on student enculturation.
-
Date Issued
-
2015
-
Identifier
-
CFE0005589, ucf:50235
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005589
-
-
Title
-
Investigating the relationships between preferences, gender, and high school students' geometry performance.
-
Creator
-
Mainali, Bhesh, Haciomeroglu, Erhan, Dixon, Juli, Andreasen, Janet, Bai, Haiyan, University of Central Florida
-
Abstract / Description
-
In this quantitative study, the relationships between high school students' preference for solution methods, geometry performance, task difficulty, and gender were investigated. Data were collected from 161 high school students from six different schools in a county in central Florida in the United States. The study was conducted during the 2013-2014 school year. The participants represented a wide range of socioeconomic status, were from a range of grades (10-12), and were enrolled in different mathematics courses (Algebra 2, Geometry, Financial Algebra, and Pre-calculus). Data were collected primarily with the aid of a geometry test and a geometry questionnaire. Using a think-aloud protocol, a short interview was also conducted with some students. For the purpose of statistical analysis, students' preferences for solution methods were quantified into numeric values, and then a visuality score was obtained for each student. Students' visuality scores ranged from -12 to +12 and were used to assess their preference for solution methods. A standardized test score was used to measure students' geometry performance. The data analysis indicated that the majority of students were visualizers. The statistical analysis revealed that there was no association between preference for solution methods and students' geometry performance. The preference for solving geometry problems using either visual or nonvisual methods was not influenced by task difficulty: students were equally likely to employ visual and nonvisual solution methods regardless of the task difficulty. Gender was significant in geometry performance but not in preference for solution methods. Female students' geometry performance was significantly higher than male students'. The findings of this study suggest that instruction should incorporate both visual and nonvisual teaching strategies in mathematics lesson activities in order to develop preference for both visual and nonvisual solution methods.
-
Date Issued
-
2014
-
Identifier
-
CFE0005374, ucf:50448
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005374
-
-
Title
-
SAME-SEX MARRIAGE: A FUNDAMENTAL RIGHT.
-
Creator
-
Smith, Stefen, Naccarato-Fromang, Gina, University of Central Florida
-
Abstract / Description
-
Same-sex marriage is a subject that has been heavily discussed and argued since the concept of marriage came into existence. Marriage is a relationship that most American citizens are entitled to, although it is not yet a fundamental right. As of a very recent court decision, Strawser v. Strange, Civil Action No. 14-0424-CG-C, finalized on February 9, 2015, Alabama has legalized same-sex marriage; furthermore, thirty-seven states now recognize the legality of same-sex marriage. Marriage, whether between a heterosexual or a homosexual couple, should be a fundamental right enjoyed by all. This thesis will explain why same-sex marriage should be a fundamental right. The research presented in this thesis thoroughly examines the obstacles that same-sex couples face when wanting to legally marry. The United States Constitution, the Due Process Clause, and the Equal Protection Clause will be analyzed and discussed to prove that all fifty states should allow same-sex couples to wed. Citizens view what constitutes a marriage differently depending on their upbringing and residence. This thesis will illustrate why same-sex marriage has been such a widely discussed topic, and it will investigate the influence of religion and the church. Historically, the tradition of marriage has always been between one man and one woman. By examining how the tradition of marriage is changing and using case law decisions, an argument can be formed that marriage should be a fundamental right for all people.
-
Date Issued
-
2015
-
Identifier
-
CFH0004779, ucf:45391
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH0004779
-
-
Title
-
Harnessing Spatial Intensity Fluctuations for Optical Imaging and Sensing.
-
Creator
-
Akhlaghi Bouzan, Milad, Dogariu, Aristide, Saleh, Bahaa, Pang, Sean, Atia, George, University of Central Florida
-
Abstract / Description
-
Properties of light such as amplitude and phase, temporal and spatial coherence, and polarization are abundantly used for sensing and imaging. Regardless of the passive or active nature of the sensing method, optical intensity fluctuations are always present. While these fluctuations are usually regarded as noise, there are situations where one can harness them to enhance certain attributes of the sensing procedure. In this thesis, we developed different sensing methodologies that use statistical properties of optical fluctuations for gauging specific information. We examine this concept in the context of three different aspects of computational optical imaging and sensing. First, we study imposing specific statistical properties on the probing field to image or characterize certain properties of an object through a statistical analysis of the spatially integrated scattered intensity. This offers unique capabilities for imaging and sensing techniques operating in highly perturbed environments and low-light conditions. Next, we examine optical sensing in the presence of strong perturbations that preclude any controllable field modification. We demonstrate that inherent properties of diffused coherent fields and fluctuations of integrated intensity can be used to track objects hidden behind obscurants. Finally, we address situations where, due to coherent noise, image accuracy is severely degraded by intensity fluctuations. By taking advantage of the spatial coherence properties of optical fields, we show that this limitation can be effectively mitigated and that a significant improvement in the signal-to-noise ratio can be achieved even in a single-shot measurement. The findings included in this dissertation illustrate different circumstances where optical fluctuations can affect the efficacy of computational optical imaging and sensing. A broad range of applications, including biomedical imaging and remote sensing, could benefit from the new approaches to suppress, enhance, and exploit optical fluctuations described in this dissertation.
-
Date Issued
-
2017
-
Identifier
-
CFE0007274, ucf:52200
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007274
-
-
Title
-
Automated Synthesis of Unconventional Computing Systems.
-
Creator
-
Hassen, Amad Ul, Jha, Sumit Kumar, Sundaram, Kalpathy, Fan, Deliang, Ewetz, Rickard, Rahman, Talat, University of Central Florida
-
Abstract / Description
-
Despite decades of advancements, modern computing systems which are based on the von Neumann architecture still carry its shortcomings. Moore's law, which had substantially masked the effects of the inherent memory-processor bottleneck of the von Neumann architecture, has slowed down due to transistor dimensions nearing atomic sizes. On the other hand, modern computational requirements, driven by machine learning, pattern recognition, artificial intelligence, data mining, and IoT, are growing...
Show moreDespite decades of advancements, modern computing systems which are based on the von Neumann architecture still carry its shortcomings. Moore's law, which had substantially masked the effects of the inherent memory-processor bottleneck of the von Neumann architecture, has slowed down due to transistor dimensions nearing atomic sizes. On the other hand, modern computational requirements, driven by machine learning, pattern recognition, artificial intelligence, data mining, and IoT, are growing at the fastest pace ever. By their inherent nature, these applications are particularly affected by communication-bottlenecks, because processing them requires a large number of simple operations involving data retrieval and storage. The need to address the problems associated with conventional computing systems at the fundamental level has given rise to several unconventional computing paradigms. In this dissertation, we have made advancements for automated syntheses of two types of unconventional computing paradigms: in-memory computing and stochastic computing. In-memory computing circumvents the problem of limited communication bandwidth by unifying processing and storage at the same physical locations. The advent of nanoelectronic devices in the last decade has made in-memory computing an energy-, area-, and cost-effective alternative to conventional computing. We have used Binary Decision Diagrams (BDDs) for in-memory computing on memristor crossbars. Specifically, we have used Free-BDDs, a special class of binary decision diagrams, for synthesizing crossbars for flow-based in-memory computing. Stochastic computing is a re-emerging discipline with several times smaller area/power requirements as compared to conventional computing systems. It is especially suited for fault-tolerant applications like image processing, artificial intelligence, pattern recognition, etc. 
We have proposed a decision procedures-based iterative algorithm to synthesize Linear Finite State Machines (LFSM) for stochastically computing non-linear functions such as polynomials, exponentials, and hyperbolic functions.
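The stochastic-computing primitive underlying this line of work is easy to illustrate: a value in [0, 1] is encoded as the probability of a 1 in a random bitstream, and a single AND gate then multiplies two such values. The sketch below is a minimal plain-Python illustration of that textbook encoding (function names are illustrative; this is not the dissertation's LFSM synthesis procedure):

```python
import random

def to_stream(p, n, rng):
    """Encode probability p as a random bitstream of length n."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def sc_multiply(sa, sb):
    """Multiply two stochastic streams with a bitwise AND."""
    return [a & b for a, b in zip(sa, sb)]

def value(stream):
    """Decode a bitstream back to a probability estimate."""
    return sum(stream) / len(stream)

rng = random.Random(42)
n = 100_000
sa = to_stream(0.5, n, rng)
sb = to_stream(0.25, n, rng)
prod = value(sc_multiply(sa, sb))   # approximately 0.5 * 0.25 = 0.125
```

The appeal for fault-tolerant applications is visible here: flipping a handful of bits in either stream perturbs the decoded value only slightly.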
-
Date Issued
-
2019
-
Identifier
-
CFE0007648, ucf:52462
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007648
-
-
Title
-
A Crack in Everything.
-
Creator
-
Hoffman, Jeffrey, Isenhour, David, Poindexter, Carla, Kim, Joo, University of Central Florida
-
Abstract / Description
-
Contained herein is a close examination of self-awareness and self-portraiture as it applies to the works of artist Jeffrey Hoffman. Water, frozen into various forms and combined with natural elements of wood, slowly melts over an indeterminable amount of time, each droplet documented as the process transforms the elements. Through this process, we see change. We see time. We see truth. This documentation of change and time through natural elements is where the artwork comes full circle. Working with new media to explore man's interconnectivity to life, energy, and the cosmos, he produces time-based installations, photographs, videos, and sculptures that serve as both existential metaphors and Tantric symbols. With the use of digital cameras and video, a record is created by which the disintegration that occurs from the unseen forces of gravity, heat, and time upon sculptures made from natural elements and ice is examined. In its sculptural form, his work can be categorized as Installation art and Performance art due to its evolving nature. Each piece is intended either to change over time or to have that change halted by another temporal force like that of flowing electricity. The possibility of allowing varying levels of self-awareness to emerge through self-portraiture is also examined. The existential, as well as the metaphysical, can be present in a physical form when the form is imbued with evidence of an evolutionary process. In many ways, the work serves as a self-portrait. It is a means for Hoffman to examine his own existentialism as a student of the modern western world and life.
-
Date Issued
-
2012
-
Identifier
-
CFE0004242, ucf:49518
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0004242
-
-
Title
-
EPISODIC MEMORY MODEL FOR EMBODIED CONVERSATIONAL AGENTS.
-
Creator
-
Elvir, Miguel, Gonzalez, Avelino, University of Central Florida
-
Abstract / Description
-
Embodied Conversational Agents (ECAs) form part of a range of virtual characters whose intended purposes include engaging in natural conversations with human users. While the literature is rife with descriptions of attempts at producing viable ECA architectures, few authors have addressed the role of episodic memory models in conversational agents. This form of memory, which provides a sense of autobiographic record-keeping in humans, has only recently been peripherally integrated into dialog management tools for ECAs. In our work, we propose to take a closer look at the shared characteristics of episodic memory models in recent examples from the field. Additionally, we propose several enhancements to these existing models through a unified episodic memory model for ECAs. As part of our research into episodic memory models, we present a process for determining the prevalent contexts in the conversations obtained from the aforementioned interactions. The process presented demonstrates the use of statistical and machine learning services, as well as Natural Language Processing techniques, to extract relevant snippets from conversations. Finally, mechanisms to store, retrieve, and recall episodes from previous conversations are discussed. A primary contribution of this research is in the context of contemporary memory models for conversational agents and cognitive architectures. To the best of our knowledge, this is the first attempt at providing a comparative summary of existing works. As implementations of ECAs become more complex and encompass more realistic conversation engines, we expect that episodic memory models will continue to evolve and further enhance the naturalness of conversations.
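The store/retrieve/recall cycle described above can be sketched minimally, assuming a toy keyword-overlap retrieval scheme (the class, the scoring rule, and the sample utterances are illustrative, not the unified model proposed in the work):

```python
class EpisodicMemory:
    """Toy episodic store: each episode pairs a set of context
    keywords with the utterance it came from."""
    def __init__(self):
        self.episodes = []

    def store(self, keywords, utterance):
        self.episodes.append((set(keywords), utterance))

    def recall(self, cue_keywords):
        """Return the stored utterance whose context overlaps the cue most."""
        cue = set(cue_keywords)
        best = max(self.episodes, key=lambda ep: len(ep[0] & cue), default=None)
        return best[1] if best else None

mem = EpisodicMemory()
mem.store({"weather", "rain"}, "You said it was raining in Orlando.")
mem.store({"food", "pizza"}, "You mentioned liking pizza.")
recalled = mem.recall({"rain", "umbrella"})  # -> the weather utterance
```

Real systems would replace the overlap score with statistical or NLP-derived context similarity, as the abstract describes.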
-
Date Issued
-
2010
-
Identifier
-
CFE0003353, ucf:48443
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003353
-
-
Title
-
DEVELOPMENT OF THEORETICAL AND COMPUTATIONAL METHODS FOR THREE-BODY PROCESSES.
-
Creator
-
Blandon Zapata, Juan, Kokoouline, Viatcheslav, University of Central Florida
-
Abstract / Description
-
This thesis discusses the development and application of theoretical and computational methods to study three-body processes. The main focus is on the calculation of three-body resonances and bound states. This broadly includes the study of Efimov states and resonances, three-body shape resonances, three-body Feshbach resonances, three-body pre-dissociated states in systems with a conical intersection, and the calculation of three-body recombination rate coefficients. The method was applied to a number of systems. A chapter of the thesis is dedicated to the related study of deriving correlation diagrams for three-body states before and after a three-body collision. More specifically, the thesis discusses the calculation of the H+H+H three-body recombination rate coefficient using the developed method. Additionally, we discuss a conceptually simple and effective diabatization procedure for the calculation of pre-dissociated vibrational states for a system with a conical intersection. We apply the method to H_3, where the quantum molecular dynamics are notoriously difficult, non-adiabatic couplings are important, and a correct description of the geometric phase associated with the diabatic representation is crucial for an accurate representation of these couplings. With our approach, we were also able to calculate Efimov-type resonances. The calculations of bound states and resonances were performed by formulating the problem in hyperspherical coordinates, and obtaining three-body eigenstates and eigen-energies by applying the hyperspherical adiabatic separation and the slow variable discretization. We employed a complex absorbing potential to calculate resonance energies and lifetimes, and introduce a uniquely defined diabatization procedure to treat X_3 molecules with a conical intersection. The proposed approach is general enough to be applied to problems in nuclear, atomic, molecular, and astrophysics.
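For readers unfamiliar with the formalism, the hyperspherical adiabatic separation mentioned above is conventionally written as an expansion of the full wave function over channel functions defined at fixed hyperradius (standard textbook form for three bodies; the symbols here are generic, not necessarily the thesis's notation):

```latex
% Expansion over adiabatic channels at fixed hyperradius \rho,
% with \Omega denoting the hyperangles:
\Psi(\rho,\Omega) = \rho^{-5/2} \sum_{\nu} F_{\nu}(\rho)\, \Phi_{\nu}(\rho;\Omega)
% The channel functions and adiabatic potentials U_\nu(\rho) solve
% the fixed-\rho eigenvalue problem:
H_{\mathrm{ad}}(\rho)\, \Phi_{\nu}(\rho;\Omega) = U_{\nu}(\rho)\, \Phi_{\nu}(\rho;\Omega)
```

The slow variable discretization and complex absorbing potential mentioned in the abstract then act on the radial functions F_ν(ρ) in the resulting coupled equations.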
-
Date Issued
-
2009
-
Identifier
-
CFE0002669, ucf:48225
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0002669
-
-
Title
-
7: AN INTERACTIVE INSTALLATION; EXPLORATIONS IN THE DIGITAL, THE SPIRITUAL, AND THE UNCANNY.
-
Creator
-
Lewter, Bradley, Peters, Phil, University of Central Florida
-
Abstract / Description
-
This thesis explores the application of digital technologies in the creation of visionary or transformative artwork. The installation emphasizes number, color, symmetry, and the human form to create symbolic compositions patterned after ancient archetypes. Background research was done to inform the work through studies of the principles of visionary and transformative artwork as practiced by Ernst Fuchs, De Es Schwertberger, and Alex Grey. Connections between art and spirituality as explained by Kandinsky were studied to augment these principles. The sequence of artwork within the installation comprises both digital paintings and interactive triptych panels. To convey a sense of the mystical or sacred, the Rothko Chapel was used to inform the installation and serve as an artistic precedent. As the interactive work is created using realistically modeled, computer-generated characters, special consideration was given to understanding the "uncanny valley" and its potential effect on the interpretation of the installation. Interactivity is achieved through the use of ultrasonic sensors and Arduino prototyping boards.
-
Date Issued
-
2010
-
Identifier
-
CFE0003314, ucf:48487
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0003314
-
-
Title
-
Nonlinear Optical Response of Simple Molecules and Two-Photon Semiconductor Lasers.
-
Creator
-
Reichert, Matthew, Vanstryland, Eric, Hagan, David, Likamwa, Patrick, Peale, Robert, University of Central Florida
-
Abstract / Description
-
This dissertation investigates two long-standing issues in nonlinear optics: complete characterization of the ultrafast dynamics of simple molecules, and the potential of a two-photon laser using a bulk semiconductor gain medium. Within the Born-Oppenheimer approximation, nonlinear refraction in molecular liquids and gases can arise from both bound-electronic and nuclear origins. Knowledge of the magnitudes, temporal dynamics, and polarization and spectral dependences of each of these mechanisms is important for many applications, including filamentation, white-light continuum generation, all-optical switching, and nonlinear spectroscopy. In this work the nonlinear dynamics of molecules are investigated in both liquid and gas phase with the recently developed beam deflection technique, which measures nonlinear refraction directly in the time domain. Thanks to the utility of the beam deflection technique, we are able to completely determine the third-order response function of one of the most important molecular liquids in nonlinear optics, carbon disulfide. This allows the prediction of essentially any nonlinear refraction or two-photon absorption experiment on CS2. Measurements conducted on air (N2 and O2) and gaseous CS2 reveal coherent rotational revivals in the degree of alignment of the ensemble at a period that depends on its moment of inertia. This allows measurement of the rotational and centrifugal distortion constants of the isolated molecules. Additionally, the rotational contribution to the beam deflection measurement can be eliminated thanks to the particular polarization dependence of the mechanism. At a specific polarization, the dominant remaining contribution is due to the bound electrons. Thus both the bound-electronic nonlinear refractive index of air and the second hyperpolarizability of isolated CS2 molecules are measured directly. The latter agrees well with liquid CS2 measurements, where local field effects are significant.
The second major portion of this dissertation addresses the possibility of using bulk semiconductors as a two-photon gain medium. A two-photon laser has been a goal of nonlinear optics since shortly after the original laser's development. In this case, two photons are emitted from a single electronic transition rather than only one. This process is known as two-photon gain (2PG). Semiconductors have large two-photon absorption coefficients, which are enhanced by ~2 orders of magnitude when using photons of very different energies, e.g., ω_a ≈ 10ω_b. This enhancement should translate into large 2PG coefficients as well, given the inverse relationship between absorption and gain. Here, we experimentally demonstrate both degenerate and nondegenerate 2PG in optically excited bulk GaAs via pump-probe experiments. This constitutes, to my knowledge, the first report of nondegenerate two-photon gain. Competition between 2PG and competing processes, namely intervalence band absorption and nondegenerate three-photon absorption (ND-3PA), is theoretically analyzed in both cases. Experimental measurements of ND-3PA agree with this analysis and show that it is enhanced much more than ND-2PG. It is found for both degenerate and nondegenerate photon pairs that the losses dominate the two-photon gain, preventing the possibility of a two-photon semiconductor laser.
-
Date Issued
-
2015
-
Identifier
-
CFE0005874, ucf:50871
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0005874
-
-
Title
-
Data-Driven Simulation Modeling of Construction and Infrastructure Operations Using Process Knowledge Discovery.
-
Creator
-
Akhavian, Reza, Behzadan, Amir, Oloufa, Amr, Yun, Hae-Bum, Sukthankar, Gita, Zheng, Qipeng, University of Central Florida
-
Abstract / Description
-
Within the architecture, engineering, and construction (AEC) domain, simulation modeling is mainly used to facilitate decision-making by enabling the assessment of different operational plans and resource arrangements that are otherwise difficult (if not impossible), expensive, or time-consuming to evaluate in real-world settings. The accuracy of such models directly affects their reliability to serve as a basis for important decisions such as project completion time estimation and resource allocation. Compared to other industries, this is particularly important in construction and infrastructure projects due to the high resource costs and the societal impacts of these projects. Discrete event simulation (DES) is a decision-making tool that can benefit the process of design, control, and management of construction operations. Despite recent advancements, most DES models used in construction are created during the early planning and design stage, when the lack of factual information from the project prohibits the use of realistic data in simulation modeling. The resulting models, therefore, are often built using rigid (subjective) assumptions and design parameters (e.g., precedence logic, activity durations). In all such cases, and in the absence of an inclusive methodology to incorporate real field data as the project evolves, modelers rely on information from previous projects (a.k.a. secondary data), expert judgments, and subjective assumptions to generate simulations to predict future performance. These and similar shortcomings have to a large extent limited the use of traditional DES tools to preliminary studies and long-term planning of construction projects. In the realm of business process management, process mining, as a relatively new research domain, seeks to automatically discover a process model by observing activity records and extracting information about processes. The research presented in this Ph.D.
Dissertation was in part inspired by the prospect of construction process mining using sensory data collected from field agents. This enabled the extraction of the operational knowledge necessary to generate and maintain the fidelity of simulation models. A preliminary study was conducted to demonstrate the feasibility and applicability of data-driven, knowledge-based simulation modeling, with a focus on data collection using a wireless sensor network (WSN) and a rule-based taxonomy of activities. The resulting knowledge-based simulation models performed very well in properly predicting key performance measures of real construction systems. Next, a pervasive mobile data collection and mining technique was adopted, and an activity recognition framework for construction equipment and worker tasks was developed. Data was collected using smartphone accelerometers and gyroscopes from construction entities to generate significant statistical time- and frequency-domain features. The extracted features served as the input of different types of machine learning algorithms that were applied to various construction activities. The trained predictive algorithms were then used to extract activity durations and calculate probability distributions to be fused into the corresponding DES models. Results indicated that the generated data-driven, knowledge-based simulation models outperform static models created based upon engineering assumptions and estimations with regard to the compatibility of performance measure outputs to reality.
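The statistical time-domain features mentioned above are typically simple per-window statistics of the raw sensor signal. A minimal sketch over a generic accelerometer window (the feature choice is illustrative, not the exact feature set used in the dissertation):

```python
import math

def window_features(signal):
    """Basic statistical time-domain features of one sensor window."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak = max(abs(x) for x in signal)
    return {"mean": mean, "std": math.sqrt(var), "rms": rms, "peak": peak}

window = [0.0, 1.0, 0.0, -1.0] * 25  # 100-sample toy accelerometer window
feats = window_features(window)
```

Feature vectors like this one (often augmented with frequency-domain terms from an FFT of the window) are what the trained classifiers consume to label equipment and worker activities.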
-
Date Issued
-
2015
-
Identifier
-
CFE0006023, ucf:51014
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006023
-
-
Title
-
Biophysical Sources of 1/f Noises in Neurological Systems.
-
Creator
-
Paris, Alan, Vosoughi, Azadeh, Atia, George, Wiegand, Rudolf, Douglas, Pamela, Berman, Steven, University of Central Florida
-
Abstract / Description
-
High levels of random noise are a defining characteristic of neurological signals at all levels, from individual neurons up to electroencephalograms (EEG). These random signals degrade the performance of many methods of neuroengineering and medical neuroscience. Understanding this noise is also essential for applications such as real-time brain-computer interfaces (BCIs), which must make accurate control decisions from very short data epochs. The major type of neurological noise is of the so-called 1/f-type, whose origins and statistical nature have remained unexplained for decades. This research provides the first simple explanation of 1/f-type neurological noise based on biophysical fundamentals. In addition, noise models derived from this theory provide validated algorithm performance improvements over alternatives. Specifically, this research defines a new class of formal latent-variable stochastic processes called hidden quantum models (HQMs), which clarify the theoretical foundations of ion channel signal processing. HQMs are based on quantum state processes which formalize time-dependent observation. They allow the quantum-based calculation of channel conductance autocovariance functions, essential for frequency-domain signal processing. HQMs based on a particular type of observation protocol called independent activated measurements are shown to be distributionally equivalent to hidden Markov models, yet without an underlying physical Markov process. Since the formal Markov processes are non-physical, the theory of activated measurement allows merging energy-based Eyring rate theories of ion channel behavior with the more common phenomenological Markov kinetic schemes to form energy-modulated quantum channels.
These unique biophysical concepts, developed to understand the mechanisms of ion channel kinetics, have the potential to revolutionize our understanding of neurological computation. To apply this theory, the simplest quantum channel model consistent with neuronal membrane voltage-clamp experiments is used to derive the activation eigenenergies for the Hodgkin-Huxley K+ and Na+ ion channels. It is shown that maximizing entropy under constrained activation energy yields noise spectral densities approximating S(f) = 1/f, thus offering a biophysical explanation for this ubiquitous noise component. These new channel-based noise processes are called generalized van der Ziel-McWhorter (GVZM) power spectral densities (PSDs). This is the only known EEG noise model that has a small, fixed number of parameters, matches recorded EEG PSDs with high accuracy from 0 Hz to over 30 Hz without infinities, and has approximately 1/f behavior in the mid-frequencies. In addition to the theoretical derivation of the noise statistics from ion channel stochastic processes, the GVZM model is validated in two ways. First, a class of mixed autoregressive models is presented which simulates brain background noise and whose periodograms are proven to be asymptotic to the GVZM PSD. Second, it is shown that pairwise comparisons of GVZM-based algorithms, using real EEG data from a publicly available data set, exhibit statistically significant accuracy improvements over two well-known and widely used steady-state visual evoked potential (SSVEP) estimators.
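The classic van der Ziel-McWhorter picture that the GVZM name references can be checked numerically: a superposition of Lorentzian relaxation spectra with log-uniformly distributed time constants yields an approximately 1/f power spectral density in the mid-band. The sketch below demonstrates that textbook mechanism only, not the GVZM model itself (grid spacing and frequency band are arbitrary choices):

```python
import math

def lorentzian_sum_psd(f, taus):
    """PSD of a superposition of single-time-constant (Lorentzian)
    relaxation processes, weighted log-uniformly in tau."""
    w = 2.0 * math.pi * f
    return sum(tau / (1.0 + (w * tau) ** 2) for tau in taus)

# Time constants spread log-uniformly over 8 decades.
taus = [10.0 ** (-4 + 0.1 * k) for k in range(81)]

f1, f2 = 0.1, 10.0
slope = (math.log10(lorentzian_sum_psd(f2, taus)) -
         math.log10(lorentzian_sum_psd(f1, taus))) / (math.log10(f2) - math.log10(f1))
# In the mid-band the log-log slope is close to -1, i.e. S(f) ~ 1/f.
```

Outside the band spanned by the corner frequencies 1/(2πτ), the spectrum flattens (low f) or rolls off as 1/f² (high f), which is why purely phenomenological 1/f fits diverge at f = 0 while models of this family do not.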
-
Date Issued
-
2016
-
Identifier
-
CFE0006485, ucf:51418
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006485
-
-
Title
-
Towards Energy-Efficient and Reliable Computing: From Highly-Scaled CMOS Devices to Resistive Memories.
-
Creator
-
Salehi Mobarakeh, Soheil, DeMara, Ronald, Fan, Deliang, Turgut, Damla, University of Central Florida
-
Abstract / Description
-
The continuous increase in transistor density based on Moore's Law has led us to highly scaled Complementary Metal-Oxide Semiconductor (CMOS) technologies. These transistor-based process technologies offer improved density as well as a reduction in nominal supply voltage. An analysis comparing different aspects of the 45nm and 15nm technologies, such as power consumption and cell area, is carried out on an IEEE 754 Single Precision Floating-Point Unit implementation. Based on the results, using the 15nm technology offers 4 times less energy and a 3-fold smaller footprint. New challenges also arise, such as the relative proportion of leakage power in standby mode, that can be addressed by post-CMOS technologies. Spin-Transfer Torque Random Access Memory (STT-MRAM) has been explored as a post-CMOS technology for embedded and data storage applications seeking non-volatility, near-zero standby energy, and high density. Towards attaining these objectives for practical implementations, various techniques to mitigate the specific reliability challenges associated with STT-MRAM elements are surveyed, classified, and assessed herein. Cost and suitability metrics assessed include the area of nanomagnetic and CMOS components per bit, access time and complexity, Sense Margin (SM), and energy or power consumption costs versus resiliency benefits. In an attempt to further improve the Process Variation (PV) immunity of Sense Amplifiers (SAs), a new SA, called the Adaptive Sense Amplifier (ASA), has been introduced. The ASA can benefit from a low Bit Error Rate (BER) and low Energy Delay Product (EDP) by combining the properties of two commonly used SAs, the Pre-Charge Sense Amplifier (PCSA) and the Separated Pre-Charge Sense Amplifier (SPCSA). The ASA can operate in either PCSA or SPCSA mode based on the requirements of the circuit, such as energy efficiency or reliability.
Then, the ASA is utilized to propose a novel approach to leverage the PV in Non-Volatile Memory (NVM) arrays using a Self-Organized Sub-bank (SOS) design. SOS engages the preferred SA alternative based on the intrinsic as-built behavior of the resistive sensing timing margin to reduce the latency and power consumption while maintaining acceptable access time.
-
Date Issued
-
2016
-
Identifier
-
CFE0006493, ucf:51400
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006493
-
-
Title
-
Design Disjunction for Resilient Reconfigurable Hardware.
-
Creator
-
Alzahrani, Ahmad, DeMara, Ronald, Yuan, Jiann-Shiun, Lin, Mingjie, Wang, Jun, Turgut, Damla, University of Central Florida
-
Abstract / Description
-
Contemporary reconfigurable hardware devices have the capability to achieve the high performance, power efficiency, and adaptability required to meet a wide range of design goals. With scaling challenges facing current complementary metal oxide semiconductor (CMOS) technology, new concepts and methodologies supporting efficient adaptation to handle reliability issues are becoming increasingly prominent. Reconfigurable hardware and its ability to realize self-organization features are expected to play a key role in designing future dependable hardware architectures. However, the exponential increase in density and complexity of current commercial SRAM-based field-programmable gate arrays (FPGAs) has escalated the overhead associated with dynamic runtime design adaptation. Traditionally, static modular redundancy techniques are considered to surmount this limitation; however, they can incur substantial overheads in both area and power requirements. To achieve a better trade-off among performance, area, power, and reliability, this research proposes design-time approaches that enable fine selection of the redundancy level based on target reliability goals and autonomous adaptation to runtime demands. To achieve this goal, three studies were conducted. First, a graph- and set-theoretic approach, named Hypergraph-Cover Diversity (HCD), is introduced as a preemptive design technique to shift the dominant costs of resiliency to design-time. In particular, union-free hypergraphs are exploited to partition the reconfigurable resources pool into highly separable subsets of resources, each of which can be utilized by the same synthesized application netlist. The diverse implementations provide reconfiguration-based resilience throughout the system lifetime while avoiding the significant overheads associated with runtime placement and routing phases.
Evaluation on a Motion-JPEG image compression core using a Xilinx 7-series-based FPGA hardware platform has demonstrated the potential of the proposed FT method to achieve 37.5% area saving and up to 66% reduction in power consumption compared to the frequently used TMR scheme while providing superior fault tolerance. Second, Design Disjunction, based on non-adaptive group testing, is developed to realize a low-overhead fault-tolerant system capable of handling self-testing and self-recovery using runtime partial reconfiguration. Reconfiguration is guided by resource grouping procedures which employ non-linear measurements given by the constructive property of f-disjunctness to extend runtime resilience to a large fault space and realize a favorable range of tradeoffs. Disjunct designs are created using the mosaic convergence algorithm, developed such that at least one configuration in the library evades any occurrence of up to d resource faults, where d is lower-bounded by f. Experimental results for a set of MCNC and ISCAS benchmarks have demonstrated f-diagnosability at the individual slice level with an average isolation resolution of 96.4% (94.4%) for f=1 (f=2), while incurring an average critical path delay impact of only 1.49% and an area cost roughly comparable to conventional 2-MR approaches. Finally, the proposed Design Disjunction method is evaluated as a design-time method to improve timing yield in the presence of large random within-die (WID) process variations for applications with a moderately high production capacity.
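The d-disjunctness property underpinning Design Disjunction comes from non-adaptive group testing: a binary matrix is d-disjunct when no column is covered by the union of any d other columns, which is what lets up to d faults be isolated. A brute-force check of that definition (illustrative only; the dissertation's mosaic convergence algorithm constructs such designs rather than testing them exhaustively):

```python
from itertools import combinations

def is_d_disjunct(columns, d):
    """A binary matrix, given as a list of column supports (sets of
    row indices holding a 1), is d-disjunct if no column is contained
    in the union of any d other columns."""
    for j, col in enumerate(columns):
        others = [c for i, c in enumerate(columns) if i != j]
        for group in combinations(others, d):
            if col <= set().union(*group):
                return False
    return True

# Disjoint column supports are d-disjunct for any d < number of columns.
cols = [{0}, {1}, {2}, {3}]
ok = is_d_disjunct(cols, 2)        # True
# A column covering another breaks even 1-disjunctness.
bad = is_d_disjunct([{0}, {1}, {0, 1}], 1)  # False: {0} is inside {0, 1}
```

The exhaustive check is exponential in d, which is exactly why constructive approaches are preferred for matrices of practical size.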
-
Date Issued
-
2015
-
Identifier
-
CFE0006250, ucf:51086
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0006250