Title
-
DYNAMIC SHARED STATE MAINTENANCE IN DISTRIBUTED VIRTUAL ENVIRONMENTS.
-
Creator
-
Hamza-Lup, Felix George, Hughes, Charles, University of Central Florida
-
Abstract / Description
-
Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information at remote locations allows efficient communication. Particularly challenging are distributed interactive Virtual Environments (VE) that allow knowledge sharing through 3D information. In a distributed interactive VE the dynamic shared state represents the changing information that multiple machines must maintain about the shared virtual components. One of the challenges in such environments is maintaining a consistent view of the dynamic shared state in the presence of inevitable network latency and jitter. A consistent view of the shared scene will significantly increase the sense of presence among participants and facilitate their interactive collaboration. The purpose of this work is to address the problem of latency in distributed interactive VE and to develop a conceptual model for consistency maintenance in these environments based on the participant interaction model. A review of the literature illustrates that the techniques for consistency maintenance in distributed Virtual Reality (VR) environments can be roughly grouped into three categories: centralized information management, prediction through dead reckoning algorithms, and frequent state regeneration. Additional resource management methods can be applied across these techniques to improve shared state consistency. Some of these techniques are related to the systems infrastructure; others are related to the human nature of the participants (e.g., human perceptual limitations, area of interest management, and visual and temporal perception). An area that needs to be explored is the relationship between the dynamic shared state and the interaction with the virtual entities present in the shared scene. Mixed Reality (MR) and VR environments must bring the human participant into the loop through a wide range of electronic motion sensors and haptic devices. Part of the work presented here defines a novel criterion for categorization of distributed interactive VE and introduces, as well as analyzes, an adaptive synchronization algorithm for consistency maintenance in such environments. As part of the work, a distributed interactive Augmented Reality (AR) testbed and the algorithm implementation details are presented. Currently the testbed is part of several research efforts at the Optical Diagnostics and Applications Laboratory, including 3D visualization applications using custom-built head-mounted displays (HMDs) with optical motion tracking and a medical training prototype for endotracheal intubation and medical prognostics. An objective method using quaternion calculus is applied for the algorithm assessment. In spite of significant network latency, results show that the dynamic shared state can be kept consistent at multiple remotely located sites. In further consideration of the latency problem, and in light of current trends in interactive distributed VE applications, we propose a hybrid distributed system architecture for sensor-based distributed VE that has the potential to improve the system's real-time behavior and scalability.
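The dead-reckoning category of consistency techniques named in this abstract can be sketched in a few lines. This is a generic illustration only, not the dissertation's adaptive synchronization algorithm; the class name, state layout, and snap threshold are all hypothetical:

```python
# Minimal dead-reckoning sketch: each site extrapolates a remote entity's
# position from its last reported state, then corrects when an update arrives.

class DeadReckonedEntity:
    def __init__(self, pos, vel, t):
        self.pos, self.vel, self.t = pos, vel, t

    def extrapolate(self, now):
        """Estimate position at time `now` from the last known state."""
        dt = now - self.t
        return [p + v * dt for p, v in zip(self.pos, self.vel)]

    def update(self, pos, vel, t, threshold=0.5):
        """Apply a state update; report whether the error exceeded the threshold."""
        predicted = self.extrapolate(t)
        error = sum((e - p) ** 2 for e, p in zip(predicted, pos)) ** 0.5
        self.pos, self.vel, self.t = pos, vel, t
        return error > threshold  # caller may smooth small corrections instead

e = DeadReckonedEntity([0.0, 0.0], [1.0, 0.0], t=0.0)
print(e.extrapolate(2.0))  # [2.0, 0.0]
```

In practice each site would run such an extrapolator for every remote entity and smooth, rather than snap, small corrections to avoid visible jumps.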
-
Date Issued
-
2004
-
Identifier
-
CFE0000096, ucf:46152
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000096
-
-
Title
-
FACTORS AFFECTING THE PRACTICES OF ISO 9001:2000 QUALITY MANAGEMENT SYSTEM IN SAUDI BUSINESS ORGANIZATIONS.
-
Creator
-
Al-Asiri, Mohammad Mesaad, Elshennawy, Ahmad, University of Central Florida
-
Abstract / Description
-
Since its release in December 2000, there has been a slow movement toward the new version, ISO 9001:2000, by ISO 9000:1994 certified organizations. Of the 561,747 ISO 9000 certified businesses, 167,210 are certified under the new ISO 9001:2000, which is less than 30% of the total. Although many studies have been conducted to understand and assess the practices of the ISO 9000:1994 standards, no research has been done to investigate the practices of ISO 9001:2000 in Saudi Arabia. This study is designed to investigate the implementation practices of the new ISO 9001:2000 standard in Saudi business organizations. The main objectives of this study are to identify the critical factors that lead to successful implementation of the new standard, to determine what barriers have been encountered during implementation, and to identify the parts of the standard that are most difficult to comply with. It investigates the perceived benefits that Saudi firms have gained from implementing the system and examines the level of knowledge about ISO 9001:2000 and the perceptions of the new standard among the management teams and staff of ISO registered firms. It determines the level of integration between ISO 9001:2000 and other implemented systems. Furthermore, this study aims to investigate the factors that may explain Saudi organizations' decisions to implement ISO 9001:2000 in their businesses. To accomplish these research objectives, a questionnaire was developed based on an extensive review of related literature and tested for validity and reliability. The target sample for the study consisted of all ISO 9001:2000 registered sites in Saudi Arabia as of 31 December 2002, comprising 131 organizations. A total of 89 completed surveys were received, for a response rate of 72%. Descriptive statistics, measures of variation and association, and factor analysis were used in the interpretation of the collected data.
-
Date Issued
-
2004
-
Identifier
-
CFE0000137, ucf:46194
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000137
-
-
Title
-
SYSTEM IDENTIFICATION AND FAULT DETECTION OF COMPLEX SYSTEMS.
-
Creator
-
Luo, Dapeng, Leonessa, Alexander, University of Central Florida
-
Abstract / Description
-
The proposed research is devoted to devising system identification and fault detection approaches and algorithms for systems characterized by nonlinear dynamics. Mathematical models of dynamical systems and fault models are built based on observed data from the systems. In particular, we focus on statistical subspace instrumental variable methods, which allow the consideration of an appealing mathematical model in many control applications consisting of a nonlinear feedback system with nonlinearities at both inputs and outputs. Different solutions within the proposed framework are presented to solve the system identification and fault detection problems. Specifically, Augmented Subspace Instrumental Variable Identification (ASIVID) approaches are proposed to identify closed-loop nonlinear Hammerstein systems. Fast approaches are then presented to determine the system order. Hard-over failures are detected by order determination approaches when failures manifest themselves as rank deficiencies of the dynamical systems. Geometric interpretations of subspace tracking theorems are presented in this dissertation in order to propose a fault tolerance strategy. Possible fields of application considered in this research include manufacturing systems, autonomous vehicle systems, space systems, and burgeoning bio-mechanical systems.
-
Date Issued
-
2006
-
Identifier
-
CFE0000915, ucf:46756
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000915
-
-
Title
-
SESSION-BASED INTRUSION DETECTION SYSTEM TO MAP ANOMALOUS NETWORK TRAFFIC.
-
Creator
-
Caulkins, Bruce, Wang, Morgan, University of Central Florida
-
Abstract / Description
-
Computer crime is a large problem (CSI, 2004; Kabay, 2001a; Kabay, 2001b). Security managers have a variety of tools at their disposal to combat it: firewalls, Intrusion Detection Systems (IDSs), encryption, authentication, and other hardware and software solutions. Many IDS variants exist that allow security managers and engineers to identify attack packets primarily through signature detection; i.e., the IDS recognizes attack packets by their well-known "fingerprints," or signatures, as those packets cross the network's gateway threshold. Anomaly-based ID systems, on the other hand, determine what traffic is normal within a network and report abnormal traffic behavior. This paper describes a methodology for developing a more robust Intrusion Detection System through the use of data-mining techniques and anomaly detection. These data-mining techniques dynamically model what a normal network should look like and reduce the false positive and false negative alarm rates in the process. We use classification-tree techniques to accurately predict probable attack sessions. Overall, our goal is to model network traffic into network sessions and identify those sessions that have a high probability of being an attack, labeling them "suspect sessions." Subsequently, we use these techniques together with signature detection methods, in concert with known signatures and patterns, to present a better model for detection and protection of networks and systems.
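As a toy illustration of the session-classification idea described in this abstract (the feature names, thresholds, and this two-rule hand-built "tree" are hypothetical, not the paper's mined model):

```python
# Score a summarized network session as "suspect" or "normal" using a
# tiny hand-built decision tree over per-session features.

def classify_session(session):
    """Label a network session from summary features (hypothetical rules)."""
    # An unusually high connection rate suggests scanning or flooding.
    if session["conn_rate"] > 100:
        return "suspect"
    # Sessions dominated by failed handshakes are also suspicious.
    if session["fail_ratio"] > 0.8:
        return "suspect"
    return "normal"

sessions = [
    {"conn_rate": 250, "fail_ratio": 0.10},  # burst of connections
    {"conn_rate": 5,   "fail_ratio": 0.95},  # almost all failures
    {"conn_rate": 12,  "fail_ratio": 0.02},  # ordinary traffic
]
print([classify_session(s) for s in sessions])  # ['suspect', 'suspect', 'normal']
```

A real classification tree would be learned from labeled session data rather than written by hand, but the resulting model evaluates in exactly this cascaded-threshold fashion.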
-
Date Issued
-
2005
-
Identifier
-
CFE0000906, ucf:46762
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000906
-
-
Title
-
EVALUATION OF SPACE SHUTTLE TILE SUBNOMINAL BONDS.
-
Creator
-
Snapp, Cooper, Moslehy, Faissal, University of Central Florida
-
Abstract / Description
-
This study researched the history of Space Shuttle Reusable Surface Insulation, which was designed and developed for use on the United States Orbiter fleet to protect it from the high heating experienced during reentry through Earth's atmosphere. Specifically, the tile system, which is attached to the structure by means of an RTV adhesive, has experienced situations where the bonds are identified as subnominal. The history of these subnominal conditions is presented, along with a recent identification of a subnominal bond between the Strain Isolation Pad and the tile substrate itself. Tests were run to identify the cause of these subnominal conditions and to show how these conditions were proved acceptable for flight. The study also covers approaches that could be used to identify subnominal conditions on tile as a non-destructive test prior to flight. Several options for non-destructive testing were identified, and recommendations are given for future research into this topic. Also discussed is an instance during the STS-114 mission in which gap fillers were identified that did not properly adhere to the substrate. The gap fillers were found protruding past the Outer Mold Line of the vehicle, which required an unprecedented spacewalk to remove them to allow for a safe reentry through the atmosphere.
-
Date Issued
-
2006
-
Identifier
-
CFE0000947, ucf:46754
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000947
-
-
Title
-
PLANNING AND SCHEDULING FOR LARGE-SCALE DISTRIBUTED SYSTEMS.
-
Creator
-
Yu, Han, Marinescu, Dan, University of Central Florida
-
Abstract / Description
-
Many applications require computing resources well beyond those available on any single system. Simulations of atomic and subatomic systems with applications to material science, computations related to the study of the natural sciences, and computer-aided design are examples of applications that can benefit from the resource-rich environment provided by a large collection of autonomous systems interconnected by high-speed networks. To transform such a collection of systems into a user's virtual machine, we have to develop new algorithms for coordination, planning, scheduling, resource discovery, and other functions that can be automated. We can then develop societal services based upon these algorithms, which hide the complexity of the computing system from users. In this dissertation, we address the problem of planning and scheduling for large-scale distributed systems. We discuss a model of the system; analyze the need for planning, scheduling, and plan switching to cope with a dynamically changing environment; present algorithms for the three functions; report simulation results on the performance of the algorithms; and introduce an architecture for an intelligent large-scale distributed system.
-
Date Issued
-
2005
-
Identifier
-
CFE0000781, ucf:46595
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000781
-
-
Title
-
A DIALECTICAL METHODOLOGY FOR DECISION SUPPORT SYSTEMS DESIGN.
-
Creator
-
Elgarah, Wafa, Courtney, James, University of Central Florida
-
Abstract / Description
-
As organizations continue to grow in size, reaching global proportions, they have ever-increasing impacts on their environments. Some believe that a much broader array of concerns should be brought into organizational decision-making processes, including greater consideration of social, political, ethical, and aesthetic factors (Mitroff and Linstone, 1993; Courtney, 2001). Decision environments such as these are decidedly "wicked" (Rittel and Webber, 1973). Designing decision support systems in such environments, where there is a high level of interconnectedness, issues are overlapping, and a multiplicity of stakeholders is involved, is a very complex task. In this dissertation a methodology for the development of a DSS for wicked situations is proposed, using the design theory building process suggested by Walls et al. (1992). The proposed theory is based on dialectic theory and the multiple perspective approach suggested by Linstone and Mitroff (1993). The design process consists of identifying relevant stakeholders, their respective worldviews, and conflicts in these worldviews. A design (thesis) and "counter design" (antithesis) are created, and prototype systems based on these designs are developed. These prototypes are then presented to the different stakeholder groups, who engage in a dialogue that leads to the development of a synthesized design. The process is repeated until all conflicts are resolved or resources are exhausted, and a final system is produced. Using action research and system development research methodologies, the proposed design theory was applied to the zoning decision process in Orange County, Florida. The results of this study led to the following conclusions:
1. It is feasible to implement the MPDP methodology proposed in this dissertation.
2. The MPDP methodology resulted in a synthesized design that accommodates the different views of the stakeholders.
3. The MPDP methodology is suitable for contentious situations and may not be feasible for structured decisions.
4. Most of the subjects achieved a better understanding of the decision process.
These results suggest that the MPDP design theory can be effective in developing decision support systems in contentious situations.
-
Date Issued
-
2005
-
Identifier
-
CFE0000883, ucf:46637
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0000883
-
-
Title
-
COORDINATION, MATCHMAKING, AND RESOURCE ALLOCATION FOR LARGE-SCALE DISTRIBUTED SYSTEMS.
-
Creator
-
Bai, Xin, Marinescu, Dan, University of Central Florida
-
Abstract / Description
-
While existing grid environments cater to the specific needs of particular user communities, we need to go beyond them and consider general-purpose large-scale distributed systems: large collections of heterogeneous computers and communication systems shared by a large user population with very diverse requirements. Coordination, matchmaking, and resource allocation are among the essential functions of large-scale distributed systems. Although deterministic approaches for coordination, matchmaking, and resource allocation have been well studied, they are not suitable for large-scale distributed systems because of the scale, autonomy, and dynamics of such systems; we must therefore seek nondeterministic solutions. In this dissertation we describe our work on a coordination service, a matchmaking service, and a macro-economic resource allocation model for large-scale distributed systems. The coordination service coordinates the execution of complex tasks in a dynamic environment, the matchmaking service supports finding the appropriate resources for users, and the macro-economic resource allocation model allows a broker to mediate between resource providers, who want to maximize their revenues, and resource consumers, who want to get the best resources at the lowest possible price, subject to global objectives such as maximizing the resource utilization of the system.
-
Date Issued
-
2006
-
Identifier
-
CFE0001172, ucf:46845
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001172
-
-
Title
-
DIGITAL CONTROLLER IMPLEMENTATION FOR DISTURBANCE REJECTION IN THE OPTICAL COUPLING OF A MOBILE EXPERIMENTAL LASER TRACKING SYSTEM.
-
Creator
-
Rhodes, Matthew, Richie, Samuel, University of Central Florida
-
Abstract / Description
-
Laser tracking systems are an important aspect of the NASA space program, in particular for conducting research in relation to satellites and space port launch vehicles. Often, launches are conducted at remote sites which require all of the test equipment, including the laser tracking systems, to be portable. Portable systems are more susceptible to environmental disturbances which affect the overall tracking resolution, and consequently, the resolution of any other experimental data being collected at any given time. This research characterizes the optical coupling between two systems in a Mobile Experimental Laser Tracking system and evaluates several control solutions to minimize disturbances within this coupling. A simulation of the optical path was developed in an extensible manner such that different control systems could be easily implemented. For an initial test, several PID controllers were utilized in parallel in order to control mirrors in the optical coupling. Despite many limiting factors of the hardware, a simple proportional control performed to expectations. Although a system implementation was never field tested, the simulation results provide the necessary insight to develop the system further. Recommendations were made for future system modifications which would allow an even higher tracking resolution.
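A discrete PID loop of the kind evaluated for the mirror control can be sketched as follows. The gains, time step, and one-line plant model are hypothetical stand-ins, not the thesis's hardware values:

```python
# Discrete PID controller driving a toy first-order "mirror angle" plant
# toward zero, i.e., rejecting an initial disturbance.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        """One control update: return the actuator command."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
angle = 1.0  # initial disturbance (arbitrary units)
for _ in range(2000):
    # Toy plant: the commanded correction moves the mirror proportionally.
    angle += pid.step(0.0, angle) * 0.01
print(abs(angle) < 0.05)  # True: disturbance largely rejected
```

Setting ki and kd to zero reduces this to the proportional-only controller that, per the abstract, performed to expectations on the real hardware.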
-
Date Issued
-
2006
-
Identifier
-
CFE0001168, ucf:46873
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0001168
-
-
Title
-
HOW DEFENDANT CHARACTERISTICS AFFECT SENTENCING AND CONVICTION IN THE US.
-
Creator
-
Kuenzli, Payton, Edwards, Barry, University of Central Florida
-
Abstract / Description
-
This research study analyzes whether there is any relationship between certain defendant characteristics and sentencing and conviction in the US legal system. At a time when the nation is strongly divided politically, the topic is often the center of research projects and discussions in academic journals. Specifically, this research explores three characteristics: race, gender, and socioeconomic status. Within this article, multiple case studies from other journals are cited in which research and experiments have suggested that these factors do influence both whether a defendant is convicted and how long the defendant's sentence is. With these cases in mind, we test the theory ourselves in a survey experiment among college students. The survey presents cases of academic dishonesty at a university, with the defendant characteristics manipulated for race, gender, and socioeconomic status. However, the results were inconclusive regarding any link between those characteristics and the "sentencing" in the study.
-
Date Issued
-
2018
-
Identifier
-
CFH2000334, ucf:45740
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH2000334
-
-
Title
-
A METHOD FOR DETERMINING DAMAGE WITHIN HISTORIC CEMETERIES: A FIRST STEP FOR DIGITAL HERITAGE.
-
Creator
-
Malcolm, Justin E, Branting, Scott, University of Central Florida
-
Abstract / Description
-
While historic cemeteries contain a wealth of knowledge about the history of a community, they are sometimes not well maintained. The information within can be lost as grave-markers are damaged, either by natural causes or by human interaction. In larger cemeteries, preserving these significant places can be difficult for a number of reasons. Focusing preservation efforts on the specific locations where damage is most likely to occur is therefore crucial to ensure that the monuments most at risk are preserved. One possible way of accomplishing this is to use a geographic information system (GIS) to determine the shortest-distance path an individual may take to reach a specific grave-marker. This can be accomplished by conducting a near analysis between an origin point and every grave-marker. These paths also show each grave-marker that an individual passes, indicating the potential for purposeful or accidental interaction. With this information, efforts such as photogrammetry can be applied effectively for digital heritage preservation. Such methods would permit individuals to manipulate three-dimensional representations of grave-markers in order to preserve a large portion of the information they contain.
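A simple stand-in for the near analysis described in this abstract is to compute the straight-line distance from an entrance (origin) point to each grave-marker and rank the markers by exposure. The coordinates and marker names below are hypothetical; a real GIS near analysis would use projected cemetery coordinates and path networks:

```python
import math

def near_analysis(origin, markers):
    """Return (name, distance) pairs sorted from nearest to farthest."""
    ox, oy = origin
    dists = [(name, math.hypot(x - ox, y - oy))
             for name, (x, y) in markers.items()]
    return sorted(dists, key=lambda pair: pair[1])

markers = {"A": (3.0, 4.0), "B": (1.0, 1.0), "C": (6.0, 8.0)}
for name, d in near_analysis((0.0, 0.0), markers):
    print(name, round(d, 2))
# B 1.41
# A 5.0
# C 10.0
```

Markers near the top of such a ranking see the most foot traffic and would be the first candidates for photogrammetric recording.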
-
Date Issued
-
2018
-
Identifier
-
CFH2000428, ucf:45784
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH2000428
-
-
Title
-
CONTROLLING RANDOMNESS: USING PROCEDURAL GENERATION TO INFLUENCE PLAYER UNCERTAINTY IN VIDEO GAMES.
-
Creator
-
Fort, Travis, McDaniel, Rudy, University of Central Florida
-
Abstract / Description
-
As video games increase in complexity and length, the use of automatic, or procedural, content generation has become a popular way to reduce the stress on game designers. However, the use of procedural generation has certain consequences; in many instances, what the computer generates is uncertain to the designer. The intent of this thesis is to demonstrate how procedural generation can be used to intentionally affect the embedded randomness of a game system, enabling game designers to influence the level of uncertainty a player experiences in a nuanced way. This affords game designers direct control over complex problems like dynamic difficulty adjustment, pacing, and accessibility. Game design is examined from the perspective of uncertainty and how procedural generation can be used to intentionally add or reduce it. Various procedural generation techniques are discussed alongside the types of uncertainty, using case studies to demonstrate the principles in action.
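One common way to bound the randomness a generator embeds in a game, in the spirit of the control this abstract describes, is rejection sampling: draw candidate content from a seeded RNG but discard anything outside a designer-chosen band. Everything here (the "level" representation, the difficulty metric, the band) is a hypothetical illustration:

```python
import random

def generate_level(rng):
    """A 'level' here is just a list of five enemy strengths."""
    return [rng.randint(1, 10) for _ in range(5)]

def generate_bounded(seed, lo, hi, max_tries=1000):
    """Reject candidates until total difficulty lands in [lo, hi]."""
    rng = random.Random(seed)  # seeding makes the output reproducible
    for _ in range(max_tries):
        level = generate_level(rng)
        if lo <= sum(level) <= hi:
            return level
    raise RuntimeError("no level found in difficulty band")

level = generate_bounded(seed=42, lo=20, hi=30)
print(sum(level))  # guaranteed by construction to lie between 20 and 30
```

Tightening or widening the band gives the designer a direct dial on how much difficulty variance, and hence player uncertainty, survives generation.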
-
Date Issued
-
2015
-
Identifier
-
CFH0004772, ucf:45386
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH0004772
-
-
Title
-
SMALL-SCALE HYBRID ALTERNATIVE ENERGY MAXIMIZER FOR WIND TURBINES AND PHOTOVOLTAIC PANELS.
-
Creator
-
Kerley, Ross, Batarseh, Issa, University of Central Florida
-
Abstract / Description
-
This thesis describes the creation of a small-scale Hybrid Power System (HPS) that maximizes energy from a wind turbine and a photovoltaic array. Small-scale HPS are becoming an increasingly viable energy solution as fossil fuel prices rise and more electricity is needed in remote areas. Modern HPS typically employ wind speed sensors and three power stages to extract maximum power. Modern systems also use passive rectifiers to convert AC from the wind turbine to DC that is usable by power electronics. This passive approach wastes power and introduces damaging harmonic noise to the wind turbine. The HPS described in this thesis does not require external wind speed sensors and has independent wind and solar Maximum Power Point Tracking (MPPT). It converts AC from the wind turbine to DC with a Vienna rectifier that can be controlled to improve efficiency, allow MPPT, and allow Power Factor Correction (PFC). PFC all but eliminates the harmonic noise that can damage the wind turbine. A prototype HPS was built and evaluated that combines the two renewable sources in such a way that only two power stages are necessary: the Vienna rectifier and a step-down converter. This thesis describes the prototype and reports the results obtained.
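The most common hill-climbing scheme for the solar MPPT this abstract mentions is perturb-and-observe, sketched below. This is a generic illustration, not the thesis's controller: the toy parabolic power curve, step size, and operating point are hypothetical.

```python
def panel_power(v):
    """Toy power curve with a single maximum at v = 17 (arbitrary units)."""
    return 50.0 - (v - 17.0) ** 2

def perturb_and_observe(v=10.0, step=0.2, iters=200):
    """Perturb the operating voltage; reverse direction when power drops."""
    p_prev = panel_power(v)
    direction = 1.0
    for _ in range(iters):
        v += direction * step
        p = panel_power(v)
        if p < p_prev:           # power fell: we overshot the peak, reverse
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()
print(round(v_mpp))  # 17: the tracker dithers around the toy curve's maximum
```

The tracker never settles exactly; it oscillates within one step of the maximum power point, which is why step size trades tracking speed against steady-state ripple.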
-
Date Issued
-
2011
-
Identifier
-
CFH0004087, ucf:44799
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH0004087
-
-
Title
-
GENERATION AND THE GOOGLE EFFECT: TRANSACTIVE MEMORY SYSTEM PREFERENCE ACROSS AGE.
-
Creator
-
Siler, Jessica, Hancock, Peter, University of Central Florida
-
Abstract / Description
-
A transactive memory system (TMS) is a means by which people may store information externally; in such a system the task of remembering is offloaded by remembering where information is located, rather than remembering the information itself. As Sparrow et al. (2011) suggest in the article "Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips," people are beginning to use the internet and computers as a TMS, and this use is changing the way people encounter and treat information. The purpose of this thesis is to investigate whether preference for TMS type (either books or computers) varies across age groups. An interaction between TMS preference and age was hypothesized: before the onset of the internet age, information was primarily found in books and other print materials, whereas now the internet is more frequently used, so this shift in thinking and habit across generations was expected to emerge in the data. The study yielded a total of 51 participants: 32 from the young age group (ages 18-24) and 19 from the old (ages 61-81). A modified Stroop task and question blocks (for priming purposes) were employed to examine whether people are prone to think of book- or computer-related sources when in search of information. A "Look up or Learn" tendencies survey was also used to better understand how people decide whether certain information should be learned or left to be "looked up" later (Yacci & Rosanski, 2012). The mixed ANOVA did not reveal main effects for question difficulty or TMS type, nor was an interaction with age found. The results were not consistent with those of Sparrow et al. (2011) and did not show significance for TMS preference. Future studies should continue to examine the Google effect and TMS preference, as they bear important applications for a number of fields.
-
Date Issued
-
2013
-
Identifier
-
CFH0004473, ucf:45118
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH0004473
-
-
Title
-
MULTI-TIERED SYSTEM OF SUPPORTS IN FLORIDA: EXPLORING THE KNOWLEDGE OF PARENTS WITHIN THE MTSS PROCESS.
-
Creator
-
Troisi, Stephanie, Little, Mary, University of Central Florida
-
Abstract / Description
-
In the American public school system, as of 2011, over 8% of students are placed in special education programs. To provide early intervention for struggling students before placement into special education services, a three-tier model called Response to Intervention (RtI) was put into effect (FDOE, 2009). RtI, currently known as a Multi-Tiered System of Supports (MTSS), is a multi-tiered system for struggling learners that provides increasingly intense levels of academic interventions and assessment (Bryd, 2011). Early intervention is a set of services for students who are at risk of, or who currently have, developmental delays or social-emotional problems (Guralnick, 2005). MTSS focuses on six core components: (1) evidence-based curriculum, instruction, intervention, and extension; (2) assessment and progress monitoring; (3) data-based decision making; (4) leadership; (5) family, school, and community partnerships; and (6) cultural responsivity (Kashima, Schleich, & Spradlin, 2009). The goal of this research is to gain a clearer understanding of parents' perception of the MTSS process, their knowledge of the MTSS process, and their involvement in school-based reading interventions for their children who are receiving intensive interventions at the UCF Reading Clinic. I discovered that overall there was dissatisfaction with both the communication between the parents and the school and the support provided for students. The majority of the parents surveyed recognized the term MTSS, but they lacked a deep understanding of the process. Overall, there seemed to be a lack of understanding about how MTSS related to their student and what it meant for their child's education.
-
Date Issued
-
2014
-
Identifier
-
CFH0004699, ucf:45251
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFH0004699
-
-
Title
-
SOLITARY WAVE FAMILIES IN TWO NON-INTEGRABLE MODELS USING REVERSIBLE SYSTEMS THEORY.
-
Creator
-
Leto, Jonathan, Choudhury, S. Roy, University of Central Florida
-
Abstract / Description
-
In this thesis, we apply a recently developed technique to comprehensively categorize all possible families of solitary wave solutions in two models of topical interest. The models considered are: a) the Generalized Pochhammer-Chree Equations, which govern the propagation of longitudinal waves in elastic rods, and b) a generalized microstructure PDE. Limited analytic results exist for the occurrence of one family of solitary wave solutions for each of these equations. Since solitary wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions of both models here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves for each model, we find a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. For the microstructure equation, the new family of solutions occurs in regions of parameter space distinct from the known solitary wave solutions and is thus entirely new. Directions for future work, including the dynamics of each family of solitary waves using exponential asymptotics techniques, are also mentioned.
-
Date Issued
-
2008
-
Identifier
-
CFE0002151, ucf:47930
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0002151
-
-
Title
-
SUSTAINABLE FAULT-HANDLING OF RECONFIGURABLE LOGIC USING THROUGHPUT-DRIVEN ASSESSMENT.
-
Creator
-
Sharma, Carthik, DeMara, Ronald, University of Central Florida
-
Abstract / Description
-
A sustainable Evolvable Hardware (EH) system is developed for SRAM-based reconfigurable Field Programmable Gate Arrays (FPGAs) using outlier detection and group testing-based assessment principles. The fault diagnosis methods presented herein leverage throughput-driven, relative fitness assessment to maintain resource viability autonomously. Group testing-based techniques are developed for adaptive, input-driven fault isolation in FPGAs without the need for exhaustive testing or coding-based evaluation. The techniques keep the device operational and, when possible, generate validated outputs throughout the repair process. Adaptive fault isolation methods based on discrepancy-enabled pairwise comparisons are developed. By observing the discrepancy characteristics of multiple Concurrent Error Detection (CED) configurations, a method for robust detection of faults is developed based on pairwise parallel evaluation using Discrepancy Mirror logic. The results from the analytical FPGA model are demonstrated via a self-healing, self-organizing evolvable hardware system. Reconfigurability of the SRAM-based FPGA is leveraged to identify logic resource faults, which are successively excluded by group testing using alternate device configurations. This simplifies the system architect's role to defining functionality in a high-level Hardware Description Language (HDL) and choosing a system-level performance-versus-availability operating point. System availability, throughput, and mean time to isolate faults are monitored and maintained using an Observer-Controller model. Results are demonstrated using a Data Encryption Standard (DES) core that occupies approximately 305 FPGA slices on a Xilinx Virtex-II Pro FPGA. With a single simulated stuck-at fault, the system identifies a completely validated replacement configuration within three to five positive tests. The approach demonstrates a readily implemented yet robust organic hardware application framework featuring a high degree of autonomous self-control.
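The group-testing idea behind this abstract can be stated abstractly: each alternate configuration exercises a subset of resources, and any configuration that produces correct output exonerates every resource it uses, so the faulty resource is whatever is never exonerated. The toy model below illustrates that principle only; the resource/configuration representation is invented for illustration and is not the dissertation's actual FPGA flow.

```python
def isolate_faults(resources, configurations, passes):
    """Narrow a suspect set by group testing over alternate configurations.

    configurations: dict mapping a configuration name to the set of
    resources it uses.  passes: callable(name) -> True if that
    configuration produced correct (validated) output.
    """
    suspects = set(resources)
    for name, used in configurations.items():
        if passes(name):
            # A passing configuration exonerates every resource it used.
            suspects -= used
    return suspects
```

With overlapping configurations, a handful of passing tests can shrink the suspect set to a single resource, mirroring the "three to five positive tests" result reported above.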
-
Date Issued
-
2008
-
Identifier
-
CFE0002329, ucf:47813
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0002329
-
-
Title
-
METADATA AND DATA MANAGEMENT IN HIGH PERFORMANCE FILE AND STORAGE SYSTEMS.
-
Creator
-
Gu, Peng, Wang, Jun, University of Central Florida
-
Abstract / Description
-
With the advent of emerging "e-Science" applications, today's scientific research increasingly relies on petascale-and-beyond computing over large data sets of the same magnitude. While the computational power of supercomputers has recently entered the petascale era, the performance of their storage systems lags behind by many orders of magnitude. This places an imperative demand on revolutionizing the underlying I/O systems, in which the management of both metadata and data has significant performance implications. Prefetching/caching and data-locality-aware optimizations, as conventional and effective management techniques for metadata and data I/O performance enhancement, still play crucial roles in current parallel and distributed file systems. In this study, we examine the limitations of existing prefetching/caching techniques and explore the untapped potential of data locality optimization techniques in the new era of petascale computing. For metadata I/O access, we propose a novel weighted-graph-based prefetching technique, built on both direct and indirect successor relationships, to reap performance benefits from prefetching specifically for clustered metadata servers, an arrangement envisioned as necessary for petabyte-scale distributed storage systems. For data I/O access, we design and implement Segment-structured On-disk data Grouping and Prefetching (SOGP), a combined prefetching and data placement technique that boosts local data read performance for parallel file systems, especially for applications with partially overlapped access patterns. A high-performance local I/O software package from the SOGP work for the Parallel Virtual File System, about 2,000 lines of C, was released to Argonne National Laboratory in 2007 for potential integration into production.
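Successor-graph prefetching of the kind the abstract proposes can be sketched simply: record how often each metadata access follows another as edge weights, and prefetch the heaviest successors of the current access. This is an illustrative reconstruction of the general technique, not the dissertation's implementation (which also uses indirect successors).

```python
from collections import defaultdict

class SuccessorPrefetcher:
    """Weighted successor graph: edge (a, b) counts how often access b
    directly followed access a; prefetch candidates for an item are its
    highest-weight successors."""

    def __init__(self):
        self.edges = defaultdict(lambda: defaultdict(int))
        self.prev = None

    def record(self, item):
        """Observe one access and update the transition weights."""
        if self.prev is not None:
            self.edges[self.prev][item] += 1
        self.prev = item

    def predict(self, item, k=2):
        """Return up to k likely next accesses, heaviest edge first."""
        succ = self.edges.get(item, {})
        return sorted(succ, key=succ.get, reverse=True)[:k]
```

After observing an access stream, `predict` gives the metadata objects worth prefetching when a given object is touched again.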
-
Date Issued
-
2008
-
Identifier
-
CFE0002251, ucf:47826
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0002251
-
-
Title
-
Assessing the Impact of Multi-variate Steering-rate Vehicle Control on Driver Performance in a Simulation Framework.
-
Creator
-
Xynidis, Michael, Morrow, Patricia Bockelman, Karwowski, Waldemar, Martin, Glenn, O'Neal, Thomas, Xanthopoulos, Petros, Mouloua, Mustapha, University of Central Florida
-
Abstract / Description
-
When a driver turns a steering-wheel, he or she normally expects the vehicle's steering system to communicate an equivalent amount of signal to the road-wheels. This relationship is linear and holds regardless of the steering-wheel's position within its rotational travel. The linear steering paradigm in passenger vehicles has gone largely unchanged since mass production of passenger vehicles began in 1901. However, as more electronically controlled steering systems appear in conjunction with the development of autonomous steering functions in vehicles, an opportunity arises to advance the existing steering paradigms. The following framework takes a human-factors approach toward examining and evaluating alternative steering systems by using Modeling and Simulation methods to track and score human performance.
Present conventional steering systems apply a linear relationship between the steering-wheel and the road-wheels of a vehicle. The rotational travel of the steering-wheel is 900° and requires two-and-a-half revolutions to travel from end-stop to opposite end-stop. The experimental steering system modeled and employed in this study applies a dynamic curve response to the steering input within a shorter, 225° rotational travel. Accommodation variances, based on vehicle speed and steering-wheel rotational position and acceleration, moderate the apparent steering input to produce a more practical, effective steering rate. This novel model follows a paradigm supporting the full range of steering-wheel actuation without necessitating hand repositioning or the removal of the driver's hands from the steering-wheel during steering maneuvers.
In order to study human-performance disparities between the novel and conventional steering models, a custom simulator was constructed and programmed to render representative models in a test scenario. Twenty-seven males and twenty-seven females, ranging in age from eighteen to sixty-five, were tested and scored using the driving simulator, which presented two successive driving test vignettes: one using conventional 900° steering with linear response and the other employing the augmented 225° multivariate, non-linear steering.
The results from simulator testing suggest that both males and females perform better with the novel system, supporting the hypothesis that drivers of either gender perform better with a system augmented with 225° multivariate, non-linear steering than with a conventional steering system. Further analysis of the simulated-driving scores indicates performance parity between male and female participants, supporting the hypothesis positing no significant difference in driver performance between male and female drivers using the augmented steering system. Finally, composite data from written questionnaires support the hypothesis that drivers prefer driving the augmented system over conventional steering.
These collective findings support justification for testing and refining novel steering systems using Modeling and Simulation methods. As a product of this particular study, a tested and open-sourced simulation framework now exists such that researchers and automotive designers can develop and evaluate their own steering-oriented products within a valid human-factors construct. The open-source nature of this framework implies a commonality by which otherwise disparate research and development work can be associated. Extending this framework beyond basic investigation to applications requiring more specialized parameters may even benefit drivers having special needs. For example, steering-system functional characteristics could be comparatively optimized to accommodate individuals with upper-body deficits or limited use of either or both arms. Moreover, the combined human-factors and open-source approaches distinguish the products of this research as a common and extensible platform by which purposeful automotive-industry improvements can be realized, contrasted with arbitrary improvements brought about predominantly to showcase technological advancements.
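A multivariate steering map of the general kind described, non-linear in wheel position within a 225° travel (so ±112.5° end-stops) and attenuated with vehicle speed, can be illustrated with a toy transfer function. The curve shape and every constant below are invented for illustration; the dissertation's actual accommodation model is not reproduced here.

```python
def road_wheel_angle(wheel_deg, speed_kmh, max_wheel=112.5, max_road=35.0):
    """Toy non-linear steering map: steering-wheel angle -> road-wheel angle.

    Gentle near center, faster toward the 112.5-degree end stops (cubic
    blend), and attenuated at higher speeds for stability.
    """
    frac = max(-1.0, min(1.0, wheel_deg / max_wheel))   # normalize to [-1, 1]
    shaped = frac ** 3 * 0.6 + frac * 0.4               # soft center, sharp ends
    speed_gain = 1.0 / (1.0 + speed_kmh / 80.0)         # less authority when fast
    return max_road * shaped * speed_gain
```

Under this sketch, full lock at rest gives the full 35° road-wheel angle, while the same wheel input at 80 km/h yields half that, one plausible way a "more practical, effective steering rate" could be realized.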
-
Date Issued
-
2018
-
Identifier
-
CFE0007420, ucf:52706
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007420
-
-
Title
-
Methods to Calculate Cut Volumes for Fault Trees with Dependencies Induced by Spatial Locations.
-
Creator
-
Hanes, Phillip, Wiegand, Rudolf, Wu, Annie, DeMara, Ronald, Song, Zixia, University of Central Florida
-
Abstract / Description
-
Fault tree analysis (FTA) is used to find and mitigate vulnerabilities in systems based on their constituent components. Methods exist to efficiently find minimal cut sets (MCS), which are combinations of components whose failure causes the overall system to fail. However, traditional FTA ignores the physical location of the components. Components in close proximity to each other could be defeated by a single event with a radius of effect, such as an explosion or fire. Events such as the Deepwater Horizon explosion and subsequent oil spill demonstrate the potentially devastating risk posed by such spatial dependencies. This motivates the search for techniques to identify this type of vulnerability. Adding physical locations to the fault tree structure can help identify possible points of failure in the overall system caused by localized disasters. Since existing FTA methods cannot address these concerns, using this information requires extending existing solution methods or developing entirely new ones.
A problem complicating research in FTA is the lack of benchmark problems for evaluating methods, especially for fault trees over one hundred components. This research presents a method of using Lindenmayer systems (L-systems) to generate fault trees that are reproducible, capable of producing fault trees with properties similar to real-world designs, and scalable while maintaining predictable structural properties. This approach will be useful for testing and analyzing different methodologies for FTA tasks at different scales and under different conditions.
Using a set of benchmark fault trees derived from L-systems, three approaches to finding these vulnerabilities were explored in this research. The approaches were compared by defining a metric called "minimal cut volumes" (MCV) for describing volumes of effect that defeat the system. Since no existing methods are known for solving this problem, the methods are compared to each other to evaluate performance.
1) The control method executes traditional FTA software to find minimal cut sets (MCS), then extends this approach by searching for clusters in the resulting MCS to find MCV.
2) The next method starts by searching for clusters of components in three-dimensional space, then evaluates combinations of clusters to find MCV that defeat the system.
3) The last method uses an evolutionary algorithm to search the space directly by selecting center points, then using the radius of the smallest sphere(s) as the fitness value for identifying MCV.
Results generated using each method are presented. The performance of each method is compared to that of the control method, and their utility is evaluated accordingly.
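The control method's core computation can be sketched: given minimal cut sets and component coordinates, score each cut set by the radius of a sphere covering its members; the smallest such radius over all cut sets bounds the smallest localized event that defeats the system. The sketch below uses a centroid-centered sphere as a cheap approximation of the smallest enclosing sphere; it illustrates the metric's shape, not the dissertation's exact algorithm.

```python
import math

def covering_radius(points):
    """Radius of the centroid-centered sphere covering all 3-D points."""
    n = len(points)
    centroid = (sum(p[0] for p in points) / n,
                sum(p[1] for p in points) / n,
                sum(p[2] for p in points) / n)
    return max(math.dist(centroid, p) for p in points)

def minimal_cut_volume(cut_sets, locations):
    """Smallest covering-sphere radius over all minimal cut sets.

    cut_sets: iterable of sets of component names.
    locations: dict mapping component name -> (x, y, z).
    """
    return min(covering_radius([locations[c] for c in cs])
               for cs in cut_sets)
```

A tightly clustered cut set yields a small radius, flagging exactly the spatial vulnerability, several critical components defeatable by one blast or fire, that traditional location-blind FTA misses.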
-
Date Issued
-
2018
-
Identifier
-
CFE0007403, ucf:52075
-
Format
-
Document (PDF)
-
PURL
-
http://purl.flvc.org/ucf/fd/CFE0007403