Current Search: design process
- Title
- SAVAGE IN LIMBO: A STUDY IN LIGHTING DESIGN.
- Creator
-
Haines, Kenneth, Perry, Charles, University of Central Florida
- Abstract / Description
-
Designing the elements of a theatrical production is a unique and often experimental process. The process changes from show to show, and without a background in lighting it can be difficult for a viewer to distinguish mistakes from design choices. That is why it is important to examine the design process step by step. Two questions guided my design of Savage In Limbo: how the director's concept blended with the design, and whether the integrity of the designer's vision was evident on stage. To explore these questions, script analysis and consideration of the director's vision are two essential processes. Additionally, an exploration of the design process will better describe the growth and personal achievements of the design. This thesis documents the lighting design for the University of Central Florida's 2011 production of John Patrick Shanley's Savage In Limbo. The project highlights the design achievements and the goals described above, and creates a formal dialogue on this specific design in order to provide insight into the process. In analyzing the design, it was important that I assessed the process as well as the product, examining whether the design met the expectations of the script and the audience. This thesis also explores how my past experiences, education, and current skill level prepared me for this design process, in order to create a guideline for others interested in developing the knowledge needed for design.
- Date Issued
- 2012
- Identifier
- CFH0004206, ucf:44969
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004206
- Title
- STUDY OF DESIGN FOR RELIABILITY OF RF AND ANALOG CIRCUITS.
- Creator
-
Tang, Hongxia, Yuan, Jiann-Shiun, Wu, Xinzhang, Sundaram, Kalpathy, Chow, Lee, University of Central Florida
- Abstract / Description
-
Due to continued scaling of device dimensions, CMOS transistors in the nanometer regime face major reliability and variability challenges. Reliability issues such as channel hot-electron injection, gate dielectric breakdown, and negative bias temperature instability (NBTI) must be accounted for in the design of robust RF circuits. In addition, process variation in nanoscale CMOS transistors is another major concern in today's circuit design. An adaptive gate-source biasing scheme to improve RF circuit reliability is presented in this work. The adaptive method automatically adjusts the gate-source voltage to compensate for the reduction in drain current caused by various device reliability mechanisms. A class-AB RF power amplifier shows that the use of a source resistance makes the power-added efficiency robust against threshold-voltage and mobility variations, while the use of a source inductance is more reliable for the input third-order intercept point. An RF power amplifier with adaptive gate biasing is proposed to mitigate device reliability degradation and process variation. The performance of the power amplifier with adaptive gate biasing is compared with that of the amplifier without it. Adaptive gate biasing makes the power amplifier more resilient to process variations as well as to device aging effects such as mobility and threshold-voltage degradation. Injection-locked voltage-controlled oscillators (VCOs) have also been examined. The VCOs are implemented in TSMC 0.18 µm mixed-signal CMOS technology. The injection-locked oscillators show better phase-noise performance than free-running oscillators. A differential Clapp-VCO has been designed and fabricated to evaluate hot-electron reliability. The differential Clapp-VCO is formed using cross-coupled nMOS transistors, on-chip transformers/inductors, and voltage-controlled capacitors. The experimental data demonstrate that hot-carrier damage increases the oscillation frequency and degrades the phase noise of the Clapp-VCO. A p-channel-transistor-only VCO has been designed for low phase noise. Simulation results show that the phase noise degrades after NBTI stress at elevated temperature, due to increased interface states after NBTI stress. The process variability has also been evaluated.
- Date Issued
- 2012
- Identifier
- CFE0004223, ucf:49000
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004223
- Title
- Quality by Design Procedure for Continuous Pharmaceutical Manufacturing: An Integrated Flowsheet Model Approach.
- Creator
-
Vezina, Ashley, Elshennawy, Ahmad, Rabelo, Luis, Karwowski, Waldemar, University of Central Florida
- Abstract / Description
-
Pharmaceutical manufacturing is crucial to global healthcare and requires a higher, more consistent level of quality than any other industry. Yet traditional pharmaceutical batch manufacturing has remained largely unchanged over the last fifty years due to high R&D costs, shorter patent durations, and regulatory uncertainty. This has led regulatory bodies to promote modernization of manufacturing processes toward continuous pharmaceutical manufacturing (CPM) by introducing new methodologies including quality by design (QbD), design space, and process analytical technology (PAT). This represents a shift away from the traditional pharmaceutical manufacturing way of thinking toward a risk-based approach that promotes increased product and process knowledge through a data-rich environment. While both the literature and regulatory bodies acknowledge the need for modernization, manufacturers have been slow to modernize due to uncertainty and lack of confidence in applying these methodologies. This paper describes the current applications of QbD principles and the current regulatory environment, leveraging regulatory guidelines and CPM literature to identify gaps in the literature. To help close the gap between QbD theory and QbD application, a QbD algorithm for CPM using an integrated flowsheet model is also developed and analyzed. This will help increase manufacturers' confidence in CPM by providing answers to questions about the CPM business case, applications of QbD tools, process validation and sensitivity, and process and equipment characteristics. An integrated flowsheet model will aid decision-making and process optimization, breaking away from the ex silico methods extensively covered in the literature.
- Date Issued
- 2017
- Identifier
- CFE0006923, ucf:51683
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006923
- Title
- Design Disjunction for Resilient Reconfigurable Hardware.
- Creator
-
Alzahrani, Ahmad, DeMara, Ronald, Yuan, Jiann-Shiun, Lin, Mingjie, Wang, Jun, Turgut, Damla, University of Central Florida
- Abstract / Description
-
Contemporary reconfigurable hardware devices can achieve the high performance, power efficiency, and adaptability required to meet a wide range of design goals. With scaling challenges facing current complementary metal-oxide-semiconductor (CMOS) technology, new concepts and methodologies supporting efficient adaptation to handle reliability issues are becoming increasingly prominent. Reconfigurable hardware and its ability to realize self-organization features are expected to play a key role in designing future dependable hardware architectures. However, the exponential increase in density and complexity of current commercial SRAM-based field-programmable gate arrays (FPGAs) has escalated the overhead associated with dynamic runtime design adaptation. Traditionally, static modular redundancy techniques are considered to surmount this limitation; however, they can incur substantial overheads in both area and power requirements. To achieve a better trade-off among performance, area, power, and reliability, this research proposes design-time approaches that enable fine selection of redundancy level based on target reliability goals and autonomous adaptation to runtime demands. To achieve this goal, three studies were conducted. First, a graph- and set-theoretic approach, named Hypergraph-Cover Diversity (HCD), is introduced as a preemptive design technique to shift the dominant costs of resiliency to design time. In particular, union-free hypergraphs are exploited to partition the pool of reconfigurable resources into highly separable subsets, each of which can be utilized by the same synthesized application netlist. The diverse implementations provide reconfiguration-based resilience throughout the system lifetime while avoiding the significant overheads associated with runtime placement and routing phases. Evaluation on a Motion-JPEG image compression core using a Xilinx 7-series-based FPGA hardware platform has demonstrated the potential of the proposed fault-tolerance method to achieve 37.5% area savings and up to 66% reduction in power consumption compared to the frequently used TMR scheme while providing superior fault tolerance. Second, Design Disjunction, based on non-adaptive group testing, is developed to realize a low-overhead fault-tolerant system capable of self-testing and self-recovery using runtime partial reconfiguration. Reconfiguration is guided by resource-grouping procedures which employ non-linear measurements given by the constructive property of f-disjunctness to extend runtime resilience to a large fault space and realize a favorable range of trade-offs. Disjunct designs are created using the mosaic convergence algorithm developed such that at least one configuration in the library evades any occurrence of up to d resource faults, where d is lower-bounded by f. Experimental results for a set of MCNC and ISCAS benchmarks have demonstrated f-diagnosability at the individual slice level with an average isolation resolution of 96.4% (94.4%) for f = 1 (f = 2) while incurring an average critical-path delay impact of only 1.49% and area cost roughly comparable to conventional 2-MR approaches. Finally, the proposed Design Disjunction method is evaluated as a design-time method to improve timing yield in the presence of large random within-die (WID) process variations for applications with moderately high production capacity.
- Date Issued
- 2015
- Identifier
- CFE0006250, ucf:51086
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006250
- Title
- LIGHTING DESIGN FOR FROM SUN TO SUN: A DAY IN A RAILROAD CAMP.
- Creator
-
Szewczyk, Nathan, Scott, Bert, University of Central Florida
- Abstract / Description
-
This thesis discusses a theoretical approach to the beginning stages of designing lighting for a theatrical production. The research question is: how does a theoretical approach to entering the design process enhance the final lighting design? The target audience for this study is theatrical lighting designers. A theoretical approach, in this case to the beginning of the design process, can be described as utilizing current dramatic theories to develop a better understanding of the design of this production. To better understand this topic, one needs to know how a lighting design is typically created and where the theoretical approach is implemented. An issue with this approach is that the short period allowed for the design process does not leave sufficient time to utilize a theoretical approach in a real-world setting. One way to determine whether this process is effective is through personal self-review. Journaling and discussion with my advisor for this production were the methods of data collection. The method of validation was a self-reflection at the end of the final performance. An issue with the collection process is its reliance on personal opinions, including the author's. There are no ethical issues relating to this study. When applied, a theoretical approach to the design process will enhance the quality of the final lighting design by allowing the designer to be better prepared for a specific scene with which he or she is struggling.
- Date Issued
- 2011
- Identifier
- CFE0003609, ucf:48874
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003609
- Title
- An Integrated Design for Six Sigma-Based Framework To Align Strategy, New Process Development, and Customer Requirements In The Airlines Industry.
- Creator
-
Alghamdi, Mohammed, Elshennawy, Ahmad, Rabelo, Luis, Lee, Gene, Ahmad, Ali, University of Central Florida
- Abstract / Description
-
When organizations create new strategy maps, key new processes are often identified. This is important for organizations to stay competitive in the global marketplace. This document describes the development, implementation, and validation of a framework that properly aligns and links an organization's strategy and new process development. The proposed framework integrates the Balanced Scorecard (BSC) management system and the Design for Six Sigma (DFSS) methodology, leveraging their strengths, overcoming their weaknesses, and identifying lessons learned to help bridge the gap between strategy development and execution. The critical-to-quality conceptual model serves as the integrative component of the framework. A literature search found little or no prior research on the development of similar frameworks. To demonstrate and evaluate the effectiveness of the framework in a real-world environment, a case study was carried out and implemented successfully. As the case study progressed, cycle time was estimated as a performance indicator and showed progress toward the targeted strategic objective. The developed framework helps decision-makers transition seamlessly from a strategic position to process development, linking strategic objectives to critical-to-quality features. This comprehensive framework can help move organizations from where they currently are to where they want to be, laying the groundwork needed for customer satisfaction and breakthrough performance.
- Date Issued
- 2016
- Identifier
- CFE0006246, ucf:51079
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006246
- Title
- Secondary World: The Limits of Ludonarrative.
- Creator
-
Dannelly, David, Adams, JoAnne, Price, Mark, Poindexter, Carla, Kovach, Keith, University of Central Florida
- Abstract / Description
-
Secondary World: The Limits of Ludonarrative is a series of short narrative animations that form a theoretical treatise on the limitations of western storytelling in video games. The series covers specific topics in film theory, game design, and art theory, particularly those associated with Gilles Deleuze, Jean Baudrillard, Jay Bolter, Richard Grusin, and Andy Clark. The use of imagery, editing, and presentation is intended to physically represent an extension of myself and my thinking process, united through the common thread of my personal feelings, thoughts, and experiences in the digital age.
- Date Issued
- 2014
- Identifier
- CFE0005155, ucf:50704
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005155
- Title
- Adaptive Architectural Strategies for Resilient Energy-Aware Computing.
- Creator
-
Ashraf, Rizwan, DeMara, Ronald, Lin, Mingjie, Wang, Jun, Jha, Sumit, Johnson, Mark, University of Central Florida
- Abstract / Description
-
Reconfigurable logic or Field-Programmable Gate Array (FPGA) devices have the ability to dynamically adapt the computational circuit based on user-specified or operating-condition requirements. Such hardware platforms are utilized in this dissertation to develop adaptive techniques for achieving reliable and sustainable operation while autonomously meeting these requirements. In particular, the properties of resource uniformity and in-field reconfiguration via on-chip processors are exploited to implement Evolvable Hardware (EHW). EHW utilizes genetic algorithms to realize logic circuits at runtime, as directed by the objective function. However, the size of problems solved using EHW has been limited to relatively compact circuits compared with traditional approaches, because the complexity of the genetic algorithm grows with circuit size. To address this scalability challenge, the Netlist-Driven Evolutionary Refurbishment (NDER) technique was designed and implemented herein to enable on-the-fly permanent fault mitigation in FPGA circuits. NDER has been shown to achieve refurbishment of relatively large benchmark circuits compared to related works. Additionally, Design Diversity (DD) techniques that aid such evolutionary refurbishment are proposed, and the efficacy of various DD techniques is quantified and evaluated. Similarly, there is a growing need for adaptable logic datapaths in custom-designed nanometer-scale ICs to ensure operational reliability in the presence of Process, Voltage, and Temperature (PVT) and transistor-aging variations owing to decreased feature sizes for electronic devices. Without such adaptability, excessive design guardbands are required to maintain the desired integration and performance levels. To address these challenges, the circuit-level technique of Self-Recovery Enabled Logic (SREL) was designed herein. At design time, vulnerable portions of the circuit identified using conventional Electronic Design Automation tools are replicated to provide post-fabrication adaptability via intelligent techniques. In-situ timing sensors are utilized in a feedback loop to activate suitable datapaths based on current conditions, optimizing performance and energy consumption. Primarily, SREL mitigates the timing degradation caused by transistor-aging effects in sub-micron devices by using power-gating to reduce the stress induced on active elements. As a result, fewer guardbands need to be included to achieve comparable performance levels, which leads to considerable energy savings over the operational lifetime. The need for energy-efficient operation in current computing systems has given rise to Near-Threshold Computing, as opposed to the conventional approach of operating devices at nominal voltage. In particular, the goal of the exascale computing initiative in High Performance Computing (HPC) is to achieve 1 EFLOPS within a power budget of 20 MW. However, this comes at the cost of increased reliability concerns, such as greater performance variation and more soft errors. This has increased the resiliency requirements for HPC applications in terms of ensuring functionality within given error thresholds while operating at lower voltages. My dissertation research devised techniques and tools to quantify the effects of radiation-induced transient faults in distributed applications on large-scale systems. A combination of compiler-level code transformation and instrumentation is employed for runtime monitoring to assess the speed and depth of application state corruption resulting from fault injection. Finally, fault-propagation models are derived for each HPC application that can be used to estimate the number of corrupted memory locations at runtime. Additionally, the trade-offs between performance and vulnerability, and the causal relations between compiler optimization and application vulnerability, are investigated.
- Date Issued
- 2015
- Identifier
- CFE0006206, ucf:52889
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006206