Current Search: Myers, Michael
- Title
- The Early Modern Space: (Cartographic) Literature and the Author in Place.
- Creator
-
Myers, Michael, Gleyzon, Francois-Xavier, University of Central Florida
- Abstract / Description
-
In geography, maps are a tool of placement that locates both the cartographer and the territory made cartographic. In order to place objects in space, the cartographer inserts his own judgment into the scheme of his design. During the Early Modern period, maps were no longer the suspicious icons they had been in the Middle Ages, and not yet products of science, but subjects of discourse and works of art. The image of a cartographer's territory depended on his vision, both the nature and placement of his gaze, and the product reflected that author's judgment. This is not a study of maps as such but of Early Modern literature that is cartographic by nature: the observations of the author were the motif of its design. However, rather than concretize observational judgment through art, the Early Modern literature discussed here asserts a reverse relation: the generation of the observable material, the reality, by the views of authors. Spatiality is now an emerging philosophical field of study, taking root in the philosophy of Deleuze & Guattari. Using a notion prevalent in both Postmodern and Early Modern spatiality, rooted in the critique of Kant, that makes of perception a collective delusion, this thesis draws a through-line across time: texts such as Robert Burton's The Anatomy of Melancholy, Thomas More's Utopia, and selections from William Shakespeare display a tendency to remove value from the standard of representation, to replace meaning with cognition, and to prioritize a view of views over an observable world. Only John Milton approaches perception as possibly referential to objective reality, by re-inserting his ability to observe and exist in that reality, in a corpus that becomes less a generative simulation of material than a concrete signpost to his judgment in the world.
- Date Issued
- 2015
- Identifier
- CFH0004899, ucf:53148
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH0004899
- Title
- Complex-valued adaptive digital signal enhancement for applications in wireless communication systems.
- Creator
-
Liu, Ying, Mikhael, Wasfy, Batarseh, Issa, Yang, Thomas, Hunter, Matthew, Haralambous, Michael, Myers, Brent, University of Central Florida
- Abstract / Description
-
In recent decades, the wireless communication industry has attracted a great deal of research effort to satisfy rigorous performance requirements while preserving high spectral efficiency. Along with this trend, I/Q modulation is frequently applied in modern wireless communications to develop high-performance, high-data-rate systems. This has created the need for efficient complex-valued signal processing techniques in highly integrated, multi-standard receiver devices.

In this dissertation, novel techniques for complex-valued digital signal enhancement are presented and analyzed for various applications in wireless communications. The first technique is a unified block processing approach that generates complex-valued conjugate gradient Least Mean Square (LMS) techniques with optimal adaptations. The proposed algorithms exploit the concept of complex conjugate gradients to find orthogonal directions for updating the adaptive filter coefficients at each iteration. Along each orthogonal direction, the presented algorithms employ the complex Taylor series expansion to calculate time-varying convergence factors tailored to the adaptive filter coefficients. The performance of the developed technique is tested in channel estimation, channel equalization, and adaptive array beamforming; compared with state-of-the-art methods, the proposed techniques demonstrate improved performance and exhibit desirable characteristics for practical use. (A minimal sketch of a complex-valued LMS update follows this record.)

The second complex-valued signal processing technique is a novel Optimal Block Adaptive algorithm based on Circularity, OBA-C. The proposed OBA-C method compensates for an imbalanced complex signal by restoring its circularity. In addition, by utilizing the complex Taylor series expansion, the OBA-C method optimally updates the adaptive filter coefficients at each iteration. This algorithm can be applied to mitigate frequency-dependent I/Q mismatch effects in the analog front end. Simulation results indicate that, compared with existing methods, OBA-C exhibits superior convergence speed while maintaining excellent accuracy.

The third technique concerns interference rejection in communication systems. Both LMS-based and Independent Component Analysis (ICA)-based techniques continue to receive significant attention in the area of interference cancellation. The performance of the LMS- and ICA-based approaches is studied for signals with different probability distributions. Our research indicates that the ICA-based approach works better for super-Gaussian signals, while the LMS-based method is preferable for sub-Gaussian signals. Therefore, an appropriate choice of interference suppression algorithm can be made to satisfy the ever-increasing demand for better performance in modern receiver design.
- Date Issued
- 2012
- Identifier
- CFE0004572, ucf:49192
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004572
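The abstract above describes block conjugate-gradient LMS adaptation with optimal, time-varying convergence factors. As a point of reference only, here is a minimal sketch of the textbook complex-valued LMS update applied to channel estimation, in Python/NumPy; the fixed step size `mu`, tap count, and test channel `h` are illustrative assumptions, and the dissertation's conjugate-gradient and Taylor-series machinery is not reproduced.

```python
import numpy as np

def complex_lms(x, d, num_taps=8, mu=0.05):
    """Textbook complex-valued LMS for channel estimation.

    A simplified stand-in for the dissertation's block conjugate-gradient
    variants: mu is a fixed, hand-picked step size rather than an optimal
    time-varying convergence factor.
    """
    w = np.zeros(num_taps, dtype=complex)       # adaptive filter coefficients
    e = np.zeros(len(x), dtype=complex)         # error history
    for n in range(num_taps, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]     # recent inputs, newest first
        y = np.vdot(w, u)                       # filter output, w^H u
        e[n] = d[n] - y                         # error vs. desired response
        w += mu * np.conj(e[n]) * u             # complex LMS coefficient update
    return w, e

# Hypothetical test: estimate a 3-tap complex channel from its input/output.
rng = np.random.default_rng(0)
h = np.array([1.0, 0.5 - 0.3j, 0.2j])                       # assumed channel
x = (rng.standard_normal(5000) + 1j * rng.standard_normal(5000)) / np.sqrt(2)
d = np.convolve(x, h)[:len(x)]                              # desired signal
w, e = complex_lms(x, d, num_taps=3)
print(np.conj(w))                     # converges toward h (output is w^H u)
print(np.mean(np.abs(e[-500:])**2))   # residual error power after adaptation
```

With unit-power input the update is stable for mu well below 2 / (num_taps); the block methods in the abstract replace this hand tuning with per-direction optimal factors.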
- Title
- Post Conversion Correction of Non-Linear Mismatches for Time Interleaved Analog-to-Digital Converters.
- Creator
-
Parkey, Charna, Mikhael, Wasfy, Qu, Zhihua, Georgiopoulos, Michael, Myers, Brent, Wei, Lei, Chester, David, University of Central Florida
- Abstract / Description
-
Time-Interleaved Analog-to-Digital Converters (TI-ADCs) utilize an architecture which enables conversion rates well beyond the capabilities of a single converter while preserving most or all of the other performance characteristics of the converters on which the architecture is based. Most of the approaches discussed here are independent of architecture; some solutions take advantage of specific architectures. Chapter 1 formulates the problem and reviews the errors found in ADCs, along with a brief literature review of available TI-ADC error correction solutions. Chapter 2 presents the methods and materials used in implementation and extends the state of the art for post-conversion correction. Chapter 3 presents the simulation results of this work, and Chapter 4 concludes it.

The contribution of this research is threefold. First, a new behavioral model was developed in Simulink™ and MATLAB™ to model and test linear and nonlinear mismatch errors, emulating the performance data of actual converters; the details of this model are presented, together with the results of cumulant statistical calculations of the mismatch errors, followed by a detailed explanation and performance evaluation of the extension developed in this research effort. (A minimal linear-mismatch sketch of such a behavioral model follows this record.) Second, leading post-conversion correction methods are presented, and an extension with derivations is given. Third, it is shown that the data converter subsystem architecture developed is capable of realizing better performance than those currently reported in the literature while having a more efficient implementation.
- Date Issued
- 2015
- Identifier
- CFE0005683, ucf:50171
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005683
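The dissertation's behavioral model covers linear and nonlinear mismatches; as a hedged illustration of the general idea, the following Python/NumPy sketch models only the simplest linear channel mismatches (per-channel gain and offset) in an M-way interleaved converter. The sample rate, tone frequency, and mismatch values are arbitrary assumptions; the nonlinear terms and cumulant statistics described above are not modeled.

```python
import numpy as np

def ti_adc(x, gains, offsets):
    """Interleave len(gains) sub-ADC channels over the sample stream x,
    applying per-channel gain and offset mismatch (linear errors only)."""
    y = np.empty_like(x)
    m = len(gains)
    for k in range(m):                 # channel k converts samples k, k+m, ...
        y[k::m] = gains[k] * x[k::m] + offsets[k]
    return y

# Assumed test setup: normalized fs = 1, a single tone, 2-way interleaving.
n = 4096
fin = 0.137                                      # tone frequency, cycles/sample
x = np.sin(2 * np.pi * fin * np.arange(n))
y = ti_adc(x, gains=[1.0, 0.98], offsets=[0.0, 0.01])

spec = np.abs(np.fft.rfft(y * np.hanning(n))) / n
freqs = np.fft.rfftfreq(n)
# Gain mismatch creates an image spur at 0.5 - fin; offset mismatch puts a
# tone at 0.5 (Nyquist) plus a DC shift.
for f in (fin, 0.5 - fin, 0.5):
    k = int(np.argmin(np.abs(freqs - f)))
    print(f"{f:.3f} cycles/sample: {20 * np.log10(spec[k] + 1e-12):.1f} dB")
```

Post-conversion correction methods of the kind the abstract surveys estimate these spur-generating mismatch parameters from the output spectrum and invert them digitally.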
- Title
- Time and Space Efficient Techniques for Facial Recognition.
- Creator
-
Alrasheed, Waleed, Mikhael, Wasfy, DeMara, Ronald, Haralambous, Michael, Wei, Lei, Myers, Brent, University of Central Florida
- Abstract / Description
-
In recent years, there has been increasing interest in face recognition, and many new facial recognition techniques have been introduced. Recent developments in the field have led to an increase in the number of available commercial face recognition products. However, face recognition techniques are currently constrained by three main factors: recognition accuracy, computational complexity, and storage requirements. The problem is that most current face recognition techniques succeed in improving one or two of these factors at the expense of the others.

In this dissertation, four novel face recognition techniques that improve the storage and computational requirements of face recognition systems are presented and analyzed. Three of the four techniques, namely Quantized/Truncated Transform Domain (QTD), Frequency Domain Thresholding and Quantization (FD-TQ), and Normalized Transform Domain (NTD), utilize the two-dimensional Discrete Cosine Transform (DCT-II), which reduces the dimensionality of facial feature images and thereby the computational complexity. (A minimal sketch of DCT-based feature truncation follows this record.) The fourth technique, Normalized Histogram Intensity (NHI), is based on the pixel intensity histograms of pose subimages, which reduces both the computational complexity and the storage requirements. Various simulation experiments using MATLAB were conducted to test the proposed methods. To benchmark their performance, the experiments also used current state-of-the-art face recognition techniques, namely Two-Dimensional Principal Component Analysis (2DPCA), Two-Directional Two-Dimensional Principal Component Analysis ((2D)^2PCA), and Transform Domain Two-Dimensional Principal Component Analysis (TD2DPCA), applied to the ORL, Yale, and FERET databases.

The experimental results confirm that each of the four novel techniques examined in this study yields a significant reduction in computational complexity and storage requirements compared to the state-of-the-art techniques, without sacrificing recognition accuracy.
- Date Issued
- 2013
- Identifier
- CFE0005297, ucf:50566
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005297
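The shared core of the transform-domain methods above is truncating the 2-D DCT to its low-frequency block. Here is a minimal Python sketch of that step with nearest-neighbor matching, assuming SciPy is available; the block size `keep`, the distance metric, and the random toy gallery are illustrative assumptions, and the quantization, thresholding, and normalization stages that distinguish QTD, FD-TQ, and NTD are not shown.

```python
import numpy as np
from scipy.fft import dctn

def dct_features(img, keep=8):
    """2-D DCT (DCT-II) of a face image, truncated to the top-left
    keep x keep low-frequency block -- the dimensionality-reduction
    step shared by the transform-domain methods in the abstract."""
    coeffs = dctn(img.astype(float), norm='ortho')
    return coeffs[:keep, :keep].ravel()

def classify(probe_img, gallery_imgs, gallery_labels, keep=8):
    """Nearest-neighbor matching in the truncated DCT feature space."""
    probe = dct_features(probe_img, keep)
    feats = np.stack([dct_features(g, keep) for g in gallery_imgs])
    dists = np.linalg.norm(feats - probe, axis=1)
    return gallery_labels[int(np.argmin(dists))]

# Toy usage with random "faces"; a real test would use ORL/Yale/FERET images.
rng = np.random.default_rng(1)
gallery = [rng.random((112, 92)) for _ in range(4)]    # ORL-sized images
labels = ["s1", "s2", "s3", "s4"]
noisy_probe = gallery[2] + 0.05 * rng.standard_normal((112, 92))
print(classify(noisy_probe, gallery, labels))          # expected: "s3"
```

Keeping an 8 x 8 block reduces a 112 x 92 image to 64 features, which is where the storage and computational savings the abstract reports come from.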