Current Search: Statistics
- Title
- MODELING AND CHARACTERIZATIONS OF NEW NOTIONS IN LIFE TESTING WITH STATISTICAL APPLICATIONS.
- Creator
-
Sepehrifar, Mohammad, Ahmad, Ibrahim, University of Central Florida
- Abstract / Description
-
Knowing the class to which a life distribution belongs gives us an idea about the aging of the device or system the life distribution represents, and enables us to compare the aging properties of different systems. This research intends to establish several new nonparametric classes of life distributions defined by the concept of inactivity time of a unit with a guaranteed minimum life length. These classes play an important role in the study of reliability theory, survival analysis, maintenance policies, economics, actuarial sciences and many other applied areas.
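The abstract does not spell out the defining quantity, but the "inactivity time" it refers to is conventionally formalized as follows; this is a standard reliability-theory definition, quoted from the general literature and included here only as background. For a lifetime X with distribution function F, the mean inactivity time at age t is

```latex
M(t) \;=\; \mathbb{E}\left[\,t - X \mid X \le t\,\right]
      \;=\; \frac{1}{F(t)}\int_{0}^{t} F(u)\,\mathrm{d}u,
      \qquad F(t) > 0,
```

and nonparametric aging classes of the kind described above are typically obtained by imposing monotonicity-type conditions on M(t), here for units conditioned on surviving a guaranteed minimum life length.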
- Date Issued
- 2006
- Identifier
- CFE0001316, ucf:47030
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001316
- Title
- NONPARAMETRIC MULTIVARIATE STATISTICAL PROCESS CONTROL USING PRINCIPAL COMPONENT ANALYSIS AND SIMPLICIAL DEPTH.
- Creator
-
Beltran, Luis, Malone, Linda, University of Central Florida
- Abstract / Description
-
Although there has been progress in the area of Multivariate Statistical Process Control (MSPC), there are numerous limitations as well as unanswered questions with the current techniques. MSPC charts plotting Hotelling's T2 require the normality assumption for the joint distribution among the process variables, an assumption that often does not hold in industrial settings, hence the motivation to investigate nonparametric techniques for multivariate data in quality control. In this research, the goal is to create a systematic distribution-free approach by extending current developments and focusing on dimensionality reduction using Principal Component Analysis. The proposed technique differs from current approaches in that it creates a nonparametric control chart using robust simplicial depth ranks of the first and last sets of principal components to improve signal detection in multivariate quality control with no distributional assumptions. The proposed technique has the advantages of ease of use and robustness in MSPC for monitoring variability and correlation shifts. By making the approach simple to use in an industrial setting, the probability of adoption is enhanced. Improved MSPC can result in cost savings and improved quality.
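The chart described above is built from simplicial depth ranks of principal-component scores. As a rough sketch of the depth computation itself, in two dimensions and with hypothetical function names (the PCA and ranking steps are omitted), the simplicial depth of a point is the fraction of data triangles that contain it:

```python
from itertools import combinations

def _cross(o, a, b):
    # Sign of the cross product: which side of line o->a the point b lies on.
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_triangle(p, a, b, c):
    # p lies in the closed triangle abc when the three cross-product
    # signs do not disagree (boundary points count as inside).
    d1, d2, d3 = _cross(a, b, p), _cross(b, c, p), _cross(c, a, p)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

def simplicial_depth(p, cloud):
    # Fraction of all triangles with vertices in `cloud` that contain p:
    # deep points sit centrally in the cloud, shallow points lie outside it.
    tris = list(combinations(cloud, 3))
    return sum(in_triangle(p, *t) for t in tris) / len(tris)
```

In a control-charting setting, a new observation whose depth rank relative to the in-control reference cloud is unusually low would signal a potential out-of-control condition.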
- Date Issued
- 2006
- Identifier
- CFE0001065, ucf:46792
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001065
- Title
- POLARIMETRY OF RANDOM FIELDS.
- Creator
-
Ellis, Jeremy, Dogariu, Aristide, University of Central Florida
- Abstract / Description
-
On temporal, spatial and spectral scales which are small enough, all fields are fully polarized. In the optical regime, however, instantaneous fields can rarely be examined, and, instead, only average quantities are accessible. The study of polarimetry is concerned with both the description of electromagnetic fields and the characterization of media a field has interacted with. The polarimetric information is conventionally presented in terms of second order field correlations which are averaged over the ensemble of field realizations. Motivated by the deficiencies of classical polarimetry in dealing with specific practical situations, this dissertation expands the traditional polarimetric approaches to include higher order field correlations and the description of fields fluctuating in three dimensions. In relation to characterization of depolarizing media, a number of fourth-order correlations are introduced in this dissertation. Measurements of full polarization distributions, and the subsequent evaluation of Stokes vector element correlations and Complex Degree of Mutual Polarization demonstrate the use of these quantities for material discrimination and characterization. Recent advancements in detection capabilities allow access to fields near their sources and close to material boundaries, where a unique direction of propagation is not evident. Similarly, there exist classical situations such as overlapping beams, focusing, or diffusive scattering in which there is no unique transverse direction. In this dissertation, the correlation matrix formalism is expanded to describe three dimensional electromagnetic fields, providing a definition for the degree of polarization of such a field. It is also shown that, because of the dimensionality of the problem, a second parameter is necessary to fully describe the polarimetric properties of three dimensional fields.
Measurements of second-order correlations of a three dimensional field are demonstrated, allowing the determination of both the degree of polarization and the state of polarization. These new theoretical concepts and innovative experimental approaches introduced in this dissertation are expected to impact scientific areas as diverse as near field optics, remote sensing, high energy laser physics, fluorescence microscopy, and imaging.
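For the three-dimensional case discussed above, one commonly used scalar degree of polarization is computed from the traces of the 3x3 coherence matrix. The formula below is the standard coherence-matrix definition from the literature, not one quoted from this dissertation, so treat it as illustrative:

```python
def degree_of_polarization_3d(phi):
    # phi: 3x3 coherence matrix as nested lists (Hermitian in general;
    # real symmetric in this toy example).
    # Standard definition: P^2 = (3/2) * (Tr(phi^2)/Tr(phi)^2 - 1/3).
    tr = sum(phi[i][i] for i in range(3))
    tr_sq = sum(phi[i][j] * phi[j][i]          # trace of phi . phi
                for i in range(3) for j in range(3))
    # Clip tiny negative round-off before taking the square root.
    return max(1.5 * (tr_sq / tr ** 2 - 1.0 / 3.0), 0.0) ** 0.5

# A fully polarized field (rank-one coherence matrix) gives P = 1;
# an isotropic, fully unpolarized field gives P = 0.
fully_polarized = [[1.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
unpolarized = [[1/3, 0.0, 0.0], [0.0, 1/3, 0.0], [0.0, 0.0, 1/3]]
```

As the abstract notes, this single number does not exhaust the 3D description: a second parameter is needed to characterize intermediate states.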
- Date Issued
- 2006
- Identifier
- CFE0000982, ucf:46697
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000982
- Title
- Classifying and Predicting Walking Speed From Electroencephalography Data.
- Creator
-
Rahrooh, Allen, Huang, Helen, Huang, Hsin-Hsiung, Samsam, Mohtashem, University of Central Florida
- Abstract / Description
-
Electroencephalography (EEG) non-invasively records electrocortical activity and can be used to understand how the brain functions to control movements and walking. Studies have shown that electrocortical dynamics are coupled with the gait cycle and change when walking at different speeds. Thus, EEG signals likely contain information regarding walking speed that could potentially be used to predict walking speed using just EEG signals recorded during walking. The purpose of this study was to determine whether walking speed could be predicted from EEG recorded as subjects walked on a treadmill with a range of speeds (0.5 m/s, 0.75 m/s, 1.0 m/s, 1.25 m/s, and self-paced). We first applied spatial Independent Component Analysis (sICA) to reduce temporal dimensionality and then used currently popular classification methods: Bagging, Boosting, Random Forest, Naïve Bayes, Logistic Regression, and Support Vector Machines with linear and radial basis function kernels. We evaluated the precision, sensitivity, and specificity of each classifier. Logistic regression had the highest overall performance (76.6 +/- 13.9%), and had the highest precision (86.3 +/- 11.7%) and sensitivity (88.7 +/- 8.7%). The Support Vector Machine with a radial basis function kernel had the highest specificity (60.7 +/- 39.1%). These overall performance values are relatively good since the EEG data had only been high-pass filtered with a 1 Hz cutoff frequency and no extensive cleaning methods were performed. All of the classifiers had an overall performance of at least 68% except for the Support Vector Machine with a linear kernel, which had an overall performance of 55.4%. These results suggest that applying spatial Independent Component Analysis to reduce temporal dimensionality of EEG signals does not significantly impair the classification of walking speed using EEG and that walking speeds can be predicted from EEG data.
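The precision, sensitivity, and specificity figures quoted above come from confusion-matrix counts. A minimal sketch of those metrics (the function name is mine, and binary one-vs-rest labels are assumed; the actual study used multi-class speed labels):

```python
def binary_metrics(y_true, y_pred):
    # Confusion-matrix counts for binary labels (1 = positive class).
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)      # of predicted positives, how many were right
    sensitivity = tp / (tp + fn)    # a.k.a. recall: positives actually caught
    specificity = tn / (tn + fp)    # negatives correctly left alone
    return precision, sensitivity, specificity
```

Reporting all three together, as the study does, matters because a classifier can score well on sensitivity while scoring poorly on specificity (and vice versa).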
- Date Issued
- 2019
- Identifier
- CFE0007517, ucf:52642
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007517
- Title
- UGH...STATISTICS! COLLEGE STUDENTS' ATTITUDES AND PERCEPTIONS TOWARD STATISTICS.
- Creator
-
Doyle, Drew A, Brophy-Ellison, James, University of Central Florida
- Abstract / Description
-
Statistics is a course that is required for a majority of undergraduate college students in a wide variety of majors. It is not required only for Statistics or Mathematics majors, but also for undergraduate students majoring in Biology, Engineering, Sociology, and countless other fields. It can often be seen as a daunting course, especially for those who feel that mathematics is not their strongest subject. Students begin to dislike the course before even starting it, and this attitude can carry on throughout the entirety of the course. This thesis will focus primarily on students' perceptions of and attitudes toward their statistics courses rather than their performance. Many courses are taught in a single way that is not conducive to all learning styles, which may lead to students not enjoying or understanding their statistics course. The students' learning styles may also be correlated with their attitudes and perceptions of statistics. The goal of this thesis is to better understand college students in order to adapt current teaching methods so that students can enjoy the course and appreciate the knowledge they learn and its impact on their future career paths.
- Date Issued
- 2017
- Identifier
- CFH2000165, ucf:45988
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH2000165
- Title
- Applications of Transit Signal Priority Technology for Transit Service.
- Creator
-
Consoli, Frank, Al-Deek, Haitham, Oloufa, Amr, Tatari, Omer, Uddin, Nizam, University of Central Florida
- Abstract / Description
-
This research demonstrated the effectiveness of Transit Signal Priority (TSP) in improving bus corridor travel time in a simulated environment using real world data. TSP is a technology that provides preferential treatment to buses at signalized intersections. By considering different scenarios of activating bus signal priority when a bus is 3 or 5 minutes behind schedule, it was demonstrated that bus travel times improved significantly while there was little effect on delays for crossing street traffic. The case of providing signal priority for buses unconditionally resulted in significant crossing street delays for some signalized intersections with only minor improvement to bus travel time over both scenarios of Conditional priority. Evaluation was conducted by using micro-simulation and statistical analysis to compare Unconditional and Conditional TSP with the No TSP scenario. This evaluation looked at performance metrics (for buses and all vehicles) including average speed profiles, average travel times, average number of stops, and crossing street delay. Different Conditional TSP scenarios of activating TSP when a bus is 3 or 5 minutes behind schedule were considered. The simulation demonstrated that Conditional TSP significantly improved bus travel times with little effect on crossing street delays. The results also showed that utilizing TSP technology reduced the environmental emissions in the I-Drive corridor. Furthermore, field data was used to calculate actual passenger travel time savings and the benefit-cost ratio (7.92) that resulted from implementing conditional TSP. Conditional TSP 3 minutes behind schedule was determined to be the most beneficial and practical TSP scenario for real world implementation at both the corridor and regional levels.
- Date Issued
- 2014
- Identifier
- CFE0005474, ucf:50343
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005474
- Title
- Selective Multivariate Applications in Forensic Science.
- Creator
-
Rinke, Caitlin, Sigman, Michael, Campiglia, Andres, Yestrebsky, Cherie, Kuebler, Stephen, Richardson, Martin, University of Central Florida
- Abstract / Description
-
A 2009 report published by the National Research Council addressed the need for improvements in the field of forensic science. In the report, emphasis was placed on the need for more rigorous scientific analysis within many forensic science disciplines and for established limitations and determination of error rates from statistical analysis. This research focused on multivariate statistical techniques for the analysis of spectral data obtained for multiple forensic applications, which include samples from automobile float glasses and paints, bones, metal transfers, ignitable liquids and fire debris, and organic compounds including explosives. The statistical techniques were used for two types of data analysis: classification and discrimination. Statistical methods including linear discriminant analysis and a novel soft classification method were used to provide classification of forensic samples based on a compiled library. The novel soft classification method combined three statistical steps: Principal Component Analysis (PCA), Target Factor Analysis (TFA), and Bayesian Decision Theory (BDT) to provide classification based on posterior probabilities of class membership. The posterior probabilities provide a statistical probability of classification which can aid a forensic analyst in reaching a conclusion. The second analytical approach applied nonparametric methods to provide the means for discrimination between samples. Nonparametric methods are performed as hypothesis tests and do not assume normal distribution of the analytical figures of merit. The nonparametric permutation test was applied to forensic applications to determine the similarity between two samples and provide discrimination rates. Both the classification method and discrimination method were applied to data acquired from multiple instrumental methods.
The instrumental methods included Laser-Induced Breakdown Spectroscopy (LIBS), Fourier Transform Infrared Spectroscopy (FTIR), Raman spectroscopy, and Gas Chromatography-Mass Spectrometry (GC-MS). Some of these instrumental methods are currently applied to forensic applications, such as GC-MS for the analysis of ignitable liquid and fire debris samples, while others provide new instrumental methods to areas within forensic science which currently lack instrumental analysis techniques, such as LIBS for the analysis of metal transfers. The combination of the instrumental techniques and multivariate statistical techniques is investigated in new approaches to forensic applications in this research to assist in improving the field of forensic science.
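The permutation test mentioned above can be sketched in a few lines. This is a minimal illustration of the idea on a difference-of-means statistic; the dissertation applies the test to spectral figures of merit, and the statistic and function names here are my assumption:

```python
import random

def permutation_test(a, b, n_perm=2000, seed=0):
    # Two-sample permutation test on the absolute difference of means.
    # Returns the fraction of random label shufflings whose statistic is
    # at least as extreme as the observed one (an approximate p-value).
    rng = random.Random(seed)   # seeded for reproducibility
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / n_perm
```

Because only the observed labeling (and shufflings as extreme as it) contribute to the p-value, no distributional assumption on the measurements is needed, which is exactly why the method suits analytical figures of merit that are not normally distributed.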
- Date Issued
- 2012
- Identifier
- CFE0004628, ucf:49942
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004628
- Title
- Mahalanobis kernel-based support vector data description for detection of large shifts in mean vector.
- Creator
-
Nguyen, Vu, Maboudou, Edgard, Nickerson, David, Schott, James, University of Central Florida
- Abstract / Description
-
Statistical process control (SPC) applies the science of statistics to process control in order to provide higher-quality products and better services. The K chart is one among the many important tools that SPC offers. Creation of the K chart is based on Support Vector Data Description (SVDD), a popular data classifier method inspired by Support Vector Machine (SVM). Like any method based on SVM, SVDD benefits from a wide variety of choices of kernel, which determines the effectiveness of the whole model. Among the most popular choices is the Euclidean distance-based Gaussian kernel, which enables SVDD to obtain a flexible data description, thus enhancing its overall predictive capability. This thesis explores an even more robust approach by incorporating the Mahalanobis distance-based kernel (hereinafter the Mahalanobis kernel) into SVDD and compares it with SVDD using the traditional Gaussian kernel. The method's sensitivity is benchmarked by average run lengths obtained from multiple Monte Carlo simulations. Data for such simulations are generated from multivariate normal, multivariate Student's t, and multivariate gamma populations using R, a popular software environment for statistical computing. One case study is also discussed using a real data set received from the Halberg Chronobiology Center. Compared to the Gaussian kernel, the Mahalanobis kernel makes SVDD, and thus the K chart, significantly more sensitive to shifts in the mean vector and in the covariance matrix.
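The kernel substitution described above amounts to replacing the Euclidean distance inside the Gaussian kernel with a covariance-aware distance. A minimal sketch (the function name and interface are mine, not the thesis code):

```python
import math

def mahalanobis_kernel(x, y, cov_inv, s=1.0):
    # Gaussian kernel with a squared Mahalanobis distance in the exponent:
    #   K(x, y) = exp( -(x - y)^T S^-1 (x - y) / s^2 )
    # cov_inv is the inverse covariance matrix S^-1 as nested lists.
    d = [xi - yi for xi, yi in zip(x, y)]
    q = sum(d[i] * cov_inv[i][j] * d[j]        # quadratic form d^T S^-1 d
            for i in range(len(d)) for j in range(len(d)))
    return math.exp(-q / (s * s))

# With cov_inv = identity this reduces to the Euclidean Gaussian kernel,
# so the traditional kernel is the special case of uncorrelated,
# unit-variance features.
identity = [[1.0, 0.0], [0.0, 1.0]]
```

Correlated features stretch the kernel's level sets along the data's principal directions, which is what makes the resulting data description (and hence the K chart) more sensitive to covariance shifts.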
- Date Issued
- 2015
- Identifier
- CFE0005676, ucf:50170
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005676
- Title
- ON THE USE OF VARIABLE COHERENCE IN INVERSE SCATTERING PROBLEMS.
- Creator
-
Baleine, Erwan, Dogariu, Aristide, University of Central Florida
- Abstract / Description
-
Even though most of the properties of optical fields, such as wavelength, polarization, wavefront curvature or angular spectrum, have been commonly manipulated in a variety of remote sensing procedures, controlling the degree of coherence of light did not find wide applications until recently. Since the emergence of optical coherence tomography, a growing number of scattering techniques have relied on temporal coherence gating which provides efficient target selectivity in a way achieved only by bulky short pulse measurements. The spatial counterpart of temporal coherence, however, has barely been exploited in sensing applications. This dissertation examines, in different scattering regimes, a variety of inverse scattering problems based on variable spatial coherence gating. Within the framework of the radiative transfer theory, this dissertation demonstrates that the short range correlation properties of a medium under test can be recovered by varying the size of the coherence volume of an illuminating beam. Nonetheless, the radiative transfer formalism does not account for long range correlations and current methods for retrieving the correlation function of the complex susceptibility require cumbersome cross-spectral density measurements. Instead, a variable coherence tomographic procedure is proposed where spatial coherence gating is used to probe the structural properties of single scattering media over an extended volume and with a very simple detection system. Enhanced backscattering is a coherent phenomenon that survives strong multiple scattering. The variable coherence tomography approach is extended in this context to diffusive media and it is demonstrated that specific photon trajectories can be selected in order to achieve depth-resolved sensing. Probing the scattering properties of shallow and deeper layers is of considerable interest in biological applications such as diagnosis of skin related diseases.
The spatial coherence properties of an illuminating field can be manipulated over dimensions much larger than the wavelength thus providing a large effective sensing area. This is a practical advantage over many near-field microscopic techniques, which offer a spatial resolution beyond the classical diffraction limit but, at the expense of scanning a probe over a large area of a sample which is time consuming, and, sometimes, practically impossible. Taking advantage of the large field of view accessible when using the spatial coherence gating, this dissertation introduces the principle of variable coherence scattering microscopy. In this approach, a subwavelength resolution is achieved from simple far-zone intensity measurements by shaping the degree of spatial coherence of an evanescent field. Furthermore, tomographic techniques based on spatial coherence gating are especially attractive because they rely on simple detection schemes which, in principle, do not require any optical elements such as lenses. To demonstrate this capability, a correlated lensless imaging method is proposed and implemented, where both amplitude and phase information of an object are obtained by varying the degree of spatial coherence of the incident beam. Finally, it should be noted that the idea of using the spatial coherence properties of fields in a tomographic procedure is applicable to any type of electromagnetic radiation. Operating on principles of statistical optics, these sensing procedures can become alternatives for various target detection schemes, cutting-edge microscopies or x-ray imaging methods.
- Date Issued
- 2006
- Identifier
- CFE0001387, ucf:47005
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001387
- Title
- The Effect of Precipitation on the Spread of Mosquito-Borne Diseases: A Case Study of Florida Counties.
- Creator
-
Osbourne, Marvin, Mohapatra, Ram, Shuai, Zhisheng, Kincaid, John, University of Central Florida
- Abstract / Description
-
The state of Florida is the third most populous state in the United States of America, with six (6) of its metropolitan areas dubbed the fastest growing in the entire country. A mosquito bite may mean the transmission of a virus or disease which might be fatal. Hence, there is a need for the state to control mosquitoes through the various Departments of Mosquito Control in each of its sixty-seven (67) counties. Six locally acquired mosquito-borne viruses which affect humans and animals in the state of Florida were considered. This thesis used statistical methods to examine data for rainfall, population estimates, as well as the data on six (6) arboviruses, over the course of thirteen (13) years, namely 2002 to 2014. The first hypothesis tested was that greater precipitation increased the likelihood of a greater number of arbovirus cases. It was important to also examine the relationship that this growing human population had with mosquito-borne diseases, and so the second hypothesis tested was that an increase in the human population would increase the likelihood of a greater number of arbovirus cases. Subsequently, an analysis was done for eleven (11) of Florida's 67 counties with the greatest cumulative occurrence of human and animal arbovirus cases combined. Of the eleven counties, seven exhibited a weak association between the size of the human population and the spread of animal and human arbovirus cases; three exhibited a somewhat moderate association; and one, Osceola County, had a strong negative association. This indicated that, as the size of the human population increased in Osceola County, the combined number of human and animal arbovirus cases decreased, which refuted the second hypothesis of this thesis. A linear regression model for the Osceola County data was derived, and that model was used with population projection data to simulate what will occur in future years.
In each simulated year, the combined number of human and animal arbovirus cases was negative. This prediction meant that, as the projected population increased from year to year, the number of cases should be zero in each year. The reliability of these predictions is questionable, since Osceola County does not exist in a vacuum and cannot be isolated from the surrounding counties, which may be experiencing an outbreak of arboviruses.
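The regression-and-extrapolation step described above can be sketched as follows. The data here are hypothetical, chosen only to show how a negative-slope least-squares fit, extrapolated to larger projected populations, yields negative predicted case counts of the kind the thesis reports:

```python
def ols_fit(x, y):
    # Ordinary least squares for y ≈ a + b·x (intercept a, slope b).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical series: population (ten-thousands) vs. combined cases.
pop = [25, 26, 27, 28, 29]
cases = [9, 8, 6, 5, 2]
a, b = ols_fit(pop, cases)
projected = a + b * 40   # extrapolate to a larger projected population
# The negative slope drives the extrapolated prediction below zero,
# which in practice must be truncated to zero cases.
```

This is exactly the limitation the abstract flags: a linear model fit to a negatively associated county will predict impossible negative counts once the projection leaves the range of the observed data.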
- Date Issued
- 2015
- Identifier
- CFE0005859, ucf:50926
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005859
- Title
- HJB Equation and Statistical Arbitrage applied to High Frequency Trading.
- Creator
-
Park, Yonggi, Yong, Jiongmin, Swanson, Jason, Richardson, Gary, Shuai, Zhisheng, University of Central Florida
- Abstract / Description
-
In this thesis we investigate some properties of market making and statistical arbitrage applied to High Frequency Trading (HFT). Using the Hamilton-Jacobi-Bellman (HJB) model developed by Fabien Guilbaud and Huyen Pham in 2012, we study how market making can be used to obtain an optimal strategy with limit and market orders. We also develop investment strategies based on the Moving Average, Exponential Moving Average, Relative Strength Index, and Sharpe Ratio.
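The indicators named in the abstract are standard. A hedged sketch of three of them in pure Python, with simplified conventions (this EMA seeds from the first price, and the Sharpe ratio is per-period, not annualized; the thesis may use different conventions):

```python
def sma(prices, n):
    # Simple moving average over the trailing n observations.
    return [sum(prices[i - n + 1:i + 1]) / n
            for i in range(n - 1, len(prices))]

def ema(prices, n):
    # Exponential moving average with smoothing factor k = 2 / (n + 1),
    # seeded from the first price.
    k = 2 / (n + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(k * p + (1 - k) * out[-1])
    return out

def sharpe(returns, rf=0.0):
    # Per-period Sharpe ratio: mean excess return over its sample
    # standard deviation.
    ex = [r - rf for r in returns]
    m = sum(ex) / len(ex)
    var = sum((e - m) ** 2 for e in ex) / (len(ex) - 1)
    return m / var ** 0.5
```

A typical crossover strategy of the kind the thesis evaluates goes long when a short EMA rises above a long EMA and flat (or short) when it falls below, with the Sharpe ratio used to compare the resulting return series.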
- Date Issued
- 2013
- Identifier
- CFE0004907, ucf:49628
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004907
- Title
- FADE STATISTICS FOR A LASERCOM SYSTEM AND THE JOINT PDF OF A GAMMA-GAMMA DISTRIBUTED IRRADIANCE AND ITS TIME DERIVATIVE.
- Creator
-
Stromqvist Vetelino, Frida, Young, Cynthia, University of Central Florida
- Abstract / Description
-
The performance of lasercom systems operating in the atmosphere is reduced by optical turbulence, which causes irradiance fluctuations in the received signal. The result is a randomly fading signal. Fade statistics for lasercom systems are determined from the probability density function (PDF) of the irradiance fluctuations. The expected number of fades per second and their mean fade time require the joint PDF of the fluctuating irradiance and its time derivative. Theoretical integral expressions, as well as closed-form analytical approximations, were developed for the joint PDF of a gamma-gamma distributed irradiance and its time derivative, and the corresponding expression for the expected number of fades per second. The new approximation for the conditional PDF of the time derivative of a gamma-gamma irradiance is a zero-mean Gaussian distribution with a variance that depends on the irradiance in a complicated way. Fade statistics obtained from experimental data were compared to theoretical predictions based on the lognormal and gamma-gamma distributions. A Gaussian beam wave was propagated through the atmosphere along a horizontal path, near ground, in moderate-to-strong optical turbulence. To characterize the propagation path, a new method that infers atmospheric propagation parameters was developed. Scintillation theory combined with a numerical scheme was used to infer the structure constant, Cn2, the inner scale and the outer scale from the optical measurements. The inferred parameters were used in calculations for the theoretical PDFs. It was found that fade predictions made by the gamma-gamma and lognormal distributions provide an upper and lower bound, respectively, for the probability of fade and the number of fades per second for irradiance data collected in the moderate-to-strong fluctuation regime.
Aperture averaging effects on the PDF of the irradiance fluctuations were investigated by comparing the irradiance distributions for the three receiver apertures at two different values of the structure parameter and, hence, different values of the coherence radius. For the moderate-to-strong fluctuation regime, the gamma-gamma distribution provides a good fit to the irradiance fluctuations collected by finite-sized apertures that are significantly smaller than the coherence radius. For apertures larger than or equal to the coherence radius, the irradiance fluctuations appear to be lognormally distributed.
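For reference, the gamma-gamma irradiance density referred to throughout the abstract has the standard form (quoted from the general scintillation literature, not from this dissertation):

```latex
p(I) \;=\; \frac{2\,(\alpha\beta)^{(\alpha+\beta)/2}}{\Gamma(\alpha)\,\Gamma(\beta)}\,
           I^{\frac{\alpha+\beta}{2}-1}\,
           K_{\alpha-\beta}\!\left(2\sqrt{\alpha\beta I}\right),
           \qquad I > 0,
```

where α and β are the effective numbers of large-scale and small-scale scattering cells and K_ν is the modified Bessel function of the second kind. The probability of fade below a threshold I_T is then the integral of p(I) from 0 to I_T.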
- Date Issued
- 2006
- Identifier
- CFE0001440, ucf:47069
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001440
- Title
- COHERENCE PROPERTIES OF OPTICAL NEAR-FIELDS.
- Creator
-
Apostol, Adela, Dogariu, Aristide, University of Central Florida
- Abstract / Description
-
Next generation photonics-based technologies will ultimately rely on novel materials and devices. For this purpose, phenomena at subwavelength scales are being studied to advance both fundamental knowledge and experimental capabilities. In this dissertation, concepts specific to near-field optics and experimental capabilities specific to near-field microscopy are used to investigate various aspects of the statistical properties of random electromagnetic fields in the vicinity of optically inhomogeneous media which emit or scatter radiation. The properties of such fields are characterized within the framework of coherence theory. While successful in describing the far-field properties of optical fields, the fundamental results of conventional coherence theory disregard the contribution of short-range evanescent waves. Nonetheless, the specific features of random fields at subwavelength distances from interfaces of real media are influenced by the presence of evanescent waves because, in this case, both propagating and nonpropagating components contribute to the detectable properties of the radiation. In our studies, we have fully accounted for both contributions and, as a result, different surface and subsurface characteristics of inhomogeneous media could be explored. We investigated different properties of random optical near-fields which exhibit either Gaussian or non-Gaussian statistics. We have demonstrated that characteristics of optical radiation such as the first- and second-order statistics of intensity and the spectral density in the vicinity of random media are determined by both the contribution of evanescent waves and the statistical properties of the physical interface.
For instance, we quantified the subtle differences which exist between the near- and far-field spectra of radiation and we brought the first experimental evidence that, contrary to the predictions of the conventional coherence theory, the values of coherence length in the near field depend on the distance from the interface and, moreover, they can be smaller than the wavelength of light. The results included in this dissertation demonstrate that the statistical properties of the electromagnetic fields which exist in the close proximity of inhomogeneous media can be used to extract structural information. They also suggest the possibility to adjust the coherence properties of the emitted radiation by modifying the statistical properties of the interfaces. Understanding the random interference phenomena in the near-field could also lead to new possibilities for surface and subsurface diagnostics of inhomogeneous media. In addition, controlling the statistical properties of radiation at subwavelength scales should be of paramount importance in the design of miniaturized optical sources, detectors and sensors.
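As a generic illustration (not the authors' experimental procedure), a correlation length of the kind discussed above can be estimated from a sampled one-dimensional intensity record as the lag at which the normalized autocovariance first falls below 1/e; the function names and the 1/e criterion are assumptions made for this sketch.

```python
import numpy as np

def autocovariance(x, max_lag):
    """Biased sample autocovariance of a 1-D record, for lags 0..max_lag."""
    x = x - x.mean()
    n = len(x)
    return np.array([np.mean(x[: n - k] * x[k:]) for k in range(max_lag + 1)])

def correlation_length(samples, dx, max_lag=200):
    """Lag (in physical units) where the normalized autocovariance drops below 1/e."""
    acf = autocovariance(samples, max_lag)
    acf = acf / acf[0]            # normalize so acf[0] == 1
    below = np.nonzero(acf < 1.0 / np.e)[0]
    return below[0] * dx if below.size else np.inf
```

For example, white noise smoothed by a Gaussian kernel of width s has a Gaussian autocovariance whose 1/e point sits near lag 2s, which the estimator recovers.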
- Date Issued
- 2005
- Identifier
- CFE0000408, ucf:46410
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000408
- Title
- DERIVING THE DENSITY OF STATES FOR GRANULAR CONTACT FORCES.
- Creator
-
Metzger, Philip, Bhattacharya, Aniket, University of Central Florida
- Abstract / Description
-
The density of single grain states in static granular packings is derived from first principles for an idealized yet fundamental case. This produces the distribution of contact forces P_f(f) in the packing. Because there has been some controversy in the published literature over the exact form of the distribution, this dissertation begins by reviewing the existing empirical observations to resolve those controversies. A method is then developed to analyze Edwards' granular contact force probability functional from first principles. The derivation assumes Edwards' flat measure -- a density of states (DOS) that is uniform within the metastable regions of phase space. A further assumption, supported by physical arguments and empirical evidence, is that contact force correlations arising through the closure of loops of grains may be neglected. Then, maximizing a state-counting entropy results in a transport equation that can be solved numerically. For the present it has been solved using the "Mean Structure Approximation," projecting the DOS across all angular coordinates to more clearly identify its predominant features in the remaining stress coordinates. These features are: (1) the Grain Factor, related to grain stability and strong correlation between the contact forces on the same grain, and (2) the Structure Factor, related to Newton's third law and strong correlation between neighboring grains. Numerical simulations were then performed for idealized granular packings to permit a direct comparison with the theory, and the data, including P_f(f), were found to be in excellent agreement. Where the simulations and theory disagree, it is primarily due to the coordination number Z, because the theory assumes Z to be a constant whereas in disordered packings it is not. The form of the empirical DOS is discovered to have an elegant, underlying pattern related to Z.
This pattern consists entirely of the functional forms correctly predicted by the theory, but with only slight parameter changes as a function of Z. This produces significant physical insight and suggests how the theory may be generalized in the future.
- Date Issued
- 2005
- Identifier
- CFE0000381, ucf:46325
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000381
- Title
- A STUDY OF EQUATORIAL IONOSPHERIC VARIABILITY USING SIGNAL PROCESSING TECHNIQUES.
- Creator
-
Wang, Xiaoni, Eastes, Richard, University of Central Florida
- Abstract / Description
-
The dependence of the equatorial ionosphere on solar irradiances and geomagnetic activity is studied in this dissertation using signal processing techniques. Statistical time series, digital signal processing, and wavelet methods are applied to study the ionospheric variations. The ionospheric data used are the Total Electron Content (TEC) and the critical frequency of the F2 layer (foF2). Solar irradiance data are from recent satellites, the Student Nitric Oxide Explorer (SNOE) satellite and the Thermosphere Ionosphere Mesosphere Energetics Dynamics (TIMED) satellite. The Disturbance Storm-Time (Dst) index is used as a proxy of geomagnetic activity in the equatorial region. The results are summarized as follows. (1) For short-term variations (< 27 days), solar irradiances from the previous three days have a significant correlation with the present-day TEC, and may contribute 18% of the total variations in the TEC. The 3-day delay between solar irradiances and TEC suggests the effects of neutral densities on the ionosphere. The correlations between solar irradiances and TEC are significantly higher than those using the F10.7 flux, a conventional proxy for the short-wavelength band of solar irradiance. (2) For variations < 27 days, solar soft X-rays show similar or higher correlations with ionospheric electron densities than the Extreme Ultraviolet (EUV). The correlations between solar irradiances and foF2 decrease from morning (0.5) to the afternoon (0.1). (3) Geomagnetic activity plays an important role in short-term ionospheric variations < 10 days. The average correlation between TEC and Dst is 0.4 at the 2-3, 3-5, 5-9 and 9-11 day scales, which is higher than that between foF2 and Dst. The correlations between TEC and Dst increase from morning to afternoon. Moderate/quiet geomagnetic activity plays a distinct role in these short-term variations of the ionosphere (~0.3 correlation).
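The delayed ionospheric response described in result (1) can be probed with a simple lagged cross-correlation. The sketch below assumes daily-sampled NumPy arrays and hypothetical function names; it shifts the irradiance record by 0..max_lag days against the TEC record.

```python
import numpy as np

def lagged_correlation(irradiance, tec, max_lag):
    """Pearson correlation of present-day TEC with irradiance lagged by 0..max_lag days."""
    corrs = []
    for lag in range(max_lag + 1):
        # irradiance from `lag` days ago paired with today's TEC
        x = irradiance[: len(irradiance) - lag] if lag else irradiance
        y = tec[lag:]
        corrs.append(float(np.corrcoef(x, y)[0, 1]))
    return corrs
```

A peak at a nonzero lag (a 3-day delay, in the dissertation's data) indicates a delayed ionospheric response to the solar driver.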
- Date Issued
- 2007
- Identifier
- CFE0001602, ucf:47188
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001602
- Title
- SYSTEM DESIGN AND OPTIMIZATION OF OPTICAL COHERENCE TOMOGRAPHY.
- Creator
-
Akcay, Avni, Rolland, Jannick, University of Central Florida
- Abstract / Description
-
Optical coherence imaging, including tomography (OCT) and microscopy (OCM), has been a growing research field in biomedical optical imaging in the last decade. In this imaging modality, a broadband light source, thus of short temporal coherence length, is used to perform imaging via interferometry. A challenge in optical coherence imaging, as in any imaging system aimed at biomedical diagnosis, is the quantification of image quality and the optimization of the system components, both of which are a primary focus of this research. We concentrated our efforts on the optimization of the imaging system from two main standpoints: the axial point spread function (PSF) and practical steps towards compact, low-cost solutions. Until recently, the criteria for the quality of a system were based on speed of imaging, sensitivity, and particularly axial resolution, estimated solely from the full-width at half-maximum (FWHM) of the axial PSF with the common practice of assuming a Gaussian source power spectrum. As part of our work to quantify axial resolution, we first brought forth two additional metrics which, unlike the FWHM, accounted for side lobes in the axial PSF caused by irregularities in the shape of the source power spectrum, such as spectral dips. Subsequently, we presented a method where the axial PSF was significantly optimized by suppressing the side lobes occurring because of the irregular shape of the source power spectrum. The optimization was performed by optically shaping the source power spectrum via a programmable spectral shaper, which consequently led to the suppression of spurious structures in the images of a layered specimen. The superiority of the demonstrated approach lies in performing reshaping before imaging, thus eliminating the need for post-data-acquisition digital signal processing.
Importantly, towards the optimization and objective image quality assessment in optical coherence imaging, the impact of source spectral shaping was further analyzed with a task-based assessment method grounded in statistical decision theory. Two classification tasks, a signal-detection task and a resolution task, were investigated. Results showed that reshaping the source power spectrum benefited essentially the resolution task, rather than both the detection and resolution tasks, and the importance of the specimen's local variations in index of refraction for the resolution task was demonstrated. Finally, towards the optimization of OCT and OCM for use in clinical settings, we analyzed the detection electronics stage, a crucial component of the system that is designed to capture extremely weak interferometric signals in biomedical and biological imaging applications. We designed and tested detection electronics to achieve a compact and low-cost solution for portable imaging units, and demonstrated that the design provided performance equivalent to that of the commercial lock-in amplifier, considering the system sensitivity obtained with both detection schemes.
- Date Issued
- 2005
- Identifier
- CFE0000651, ucf:46527
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000651
- Title
- ANALYZING THE COMMUNITY STRUCTURE OF WEB-LIKE NETWORKS: MODELS AND ALGORITHMS.
- Creator
-
Cami, Aurel, Deo, Narsingh, University of Central Florida
- Abstract / Description
-
This dissertation investigates the community structure of web-like networks (i.e., large, random, real-life networks such as the World Wide Web and the Internet). Recently, it has been shown that many such networks have a locally dense and globally sparse structure, with certain small, dense subgraphs occurring much more frequently than they do in the classical Erdős-Rényi random graphs. This peculiarity--which is commonly referred to as community structure--has been observed in seemingly unrelated networks such as the Web, email networks, citation networks, biological networks, etc. The pervasiveness of this phenomenon has led many researchers to believe that such cohesive groups of nodes might represent meaningful entities. For example, in the Web such tightly-knit groups of nodes might represent pages with a common topic, geographical location, etc., while in neural networks they might represent evolved computational units. The notion of community has emerged in an effort to formalize the empirical observation of the locally dense, globally sparse structure of web-like networks. In the broadest sense, a community in a web-like network is defined as a group of nodes that induces a dense subgraph which is sparsely linked with the rest of the network. Due to a wide array of envisioned applications, ranging from crawlers and search engines to network security and network compression, there has recently been a widespread interest in finding efficient community-mining algorithms. In this dissertation, the community structure of web-like networks is investigated by a combination of analytical and computational techniques: First, we consider the problem of modeling web-like networks. In recent years, many new random graph models have been proposed to account for some recently discovered properties of web-like networks that distinguish them from the classical random graphs.
The vast majority of these random graph models take into account only the addition of new nodes and edges. Yet, several empirical observations indicate that deletion of nodes and edges occurs frequently in web-like networks. Inspired by such observations, we propose and analyze two dynamic random graph models that combine node and edge addition with a uniform and a preferential deletion of nodes, respectively. In both cases, we find that the random graphs generated by such models follow power-law degree distributions (in agreement with the degree distribution of many web-like networks). Second, we analyze the expected density of certain small subgraphs--such as defensive alliances on three and four nodes--in various random graph models. Our findings show that while in the binomial random graph the expected density of such subgraphs is very close to zero, in some dynamic random graph models it is much larger. These findings converge with our results obtained by computing the number of communities in some Web crawls. Next, we investigate the computational complexity of the community-mining problem under various definitions of community. Assuming the definition of community as a global defensive alliance or a global offensive alliance, we prove--using transformations from the dominating set problem--that finding optimal communities is an NP-complete problem. These and other similar complexity results, coupled with the fact that many web-like networks are huge, indicate that it is unlikely that fast, exact sequential algorithms for mining communities may be found. To handle this difficulty we adopt an algorithmic definition of community and a simpler version of the community-mining problem, namely: find the largest community to which a given set of seed nodes belongs.
We propose several greedy algorithms for this problem: The first proposed algorithm starts out with a set of seed nodes--the initial community--and then repeatedly selects some nodes from the community's neighborhood and pulls them into the community. In each step, the algorithm uses the clustering coefficient--a parameter that measures the fraction of the neighbors of a node that are neighbors themselves--to decide which nodes from the neighborhood should be pulled into the community. This algorithm has time complexity of the order of the product of the number of nodes visited by the algorithm and the maximum degree encountered. Thus, assuming a power-law degree distribution, this algorithm is expected to run in near-linear time. The proposed algorithm achieved good accuracy when tested on some real and computer-generated networks: The fraction of community nodes classified correctly is generally above 80% and often above 90%. A second algorithm, based on a generalized clustering coefficient where not only the first neighborhood is taken into account but also the second, the third, etc., is also proposed. This algorithm achieves a better accuracy than the first one but also runs slower. Finally, a randomized version of the second algorithm, which improves the time complexity without affecting the accuracy significantly, is proposed. The main target application of the proposed algorithms is focused crawling--the selective search for web pages that are relevant to a pre-defined topic.
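A minimal sketch of the first greedy algorithm is given below, under the assumption (a hypothetical admission rule, not the dissertation's exact criterion) that a frontier node is pulled into the community when its clustering coefficient is at least a fixed threshold. The graph is a plain adjacency dict of sets.

```python
from itertools import combinations

def clustering_coefficient(adj, v):
    """Fraction of pairs of v's neighbors that are themselves adjacent."""
    nbrs = list(adj[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return 2.0 * links / (k * (k - 1))

def grow_community(adj, seeds, threshold=0.3):
    """Greedily expand a seed set by admitting neighborhood nodes whose
    clustering coefficient meets the threshold (illustrative criterion)."""
    community = set(seeds)
    changed = True
    while changed:
        changed = False
        # frontier: nodes adjacent to the community but not yet in it
        frontier = {u for v in community for u in adj[v]} - community
        for u in frontier:
            if clustering_coefficient(adj, u) >= threshold:
                community.add(u)
                changed = True
    return community
```

On a toy graph consisting of a 4-clique with a pendant chain attached, seeding with one clique member recovers the clique and stops at the chain, since chain nodes have no interconnected neighbors.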
- Date Issued
- 2005
- Identifier
- CFE0000900, ucf:46726
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000900
- Title
- INVESTIGATION OF DAMAGE DETECTION METHODOLOGIES FOR STRUCTURAL HEALTH MONITORING.
- Creator
-
Gul, Mustafa, Catbas, F. Necati, University of Central Florida
- Abstract / Description
-
Structural Health Monitoring (SHM) is employed to track and evaluate damage and deterioration during regular operation as well as after extreme events for aerospace, mechanical and civil structures. A complete SHM system incorporates performance metrics, sensing, signal processing, data analysis, transmission and management for decision-making purposes. Damage detection in the context of SHM can be successful by employing a collection of robust and practical damage detection methodologies that can be used to identify, locate and quantify damage or, in general terms, changes in observable behavior. In this study, different damage detection methods are investigated for the global condition assessment of structures. First, different parametric and non-parametric approaches are revisited and further improved for damage detection using vibration data. Modal flexibility, modal curvature and un-scaled flexibility based on the dynamic properties that are obtained using the Complex Mode Indicator Function (CMIF) are used as parametric damage features. Second, statistical pattern recognition approaches using time series modeling in conjunction with outlier detection are investigated as a non-parametric damage detection technique. Third, a novel methodology using ARX models (Auto-Regressive models with eXogenous input) is proposed for damage identification. By using this new methodology, it is shown that damage can be detected, located and quantified without the need for external loading information. Next, laboratory studies are conducted on different test structures with a number of different damage scenarios for the evaluation of the techniques in a comparative fashion. Finally, the application of the methodologies to real-life data is presented, along with the capabilities and limitations of each approach in light of the analysis results of the laboratory and real-life data.
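The time-series branch of the methodology can be illustrated with a simplified sketch: the dissertation's ARX models include exogenous terms, but the version below drops them and fits a plain AR model to a vibration record by least squares, using growth in the one-step-ahead residual RMS as a damage-sensitive feature. All names are illustrative, not the author's implementation.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of an AR(p) model: x[t] = sum_i a_i * x[t-i] + e[t]."""
    # design matrix: row t holds [x[t-1], ..., x[t-p]]
    X = np.column_stack([x[p - i: len(x) - i] for i in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def residual_rms(x, coef):
    """RMS of one-step-ahead prediction residuals under a previously fitted model."""
    p = len(coef)
    X = np.column_stack([x[p - i: len(x) - i] for i in range(1, p + 1)])
    return float(np.sqrt(np.mean((x[p:] - X @ coef) ** 2)))
```

A model fitted to baseline (healthy) vibration data should reproduce the noise floor on new healthy data; a marked increase in residual RMS flags a change in the structure's dynamics.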
- Date Issued
- 2009
- Identifier
- CFE0002830, ucf:48069
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002830
- Title
- Community College Leadership: The Pathways, Competencies, and Preparation of Presidents and Chief Academic Officers.
- Creator
-
Minton, Richard, King, Kathy (Kathleen), Cox, Thomas, Marshall, Nancy, Witta, Eleanor, University of Central Florida
- Abstract / Description
-
At the beginning of the new millennium, concerns were raised that a leadership crisis would soon develop due to a high percentage of community college presidents and chief academic officers (CAOs) approaching retirement within the decade. With concerns that there would not be a sufficient number of leaders ready to assume these roles, the American Association of Community Colleges (AACC) developed a list of six competencies essential to community college leadership (AACC, 2005). The purpose of this study was to examine the pathways, competencies, and preparation of community college presidents and CAOs. Leaders in those positions at two-year colleges in eight southeastern states were surveyed in August-September 2017. Demographic data were collected to determine common career pathways, and it was found that an overwhelming majority of respondents had earned doctorate degrees and that many of them had focused their advanced degrees in the areas of education and/or leadership. Approximately 84% of the leaders who responded expected to retire within 10 years of the study. Also, at least 50% of the presidents who responded followed an academic pathway to the presidency. Respondents were asked to rate the extent to which they agreed that the AACC competencies were essential to their leadership roles and the extent to which they agreed that they had been prepared for each competency prior to assuming their current roles. The results indicated high levels of agreement that all six competencies were essential; however, tests did reveal statistically significant differences between the levels of agreement, namely that one competency -- community college advocacy -- had a lower level of agreement than the other five competencies. Respondents also indicated that they had been adequately prepared for each competency prior to assuming their current roles, with on-the-job experiences being the most common method of preparation for the competencies.
A correlation analysis revealed a positive relationship between the extent to which leaders agreed that the competencies were essential and the extent to which they agreed that they were prepared for the competencies. There were also no statistically significant differences between presidents and CAOs on the preparation ratings for each competency, and there was a difference in the essential ratings only for the competency of collaboration. Recommendations for future practice based on the leadership frameworks of Bolman and Deal (2013) and Nevarez, Wood, and Penrose (2013) are provided, along with recommendations for higher education leadership doctoral programs and future research regarding pathways, competencies, and preparation.
- Date Issued
- 2018
- Identifier
- CFE0007054, ucf:52014
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007054
- Title
- Mesoscopic Interactions in Complex Photonic Media.
- Creator
-
Rezvani Naraghi, Roxana, Dogariu, Aristide, Tetard, Laurene, Rahman, Talat, Abouraddy, Ayman, University of Central Florida
- Abstract / Description
-
Mesoscale optics provides a framework for understanding a wide range of phenomena occurring in a variety of fields, ranging from biological tissues to composite materials and from colloidal physics to fabricated nanostructures. When light interacts with a complex system, the outcome depends significantly on the length and time scales of interaction. Mesoscale optics offers the apparatus necessary for describing specific manifestations of wave phenomena such as interference and phase memory in complex media. In-depth understanding of mesoscale phenomena provides the quantitative explanations that neither microscopic nor macroscopic models of light-matter interaction can afford. Modeling mesoscopic systems is challenging because the outcome properties can be efficiently modified by controlling the extent and the duration of interactions. In this dissertation, we will first present a brief survey of fundamental concepts, approaches, and techniques specific to fundamental light-matter interaction at mesoscopic scales. Then, we will discuss different regimes of light propagation through randomly inhomogeneous media. In particular, a novel description will be introduced to analyze specific aspects of light propagation in dense composites. Moreover, we will present evidence that the wave nature of light can be critical for understanding its propagation in unbounded, highly scattering materials. We will show that the perceived diffusion of light is subjected to competing mechanisms of interaction that lead to qualitatively different phases of the light evolution through complex media. In particular, we will discuss implications for the ever-elusive localization of light in three-dimensional random media. In addition to fundamental aspects of light-matter interaction at mesoscopic scales, this dissertation will also address the process of designing material structures that provide unique scattering properties.
We will demonstrate that multi-material dielectric particles with controlled radial and azimuthal structure can be engineered to modify the extinction cross-section, to control the scattering directivity, and to provide polarization-dependent scattering. We will show that dielectric core-shell structures with similar macroscopic sizes can have both high scattering cross-sections and radically different scattering phase functions. In addition, a specific structural design, which breaks the azimuthal symmetry of the spherical particle, can be implemented to control the polarization properties of the scattered radiation. Moreover, we will also demonstrate that the power flow around mesoscopic scattering particles can be controlled by modifying their internal heterogeneous structures. Lastly, we will show how the statistical properties of the radiation emerging from mesoscopic systems can be utilized for surface and subsurface diagnostics. In this dissertation, we will demonstrate that the intensity distributions measured in the near field of composite materials are direct signatures of the scale-dependent morphology, which is determined by variations of the local dielectric function. We will also prove that measuring the extent of spatial coherence in the proximity of two-dimensional interfaces constitutes a rather general method for characterizing the defect density in crystalline materials. Finally, we will show that adjusting the spatial coherence properties of the radiation can provide a simple solution for a significant deficiency of near-field microscopy. We will demonstrate experimentally that spurious interference effects can be efficiently eliminated in passive near-field imaging by implementing random illumination.
- Date Issued
- 2017
- Identifier
- CFE0006647, ucf:51253
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006647