- Title
- The Comparison of the School District Curriculum Alignment with Algebra Content Standards.
- Creator
-
Lipscomb, Karen, Murray, Barbara, Doherty, Walter, Baldwin, Lee, Pawlas, George, University of Central Florida
- Abstract / Description
-
The purpose of this study was to analyze school district curriculum alignment with state and national standards to find content omissions that may contribute to low Algebra End-of-Course exam scores in ninth grade. The study primarily looked for algebra course content omissions in the algebra, functions, and statistics domains of the algebra curriculum. These three categories were chosen because low achievement for ninth-grade students was recorded in each category for a Medium Sized Rural School District. The study also examined the pre-algebra curriculum for a Medium Sized Rural School District to see if alignment was present with the algebra curriculum. Embedded skills needed for algebra success were also recorded to develop an in-depth look at the curriculum alignment. The embedded skills are skills that should be mastered before students are placed in the pre-algebra course.

The algebra state standards were compared with the Medium Sized Rural School District's local algebra standards. From the local standards, 95 coded algebra skills were established as pertinent for mastery of algebra content. The 95 coded algebra skills were used in the constant comparison document analysis to find content omissions in the algebra curriculum, the pre-algebra curriculum, and the algebra textbook. The 95 coded algebra skills were also examined individually to record the embedded skills needed for mastery of each skill. An additional study was performed on the amount of time given to the mastery of the 95 coded algebra skills, or performance tasks.

The following results were found in this research for curriculum alignment. In the Medium Sized Rural School District, the algebra curriculum and algebra textbook were analyzed for the presence of the 95 essential performance tasks in search of missing content. The algebra curriculum and algebra textbook were both found to be aligned with the algebra state standards. These findings allow educators to look at other factors that may contribute to low performance on the Algebra End-of-Course exam. Content omissions were found in the pre-algebra curriculum that showed a lack of alignment with the algebra course. Also, 77 embedded skills were recorded as prerequisites to algebra mastery. Last, the amount of material to be mastered in a ninth-grade algebra course may simply be too great for ninth-grade students to master. (An illustrative sketch of the coverage check follows this record.)
- Date Issued
- 2016
- Identifier
- CFE0006348, ucf:51574
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006348
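The skill-by-skill coverage check at the heart of the constant comparison document analysis reduces, in its simplest reading, to set subtraction between the coded skill list and the skills a curriculum document addresses. A minimal sketch, with hypothetical skill codes standing in for the study's 95 coded skills:

```python
# Hypothetical sketch of a coded-skill coverage check. The skill codes and
# document contents are invented, not taken from the study.
CODED_SKILLS = {"A.1", "A.2", "F.1", "F.2", "S.1"}  # stand-ins for the 95 skills

def content_omissions(coded_skills: set[str], covered: set[str]) -> set[str]:
    """Return the coded skills a curriculum document never addresses."""
    return coded_skills - covered

algebra_curriculum = {"A.1", "A.2", "F.1", "F.2", "S.1"}  # fully aligned
pre_algebra_curriculum = {"A.1", "F.1"}                   # partially aligned

print(content_omissions(CODED_SKILLS, algebra_curriculum))      # set()
print(content_omissions(CODED_SKILLS, pre_algebra_curriculum))  # {'A.2', 'F.2', 'S.1'}
```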
- Title
- Model Selection via Racing.
- Creator
-
Zhang, Tiantian, Georgiopoulos, Michael, Anagnostopoulos, Georgios, Wu, Annie, Hu, Haiyan, Nickerson, David, University of Central Florida
- Abstract / Description
-
Model Selection (MS) is an important aspect of machine learning, as necessitated by the No Free Lunch theorem. Briefly speaking, the task of MS is to identify a subset of models that are optimal in terms of pre-selected optimization criteria. There are many practical applications of MS, such as model parameter tuning, personalized recommendations, A/B testing, etc. Lately, some MS research has focused on trading off exactness of the optimization against the computational burden it entails. Recent attempts along this line include metaheuristic optimization, local search-based approaches, sequential model-based methods, portfolio algorithm approaches, and multi-armed bandits.

Racing Algorithms (RAs) are an active research area in MS; they trade a reduced, but acceptable, likelihood that the models returned are indeed optimal among the given ensemble of models for lower computational cost. All existing RAs in the literature are designed as Single-Objective Racing Algorithms (SORAs) for Single-Objective Model Selection (SOMS), where a single optimization criterion is considered for measuring the goodness of models. Moreover, they are offline algorithms in which MS occurs before model deployment and the selected models are optimal in terms of their overall average performance on a validation set of problem instances. This work investigates racing approaches along two distinct directions: Extreme Model Selection (EMS) and Multi-Objective Model Selection (MOMS).

In EMS, given a problem instance and a limited computational budget shared among all the candidate models, one is interested in maximizing the final solution quality. In such a setting, MS occurs during model comparison in terms of maximum performance and involves no model validation. EMS is a natural framework for many applications, yet EMS problems remain unaddressed by current racing approaches. In this work, the first RA for EMS, named Max-Race, is developed; it optimizes the extreme solution quality by automatically allocating the computational resources among an ensemble of problem solvers for a given problem instance. In Max-Race, a significant difference between the extreme performances of any pair of models is statistically inferred via a parametric hypothesis test under the Generalized Pareto Distribution (GPD) assumption. Experimental results have confirmed that Max-Race is capable of identifying the best extreme model with high accuracy and low computational cost.

Furthermore, in machine learning, as well as in many real-world applications, a variety of MS problems are multi-objective in nature. MS that simultaneously considers multiple optimization criteria is referred to as MOMS. Under this scheme, a set of Pareto optimal models is sought that reflects a variety of compromises between optimization objectives. So far, MOMS problems have received little attention in the relevant literature. Therefore, this work also develops the first Multi-Objective Racing Algorithm (MORA) for a fixed-budget setting, namely S-Race. S-Race addresses MOMS in the proper sense of Pareto optimality. Its key decision mechanism is the non-parametric sign test, which is employed for inferring pairwise dominance relationships. Moreover, S-Race is able to strictly control the overall probability of falsely eliminating any non-dominated models at a user-specified significance level. Additionally, SPRINT-Race, the first MORA for a fixed-confidence setting, is also developed. In SPRINT-Race, pairwise dominance and non-dominance relationships are established via the Sequential Probability Ratio Test with an indifference zone. Moreover, the overall probability of falsely eliminating any non-dominated models or mistakenly retaining any dominated models is controlled at a prescribed significance level. Extensive experimental analysis has demonstrated the efficiency and advantages of both S-Race and SPRINT-Race in MOMS. (A sketch of the sign-test dominance decision follows this record.)
- Date Issued
- 2016
- Identifier
- CFE0006203, ucf:51094
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006203
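S-Race's key decision mechanism, the non-parametric sign test for pairwise dominance, can be illustrated in a few lines: given paired per-instance scores of two models, count wins and test whether one model beats the other more often than chance. The data and significance level below are invented, and the real algorithm applies the test per objective and controls the overall error across many pairwise comparisons; this is only a sketch of the single-comparison building block.

```python
# Minimal sketch of sign-test-based pairwise dominance, the decision
# mechanism S-Race is described as using. Scores are made up.
from scipy.stats import binomtest

def dominates(scores_a, scores_b, alpha=0.05):
    """Infer whether model A dominates model B from paired scores
    (higher is better) via a one-sided sign test."""
    wins = sum(a > b for a, b in zip(scores_a, scores_b))
    trials = sum(a != b for a, b in zip(scores_a, scores_b))  # ties are dropped
    if trials == 0:
        return False
    # H0: P(A beats B) <= 1/2; rejection suggests A dominates B
    return binomtest(wins, trials, p=0.5, alternative="greater").pvalue < alpha

a = [0.81, 0.79, 0.84, 0.80, 0.83, 0.82, 0.85, 0.78]
b = [0.76, 0.80, 0.79, 0.77, 0.78, 0.75, 0.80, 0.74]
print(dominates(a, b))  # True: A wins 7 of 8 paired comparisons
```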
- Title
- Modeling User Transportation Patterns Using Mobile Devices.
- Creator
-
Davami, Erfan, Sukthankar, Gita, Gonzalez, Avelino, Foroosh, Hassan, Sukthankar, Rahul, University of Central Florida
- Abstract / Description
-
Participatory sensing frameworks use humans and their computing devices as a large mobile sensing network. Dramatic gains in accessibility and affordability have turned mobile devices (smartphones and tablet computers) into the most popular computational machines in the world, exceeding laptops. By the end of 2013, more than 1.5 billion people on earth were expected to have a smartphone. Increased coverage and higher speeds of cellular networks have given these devices the power to constantly stream large amounts of data. Most mobile devices are equipped with advanced sensors such as GPS, cameras, and microphones. This expansion in smartphone numbers and power has created a sensing system capable of achieving tasks practically impossible for conventional sensing platforms. One of the advantages of participatory sensing platforms is their mobility, since human users are often in motion.

This dissertation presents a set of techniques for modeling and predicting user transportation patterns from cell-phone and social media check-ins. To study large-scale transportation patterns, I created a mobile phone app, Kpark, for estimating parking lot occupancy on the UCF campus. Kpark aggregates individual user reports on parking space availability to produce a global picture across all the campus lots using crowdsourcing. An issue with crowdsourcing is the possibility of receiving inaccurate information from users, whether through error or malicious intent. One method of combating this problem is to model the trustworthiness of individual participants and use that information to selectively include or discard data.

This dissertation presents a comprehensive study of the performance of different worker quality and data fusion models with plausible simulated user populations, as well as an evaluation of their performance on real data obtained from a full release of the Kpark app on the UCF Orlando campus. To evaluate individual trust prediction methods, an algorithm selection portfolio was introduced to take advantage of the strengths of each method and maximize the overall prediction performance. Like many other crowdsourced applications, user incentivization is an important aspect of creating a successful crowdsourcing workflow. For this project a form of non-monetized incentivization called gamification was used to create competition among users, with the aim of increasing the quantity and quality of data submitted to the project. This dissertation reports on the performance of Kpark at predicting parking occupancy, increasing user app usage, and predicting worker quality. (A sketch of one standard worker-trust model follows this record.)
- Date Issued
- 2015
- Identifier
- CFE0005597, ucf:50258
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005597
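One common way to model worker trustworthiness in crowdsensing, consistent with the trust modeling the abstract describes (though not necessarily the model Kpark used), is a Beta-reputation score updated from ground-truthed correct and incorrect reports. A minimal sketch with made-up outcomes:

```python
# Hedged sketch: Beta-reputation worker-trust model for crowdsourced
# reports. This is a standard approach, not necessarily Kpark's own.
from dataclasses import dataclass

@dataclass
class WorkerTrust:
    correct: int = 1    # Beta prior alpha; (1, 1) is a uniform prior
    incorrect: int = 1  # Beta prior beta

    def update(self, report_was_correct: bool) -> None:
        if report_was_correct:
            self.correct += 1
        else:
            self.incorrect += 1

    @property
    def trust(self) -> float:
        """Posterior mean probability that this worker reports correctly."""
        return self.correct / (self.correct + self.incorrect)

w = WorkerTrust()
for outcome in [True, True, False, True, True]:  # ground-truthed reports
    w.update(outcome)
print(f"trust = {w.trust:.2f}")  # weight or discard this worker's next report
```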
- Title
- STATISTICAL ANALYSIS OF VISIBLE ABSORPTION SPECTRA AND MASS SPECTRA OBTAINED FROM DYED TEXTILE FIBERS.
- Creator
-
White, Katie, Sigman, Michael, University of Central Florida
- Abstract / Description
-
The National Academy of Sciences recently published a report which calls for improvements to the field of forensic science. The report criticized many forensic disciplines for failure to establish rigorously tested methods of comparison, and encouraged more research in these areas to establish limitations and assess error rates. This study applies chemometric and statistical methods to current and developing analytical techniques in fiber analysis. In addition to analysis of commercially available dyed textile fibers, two pairs of dyes are selected for custom fabric dyeing based on the similarities of their absorbance spectra and dye molecular structures. Visible absorption spectra for all fiber samples are collected using microspectrophotometry (MSP), and mass spectra are collected using electrospray ionization (ESI) mass spectrometry. Statistical calculations are performed using commercial software packages and software written in-house.

Levels of Type I and Type II error are examined for fiber discrimination based on hypothesis testing of visible absorbance spectral profiles using a nonparametric permutation method. This work also explores evaluation of known and questioned fiber populations by comparing the distributions of statistical p-values from questioned-known fiber comparisons with those from known-fiber self-comparisons. Results from the hypothesis testing are compared with principal components analysis (PCA) and discriminant analysis (DA) of visible absorption spectra, as well as PCA and DA of ESI mass spectra. The sensitivity of the statistical approach is also discussed in terms of how instrumental parameters and sampling methods may influence error rates. (A sketch of a generic permutation test follows this record.)
- Date Issued
- 2010
- Identifier
- CFE0003454, ucf:48396
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003454
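The nonparametric permutation method used for hypothesis testing of absorbance spectra can be sketched generically: pool replicate spectra from two fibers, repeatedly shuffle the group labels, and compare the observed between-group distance with its permutation distribution. The distance metric, replicate counts, and synthetic "spectra" below are illustrative assumptions, not the study's actual choices.

```python
# Hedged sketch of a nonparametric permutation test between two sets of
# replicate spectra. Euclidean distance between group means is an
# illustrative test statistic.
import numpy as np

rng = np.random.default_rng(0)

def permutation_pvalue(group1, group2, n_perm=5000):
    """p-value for H0: both groups of spectra come from one population."""
    pooled = np.vstack([group1, group2])
    n1 = len(group1)
    def stat(a, b):
        return np.linalg.norm(a.mean(axis=0) - b.mean(axis=0))
    observed = stat(group1, group2)
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        if stat(pooled[idx[:n1]], pooled[idx[n1:]]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

# toy "spectra": 10 replicates x 200 wavelengths per fiber
fiber_a = rng.normal(1.00, 0.05, (10, 200))
fiber_b = rng.normal(1.05, 0.05, (10, 200))
print(permutation_pvalue(fiber_a, fiber_b))  # small p -> fibers discriminated
```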
- Title
- STRUCTURAL HEALTH MONITORING WITH EMPHASIS ON COMPUTER VISION, DAMAGE INDICES, AND STATISTICAL ANALYSIS.
- Creator
-
ZAURIN, RICARDO, CATBAS, F. NECATI, University of Central Florida
- Abstract / Description
-
Structural Health Monitoring (SHM) is the sensing and analysis of a structure to detect abnormal behavior, damage, and deterioration during regular operations as well as under extreme loadings. SHM is designed to provide objective information for decision-making on safety and serviceability. This research focuses on the SHM of bridges by developing and integrating novel methods and techniques using sensor networks, computer vision, modeling for damage indices, and statistical approaches. Effective use of traffic video synchronized with sensor measurements for decision-making is demonstrated.

First, some of the computer vision methods and how they can be used for bridge monitoring are presented, along with the most common issues and some practical solutions. Second, a conceptual damage index (the Unit Influence Line, UIL) is formulated using synchronized computer images and sensor data for tracking the structural response under various load conditions. Third, a new index, Nd, is formulated and demonstrated to more effectively identify, localize, and quantify damage. Commonly observed damage conditions on real bridges are simulated on a laboratory model for the demonstration of the computer vision method, the UIL, and the new index. This new method and index, which are based on outlier detection from the UIL population, can handle large sets of monitoring data very effectively. The methods and techniques are demonstrated on the laboratory model for damage detection, and all damage scenarios are identified successfully. Finally, the application of the proposed methods on a real-life structure, which has a monitoring system, is presented. It is shown that these methods can be used efficiently for applications such as damage detection and load rating for decision-making. The results from this monitoring project on a movable bridge are presented, along with conclusions and recommendations for future work. (A generic outlier-detection sketch follows this record.)
- Date Issued
- 2009
- Identifier
- CFE0002890, ucf:48039
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002890
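The outlier-detection idea behind the UIL-based damage index can be illustrated with a generic sketch: treat each crossing's influence line as a vector and flag vectors far from the healthy baseline. The robust z-score rule here is an illustrative stand-in for the dissertation's Nd index, and the data are synthetic.

```python
# Hedged sketch: outlier detection over a population of Unit Influence
# Lines (UILs). A robust z-score stands in for the dissertation's Nd index.
import numpy as np

rng = np.random.default_rng(1)

def outlier_flags(uils: np.ndarray, threshold: float = 3.5) -> np.ndarray:
    """Flag UILs whose distance from the median UIL is anomalous.
    uils: (n_events, n_points) influence lines from successive loadings."""
    baseline = np.median(uils, axis=0)
    dist = np.linalg.norm(uils - baseline, axis=1)
    mad = np.median(np.abs(dist - np.median(dist))) or 1e-12
    robust_z = 0.6745 * (dist - np.median(dist)) / mad
    return robust_z > threshold

healthy = rng.normal(0.0, 0.01, (40, 100))
damaged = healthy.copy()
damaged[-3:] += 0.08  # last three crossings occur after simulated damage
print(outlier_flags(damaged).nonzero()[0])  # -> [37 38 39]
```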
- Title
- Data-Driven Modeling and Optimization of Building Energy Consumption.
- Creator
-
Grover, Divas, Pourmohammadi Fallah, Yaser, Vosoughi, Azadeh, Zhou, Qun, University of Central Florida
- Abstract / Description
-
Sustainability and reducing energy consumption are targets for building operations. The installation of smart sensors and Building Automation Systems (BAS) makes it possible to study facility operations under different circumstances. These technologies generate large amounts of data, which can be scraped and used for analysis. In this thesis, we focus on the process of data-driven modeling and decision making, from scraping the data to simulating the building and optimizing its operation. The City of Orlando has similar goals of sustainability and reduced energy consumption, so it provided us access to its BAS so that we could collect data and study the operation of its facilities. The data scraped from the City's BAS servers can be used to develop statistical/machine learning methods for decision making. We selected a mid-size pilot building to apply these techniques.

The process begins with the collection of data from the BAS. An Application Programming Interface (API) is developed to log in to the servers, scrape data for all data points, and store it on the local machine. The data are then cleaned for analysis and modeling. The dataset contains data points ranging from indoor and outdoor temperature to the fan speed inside the Air Handling Unit (AHU), which is operated by a Variable Frequency Drive (VFD). The whole dataset is a time series and is handled accordingly. The cleaned dataset is analyzed to find patterns and investigate relations between data points; this analysis guides the choice of parameters for the models developed in the next step. Different statistical models are developed to simulate building and equipment behavior. Finally, the models, along with the data, are used to optimize building operation under the equipment constraints, leading to a reduction in energy consumption while maintaining temperature and pressure inside the building. (A sketch of the scrape-and-clean step follows this record.)
- Date Issued
- 2019
- Identifier
- CFE0007810, ucf:52335
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007810
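The scrape-then-clean workflow the abstract describes can be sketched generically. The endpoint URL, authentication scheme, and JSON field names below are entirely hypothetical (BAS products expose very different APIs); only the shape of the pipeline, download a point's history, regularize the time index, and interpolate short gaps, is the point.

```python
# Hedged sketch of a BAS scrape-and-clean step. Endpoint, credentials,
# and field names are hypothetical placeholders.
import requests
import pandas as pd

BASE = "https://bas.example.org/api"  # hypothetical BAS server

def fetch_point(session: requests.Session, point_id: str) -> pd.DataFrame:
    """Download one data point's history and return a clean time series."""
    resp = session.get(f"{BASE}/points/{point_id}/history", timeout=30)
    resp.raise_for_status()
    df = pd.DataFrame(resp.json()["samples"])  # assumed [{'ts': ..., 'value': ...}]
    df["ts"] = pd.to_datetime(df["ts"])
    df = df.set_index("ts").sort_index()
    # typical cleaning: drop duplicates, resample to a regular grid, fill short gaps
    return df[~df.index.duplicated()].resample("15min").mean().interpolate(limit=4)

with requests.Session() as s:
    s.auth = ("user", "secret")  # placeholder credentials
    ahu_fan = fetch_point(s, "AHU1_FAN_SPEED")  # hypothetical point name
    print(ahu_fan.describe())
```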
- Title
- An Optimization of Thermodynamic Efficiency vs. Capacity for Communications Systems.
- Creator
-
Rawlins, Gregory, Wocjan, Pawel, Wahid, Parveen, Georgiopoulos, Michael, Jones, W Linwood, Mucciolo, Eduardo, University of Central Florida
- Abstract / Description
-
This work provides a fundamental view of the mechanisms which affect the power efficiency of communications processes, along with a method for efficiency enhancement. Shannon's work is the definitive source for analyzing the information capacity of a communications system, but his formulation does not predict an efficiency relationship suitable for calculating the power consumption of a system, particularly for practical signals which may only approach the capacity limit. This work leverages Shannon's results while providing additional insight through physical models which enable the calculation and improvement of efficiency for the encoding of signals.

The proliferation of mobile communications platforms is challenging network capacity, largely because of the ever-increasing data rate at each node. This places significant power management demands on personal computing devices as well as cellular and WLAN terminals. The increased data throughput translates to shorter mean time between battery charging cycles and an increased thermal footprint. Solutions are developed herein to counter this trend. Hardware was constructed to measure the efficiency of a prototypical Gaussian signal prior to efficiency enhancement. After an optimization was performed, the efficiency of the encoding apparatus increased from 3.125% to greater than 86% for a manageable investment of resources. Likewise, several telecommunications standards-based waveforms were also tested on the same hardware. The results reveal that the developed physical theories extrapolate in a very accurate manner to an electronics application, predicting the efficiency of single-ended and differential encoding circuits before and after optimization. (The capacity limit referenced here is recalled after this record.)
- Date Issued
- 2015
- Identifier
- CFE0006051, ucf:50994
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006051
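For reference, the capacity limit the abstract refers to is Shannon's formula for a band-limited channel with additive white Gaussian noise; as the abstract notes, it bounds the achievable rate but says nothing directly about the encoder's power efficiency, which is the gap this work addresses.

```latex
% Shannon capacity of an AWGN channel: bandwidth B, signal power S,
% noise power N. It bounds the rate, not the encoder's efficiency.
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits/s}
```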
- Title
- THEORETICAL AND NUMERICAL STUDIES OF PHASE TRANSITIONS AND ERROR THRESHOLDS IN TOPOLOGICAL QUANTUM MEMORIES.
- Creator
-
Jouzdani, Pejman, Mucciolo, Eduardo, Chang, Zenghu, Leuenberger, Michael, Abouraddy, Ayman, University of Central Florida
- Abstract / Description
-
This dissertation is the collection of progressive research on the topic of topological quantum computation and information, with a focus on the error thresholds of well-known models such as the unpaired Majorana, the toric code, and the planar code. We study the basics of quantum computation and quantum information, and in particular quantum error correction. Quantum error correction provides a tool for enhancing the fidelity of quantum computation in the noisy environment of the real world. We begin with a brief introduction to stabilizer codes. The stabilizer formalism of the theory of quantum error correction gives a well-defined description of quantum codes that is used throughout this dissertation. Then, we turn our attention to a quite new subject, namely, topological quantum codes. Topological quantum codes take advantage of the topological characteristics of a physical many-body system. The physical many-body systems studied in the context of topological quantum codes are of two essential natures: they either have intrinsic interactions that self-correct errors, or are actively corrected to be maintained in a desired quantum state. Examples of the former are the toric code and the unpaired Majorana, while an example of the latter is the surface code.

A brief introduction and history of topological phenomena in condensed matter is provided. The unpaired Majorana and the Kitaev toy model are briefly explained. Later we introduce a spin model that maps onto the Kitaev toy model through a sequence of transformations. We show how this model is robust and tolerates local perturbations. The research on this topic, at the time of writing this dissertation, is still incomplete and only preliminary results are presented.

As another example of passive error-correcting codes with an intrinsic Hamiltonian, the toric code is introduced. We also analyze the dynamics of the errors in the toric code, known as anyons. We show numerically how the addition of disorder to the physical system underlying the toric code slows down the dynamics of the anyons. We go further and numerically analyze the presence of time-dependent noise and the consequent delocalization of localized errors.

The main portion of this dissertation is dedicated to the surface code. We study the surface code coupled to a non-interacting bosonic bath. We show how the interaction between the code and the bosonic bath can effectively induce correlated errors. These correlated errors may be corrected up to some extent. The extent beyond which quantum error correction seems impossible is the error threshold of the code. This threshold is analyzed by mapping the effective correlated error model onto a statistical model. We then study the phase transition in the statistical model. The analysis is in two parts. First, we carry out the derivation of the effective correlated error model, its mapping onto a statistical model, and an exact numerical analysis. Second, we employ a Monte Carlo method to extend the numerical analysis to large system sizes. We also tackle the problem of the surface code with correlated and single-qubit errors by an exact mapping onto a two-dimensional Ising model with boundary fields. We show how the phase transition point in one model, the Ising model, coincides with the intrinsic error threshold of the other model, the surface code. (A generic Monte Carlo sketch follows this record.)
- Date Issued
- 2014
- Identifier
- CFE0005512, ucf:50314
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005512
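Threshold estimates via statistical-model mappings are typically probed with Metropolis Monte Carlo. A minimal single-spin-flip sketch for a clean 2D Ising model follows; the dissertation's mapped model additionally involves quenched disorder and boundary fields, so this only shows the generic machinery.

```python
# Hedged sketch: Metropolis Monte Carlo for a 2D Ising model, the generic
# tool behind phase-transition (and hence threshold) estimates. Clean
# ferromagnetic couplings; the actual mapped model is disordered.
import numpy as np

rng = np.random.default_rng(42)
L, beta = 32, 0.44  # near the clean Ising critical point, beta_c ~ 0.4407
spins = rng.choice([-1, 1], size=(L, L))

def sweep(spins: np.ndarray) -> None:
    for _ in range(spins.size):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

for _ in range(200):  # equilibrate
    sweep(spins)
print("magnetization per spin:", abs(spins.mean()))
```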
- Title
- THE EFFECT OF UNEMPLOYMENT ON DEMOCRATIC WARFARE.
- Creator
-
Rakower, Andres, Vasquez, Joseph Paul, Kang, Kyungkook, University of Central Florida
- Abstract / Description
-
This study was done to see the effects of a war on the economy and the internal politics of the United States. In selecting the engagement we would study, we agreed that the Iraq War would be aided by a large amount of public opinion sampling that was more nuanced than in previous wars. The Iraq War was a very complicated war, as it was controversial from the beginning and became a political issue while continuing to be a war fought by Americans abroad. Based on the literature, there were many starting effects and assumptions that were accounted for, such as the "rally round the flag" effect. As a historical landmark, the Iraq War is important for being a significant conflict after the Vietnam War, another very controversial conflict in the eyes of the American public. The hypotheses that I presented were not supported by the data. The impact of the war on the economy was not strong enough to create the pressure required for the model I created to apply. In this model, the economic problems faced domestically could lead to more unemployment and therefore to higher military recruitment rates. While this was partially true in 2008, the consequence was not a significantly higher number of people in the military. Ultimately, this project needs to be repeated in a more thorough setting where effects may be compared with those of other similar countries in similar scenarios.
- Date Issued
- 2018
- Identifier
- CFH2000435, ucf:45816
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFH2000435
- Title
- Harnessing Spatial Intensity Fluctuations for Optical Imaging and Sensing.
- Creator
-
Akhlaghi Bouzan, Milad, Dogariu, Aristide, Saleh, Bahaa, Pang, Sean, Atia, George, University of Central Florida
- Abstract / Description
-
Properties of light such as amplitude and phase, temporal and spatial coherence, polarization, etc. are abundantly used for sensing and imaging. Regardless of the passive or active nature of the sensing method, optical intensity fluctuations are always present! While these fluctuations are usually regarded as noise, there are situations where one can harness them to enhance certain attributes of the sensing procedure. In this thesis, we developed different sensing methodologies that use the statistical properties of optical fluctuations for gauging specific information. We examine this concept in the context of three different aspects of computational optical imaging and sensing.

First, we study imposing specific statistical properties on the probing field to image or characterize certain properties of an object through a statistical analysis of the spatially integrated scattered intensity. This offers unique capabilities for imaging and sensing techniques operating in highly perturbed environments and low-light conditions. Next, we examine optical sensing in the presence of strong perturbations that preclude any controllable field modification. We demonstrate that inherent properties of diffused coherent fields and fluctuations of integrated intensity can be used to track objects hidden behind obscurants. Finally, we address situations where, due to coherent noise, image accuracy is severely degraded by intensity fluctuations. By taking advantage of the spatial coherence properties of optical fields, we show that this limitation can be effectively mitigated and that a significant improvement in the signal-to-noise ratio can be achieved even in a single-shot measurement.

The findings included in this dissertation illustrate different circumstances where optical fluctuations can affect the efficacy of computational optical imaging and sensing. A broad range of applications, including biomedical imaging and remote sensing, could benefit from the new approaches to suppress, enhance, and exploit optical fluctuations described in this dissertation. (A sketch quantifying intensity fluctuations follows this record.)
- Date Issued
- 2017
- Identifier
- CFE0007274, ucf:52200
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007274
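Intensity fluctuations of the kind discussed here are commonly quantified by the normalized second moment of intensity, g2 = <I^2>/<I>^2: g2 = 1 for a stable coherent beam, 2 for fully developed thermal speckle, and values above 2 indicate stronger-than-thermal fluctuations. A toy sketch with synthetic fields (the field models are illustrative, not the experiments in this dissertation):

```python
# Hedged sketch: classify intensity statistics via the normalized second
# moment g2 = <I^2>/<I>^2 (thermal speckle gives g2 = 2). Synthetic data.
import numpy as np

rng = np.random.default_rng(7)

def g2(intensity: np.ndarray) -> float:
    return float((intensity**2).mean() / intensity.mean()**2)

# circular-Gaussian field -> fully developed speckle, g2 -> 2
field = rng.normal(size=100_000) + 1j * rng.normal(size=100_000)
thermal = np.abs(field)**2

coherent = np.full(100_000, 1.0)                     # constant intensity, g2 -> 1
super_th = thermal * rng.exponential(1.0, 100_000)   # extra fluctuations, g2 -> 4

for name, I in [("coherent", coherent), ("thermal", thermal), ("super", super_th)]:
    print(name, round(g2(I), 2))
```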
- Title
- The AfterMath: A Culturally Responsive Mathematical Intervention to Aid Students Affected by Natural Disasters.
- Creator
-
Kurtz, Brianna, Haciomeroglu, Erhan, Bush, Sarah, Safi, Farshid, Biraimah, Karen, University of Central Florida
- Abstract / Description
-
On September 20, 2017, Hurricane Maria struck the island of Puerto Rico. The damage was extensive, and many people found themselves natural disaster refugees. As a result, schools in Central Florida saw an influx of new students who had had their educations interrupted by the disaster and were now resuming school in a new language of instruction. These students faced not only linguistic challenges but also academic differences due to the high prevalence of poverty and the effects of neocolonialism in their previous schooling. This mixed-methods study implemented an intensive intervention in probability to aid students in developing mathematical understanding and forming meaningful connections. Student participants, who had been affected by Hurricane Maria, were now attending a public high school and were paired one-on-one with a bilingual, mathematically high-performing student mentor to complete culturally responsive, bilingual probability tasks. Data collection occurred over the course of six weeks in fall 2019. Both mentor and mentee students participated in focus group interviews, and the mentees completed a probability pre-test and post-test. Student participants showed statistically significant increases in the understanding of probability concepts when comparing pre-intervention and post-intervention results, with the understanding and usage of the multiplication rule showing the most significant improvement. Both mentors and mentees reported feeling a stronger sense of unity and belonging post-intervention, as well as improvement in bilingual academic vocabulary. With the impact of natural disasters on the rise, implications of this study include its adaptation to respond to future displaced students as they resume schooling post-interruption in Central Florida and beyond.
- Date Issued
- 2019
- Identifier
- CFE0007828, ucf:52820
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007828
- Title
- Exploration and development of crash modification factors and functions for single and multiple treatments.
- Creator
-
Park, Juneyoung, Abdel-Aty, Mohamed, Radwan, Essam, Eluru, Naveen, Wang, Chung-Ching, Lee, JaeYoung, University of Central Florida
- Abstract / Description
-
Traffic safety is a major concern for the public, and it is an important component of the roadway management strategy. In order to improve highway safety, extensive efforts have been made by researchers, transportation engineers, and Federal, State, and local government officials. With these consistent efforts, both fatality and injury rates from road traffic crashes in the United States declined steadily over six years (2006~2011). However, according to the National Highway Traffic Safety Administration (NHTSA, 2013), 33,561 people died in motor vehicle traffic crashes in the United States in 2012, compared to 32,479 in 2011; this was the first increase in fatalities since 2005. Moreover, in 2012, an estimated 2.36 million people were injured in motor vehicle traffic crashes, compared to 2.22 million in 2011. In response to the demand for highway safety improvements through systematic analysis of specific roadway cross-section elements and treatments, the Highway Safety Manual (HSM) (AASHTO, 2010) was developed by the Transportation Research Board (TRB) to introduce a science-based technical approach for safety analysis. One of the main parts of the HSM, Part D, contains crash modification factors (CMFs) for various treatments on roadway segments and at intersections. A CMF is a factor that estimates the potential change in crash frequency as a result of implementing a specific treatment (or countermeasure). CMFs in Part D have been developed using high-quality observational before-after studies that account for the regression-to-the-mean threat. Observational before-after studies are the most common methods for evaluating safety effectiveness and calculating CMFs of specific roadway treatments. The cross-sectional method has also commonly been used to derive CMFs, since the required data are easier to collect than for before-after methods.

Although various CMFs have been calculated and introduced in the HSM, critical limitations remain to be investigated. First, the HSM provides CMFs for single treatments, but not for multiple treatments applied to the same roadway segment. The HSM suggests multiplying CMFs to estimate the combined safety effect of single treatments, but cautions that this multiplication may over- or under-estimate the combined effect. In this dissertation, several methodologies are proposed to estimate more reliable combined safety effects in both observational before-after studies and the cross-sectional method. Averaging the two best combining methods is suggested to account for the effects of over- or under-estimation. Moreover, it is recommended to develop adjustment factors and functions (i.e., weighting factors and functions) to estimate the safety performance of multiple treatments more accurately. Multivariate adaptive regression splines (MARS) modeling is proposed to avoid the over-estimation problem through consideration of interaction effects between variables.

Second, the variation of CMFs with different roadway characteristics among treated sites over time is ignored, because a CMF is a fixed value that represents the overall safety effect of the treatment for all treated sites over a specific time period. Recently, a few studies have developed crash modification functions (CMFunctions) to overcome this limitation. However, although previous studies assessed the effect of a single variable such as AADT on CMFs, there is a lack of prior work on the variation of safety effects across treated sites with different multiple roadway characteristics over time. In this study, various multivariate linear and nonlinear modeling techniques are adopted to develop CMFunctions. Multiple linear regression modeling can be utilized to consider multiple roadway characteristics. To reflect the nonlinearity of predictors, a regression model with a nonlinearizing link function is developed. The Bayesian approach can also be adopted, owing to its strength in avoiding the over-fitting that occurs when the number of observations is limited and the number of variables is large. Moreover, two data mining techniques (gradient boosting and MARS) are suggested 1) to achieve better performance of CMFunctions with consideration of variable importance, and 2) to reflect both the nonlinear trend of predictors and interaction effects between variables at the same time.

Third, the nonlinearity of variables in the cross-sectional method is not discussed in the HSM. Generally, the cross-sectional method is also known as the safety performance function (SPF) approach, and a generalized linear model (GLM) is applied to estimate SPFs. However, the CMFs estimated from a GLM cannot account for the nonlinear effect of the treatment, since the coefficients in the GLM are assumed to be fixed. In this dissertation, applications of the generalized nonlinear model (GNM) and MARS in the cross-sectional method are proposed. In GNMs, the nonlinear effects of independent variables on crash frequency can be captured through the development of a nonlinearizing link function. Moreover, MARS accommodates the nonlinearity of independent variables and interaction effects for complex data structures.

In this dissertation, CMFs and CMFunctions are estimated for various single and combined treatments for different roadway types (e.g., rural two-lane and rural multi-lane roadways, urban arterials, freeways, etc.):
1) Treatments for the roadway mainline: adding a through lane; converting 4-lane undivided roadways to 3-lane with a two-way left-turn lane (TWLTL).
2) Treatments for the roadway shoulder: installing shoulder rumble strips; widening the shoulder; adding bike lanes; changing bike lane width; installing roadside barriers.
3) Treatments related to roadside features: decreasing the density of driveways; decreasing the density of roadside poles; increasing the distance to roadside poles; increasing the distance to trees.

The expected contributions of this study are to 1) suggest approaches to estimate more reliable safety effects of multiple treatments, 2) propose methodologies to develop CMFunctions that assess the variation of CMFs with different characteristics among treated sites, and 3) recommend applications of GNM and MARS to simultaneously consider the interaction effects of multiple variables and the nonlinearity of predictors. Finally, potentially relevant applications beyond the scope of this research but worth future investigation are discussed. (A sketch of a simple before-after CMF calculation follows this record.)
- Date Issued
- 2015
- Identifier
- CFE0005861, ucf:50914
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005861
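A CMF from a simple before-after study with a comparison group can be sketched as the ratio of observed after-period crashes to the crashes expected had the treatment not been applied, with the expectation taken from the comparison group's trend. The empirical Bayes adjustment for regression to the mean, which HSM-grade studies use, is omitted here, and all counts are invented.

```python
# Hedged sketch: naive before-after CMF with a comparison group. Real
# HSM-quality studies add an empirical Bayes step to handle regression
# to the mean; the crash counts below are invented.
def cmf_before_after(treat_before, treat_after, comp_before, comp_after):
    """CMF = observed after-period crashes / crashes expected without treatment."""
    trend = comp_after / comp_before         # background crash trend
    expected_after = treat_before * trend    # expected had we not treated
    return treat_after / expected_after

cmf_rumble = cmf_before_after(120, 78, 400, 380)   # e.g. shoulder rumble strips
cmf_barrier = cmf_before_after(90, 70, 400, 380)   # e.g. roadside barriers
print(round(cmf_rumble, 2), round(cmf_barrier, 2))

# The HSM's (cautioned) rule for combined treatments multiplies CMFs,
# which can over- or under-estimate the joint effect:
print("naive combined CMF:", round(cmf_rumble * cmf_barrier, 2))
```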
- Title
- Multi-Level Safety Performance Functions for High Speed Facilities.
- Creator
-
Ahmed, Mohamed, Abdel-Aty, Mohamed, Radwan, Ahmed, Al-Deek, Haitham, Mackie, Kevin, Pande, Anurag, Uddin, Nizam, University of Central Florida
- Abstract / Description
-
High speed facilities are considered the backbone of any successful transportation system; Interstates, freeways, and expressways carry the majority of daily trips on the transportation network. Although these types of roads are relatively the safest among other road types, they still experience many crashes, many of them severe, which not only affect human lives but also can have tremendous economic and social impacts. These facts signify the necessity of enhancing the safety of these high speed facilities to ensure better and more efficient operation. Safety problems can be assessed through several approaches that help mitigate crash risk on both a long and a short term basis. Therefore, the main focus of the research in this dissertation is to provide a framework of risk assessment to promote safety and enhance mobility on freeways and expressways.

Multi-level Safety Performance Functions (SPFs) were developed at the aggregate level using historical crash data and the corresponding exposure and risk factors to identify and rank sites with promise (hot spots). Additionally, SPFs were developed at the disaggregate level utilizing real-time weather data collected from meteorological stations located along the freeway section as well as traffic flow parameters collected from different detection systems such as Automatic Vehicle Identification (AVI) and Remote Traffic Microwave Sensors (RTMS). These disaggregate SPFs can identify real-time risks due to turbulent traffic conditions and their interactions with other risk factors.

In this study, two main datasets were obtained from two different regions, comprising historical crash data, roadway geometric characteristics, aggregate weather and traffic parameters, and real-time weather and traffic data. At the aggregate level, Bayesian hierarchical models with spatial and random effects were compared to Poisson models to examine the safety effects of roadway geometrics on crash occurrence along freeway sections that feature mountainous terrain and adverse weather. At the disaggregate level, a framework for a proactive safety management system was provided, using traffic data collected from AVI and RTMS, real-time weather data, and geometric characteristics. Different statistical techniques were implemented, ranging from classical frequentist classification approaches, which explain the relationship between an event (crash) occurring at a given time and a set of real-time risk factors, to more advanced models. Bayesian statistics with an updating approach, in which prior knowledge is used to update beliefs about the behavior of a parameter for more reliable estimation, were implemented. A relatively recent and promising machine learning technique, Stochastic Gradient Boosting, was also utilized to calibrate several models using different datasets collected from mixed detection systems as well as real-time meteorological stations.

The results from this study suggest that both levels of analysis are important. The aggregate level helps provide a good understanding of the different safety problems and supports developing policies and countermeasures to reduce the total number of crashes. At the disaggregate level, real-time safety functions support a more proactive traffic management system that will not only enhance the performance of the high speed facilities and the whole traffic network but also provide safer mobility for people and goods.

In general, the proposed multi-level analyses are useful in providing roadway authorities with detailed information on where countermeasures must be implemented and when resources should be devoted. The study also proves that traffic data collected from different detection systems can be a useful asset that should be utilized appropriately, not only to alleviate traffic congestion but also to mitigate increased safety risks. The overall proposed framework can maximize the benefit of the existing archived data for freeway authorities as well as for road users. (A sketch of a basic SPF fit follows this record.)
- Date Issued
- 2012
- Identifier
- CFE0004508, ucf:49274
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004508
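An aggregate-level SPF is typically a negative binomial (overdispersed Poisson) regression of crash counts on exposure and geometry. A hedged sketch with synthetic data; the predictors, coefficients, and dispersion are invented, not this dissertation's calibrated models:

```python
# Hedged sketch: fitting a negative binomial SPF (crash counts vs. AADT
# and segment length), the standard aggregate-level model. Synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
aadt = rng.uniform(5_000, 80_000, n)
length = rng.uniform(0.5, 5.0, n)  # segment length, miles
mu = np.exp(-7.0 + 0.8 * np.log(aadt) + 1.0 * np.log(length))
crashes = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))  # overdispersed counts

X = sm.add_constant(np.column_stack([np.log(aadt), np.log(length)]))
spf = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(spf.params)  # intercept and elasticities w.r.t. AADT and length
```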
- Title
- Chemometric Applications to a Complex Classification Problem: Forensic Fire Debris Analysis.
- Creator
-
Waddell, Erin, Sigman, Michael, Belfield, Kevin, Campiglia, Andres, Yestrebsky, Cherie, Ni, Liqiang, University of Central Florida
- Abstract / Description
-
Fire debris analysis currently relies on visual pattern recognition of total ion chromatograms, extracted ion profiles, and target compound chromatograms to identify the presence of an ignitable liquid according to the ASTM International E1618-10 standard method. For large data sets, this methodology can be time consuming, and it is a subjective method whose accuracy depends upon the skill and experience of the analyst. This research aimed to develop an automated classification method for large data sets and investigated the use of the total ion spectrum (TIS). The TIS is calculated by taking an average mass spectrum across the entire chromatographic range and has been shown to contain sufficient information content for the identification of ignitable liquids. The TIS of ignitable liquids and substrates, defined as common building materials and household furnishings, were compiled into model data sets. Cross-validation (CV) and fire debris samples, obtained from laboratory-scale and large-scale burns, were used to test the models.

An automated classification method was developed using computational software, written in-house, that applies a multi-step classification scheme to detect ignitable liquid residues in fire debris samples and assign them to the classes defined in ASTM E1618-10. Classifications were made using linear discriminant analysis, quadratic discriminant analysis (QDA), and soft independent modeling of class analogy (SIMCA). Overall, the highest correct classification rates were achieved using QDA for the first step of the scheme and SIMCA for the remaining steps. In the first step of the classification scheme, correct classification rates of 95.3% and 89.2% were obtained for the CV test set and fire debris samples, respectively. Correct classification rates of 100% were achieved for both data sets in the majority of the remaining steps, which used SIMCA for classification. In this research, the first statistically valid error rates for fire debris analysis have been developed through cross-validation of large data sets. The error rates reduce the subjectivity associated with current methods and provide a level of confidence in sample classification that does not currently exist in forensic fire debris analysis. (A sketch of the TIS computation follows this record.)
- Date Issued
- 2013
- Identifier
- CFE0004954, ucf:49586
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004954
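The total ion spectrum itself is straightforward to compute from GC-MS data as the abstract defines it: average the mass spectrum over all chromatographic scans, then normalize. A minimal sketch, assuming the instrument data have already been arranged as a scans-by-m/z intensity matrix:

```python
# Hedged sketch: computing a total ion spectrum (TIS) by averaging the
# mass spectrum across the entire chromatographic range. Synthetic matrix.
import numpy as np

rng = np.random.default_rng(5)
# rows = chromatographic scans, columns = m/z channels (e.g. m/z 30-200)
gcms = rng.random((1200, 171))

def total_ion_spectrum(scans_by_mz: np.ndarray) -> np.ndarray:
    """Average over all scans, then normalize to unit total intensity."""
    tis = scans_by_mz.mean(axis=0)
    return tis / tis.sum()

tis = total_ion_spectrum(gcms)
print(tis.shape, round(tis.sum(), 6))  # (171,) 1.0 -> ready for QDA / SIMCA
```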
- Title
- Photon Statistics in Disordered Lattices.
- Creator
-
Kondakci, Hasan, Saleh, Bahaa, Abouraddy, Ayman, Christodoulides, Demetrios, Mucciolo, Eduardo, University of Central Florida
- Abstract / Description
-
Propagation of coherent waves through disordered media, whether optical, acoustic, or radio waves, results in a spatially redistributed random intensity pattern known as speckle -- a statistical phenomenon. The subject of this dissertation is the statistics of monochromatic coherent light traversing disordered photonic lattices and its dependence on the disorder class, the level of disorder, and the excitation configuration at the input. Throughout the dissertation, two disorder classes are considered, namely diagonal and off-diagonal disorder. The latter exhibits disorder-immune chiral symmetry -- the appearance of the eigenmodes in skew-symmetric pairs and of the corresponding eigenvalues in opposite-sign pairs.

When a disordered photonic lattice, an array of evanescently coupled waveguides, is illuminated with an extended coherent optical field, discrete speckle develops. Numerical simulations and analytical modeling reveal that discrete speckle shows a set of surprising features that are qualitatively indistinguishable in the two disorder classes. First, the fingerprint of transverse Anderson localization, associated with disordered lattices, is exhibited in the narrowing of the spatial coherence function. Second, the transverse coherence length (or speckle grain size) freezes upon propagation. Third, the axial coherence depth is independent of the axial position, resulting in a coherence voxel of fixed volume independent of position.

When a single lattice site is coherently excited, I discovered that a thermalization gap emerges for light propagating in disordered lattices endowed with disorder-immune chiral symmetry. In these systems, the span of sub-thermal photon statistics is inaccessible to the input coherent light, which -- once the steady state is reached -- always emerges with super-thermal statistics no matter how small the disorder level. An independent constraint on the input field for the chiral symmetry to be activated and the gap to be observed is formulated. This unique feature enables a new form of photon-statistics interferometry: by exciting two lattice sites with a variable relative phase, as in a traditional two-path interferometer, the excitation symmetry of the chiral mode pairs is judiciously broken and interferometric control over the photon statistics is exercised, spanning the sub-thermal and super-thermal regimes. By considering an ensemble of disorder realizations, this phenomenon is demonstrated experimentally: a deterministic tuning of the intensity fluctuations while the mean intensity remains constant.

Finally, I examined the statistics of the emerging light in two different lattice topologies: linear and ring lattices. I showed that the topology dictates the light statistics in the off-diagonal case: for even-sited ring and linear lattices, the electromagnetic field evolves into a single quadrature component, so that the field takes discrete phase values and is non-circular in the complex plane. As a consequence, the statistics become super-thermal. For odd-sited ring lattices, the field becomes random in both quadratures, resulting in sub-thermal statistics. However, this effect is suppressed by the transverse localization of light in lattices with high disorder. In the diagonal case, the lattice topology does not play a role, and the transmitted field always acquires random components in both quadratures; hence the phase distribution is uniform in the steady state. (A tight-binding simulation sketch follows this record.)
- Date Issued
- 2015
- Identifier
- CFE0005968, ucf:50786
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005968
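The setting described, single-site excitation of a waveguide array with off-diagonal (coupling) disorder, can be explored numerically with a standard tight-binding model: evolve the input through the lattice Hamiltonian and collect intensity statistics over a disorder ensemble. All parameters below (lattice size, propagation distance, disorder strength) are illustrative, and the output-site statistic is only a crude proxy for the dissertation's analysis.

```python
# Hedged sketch: single-site excitation propagating through a disordered
# waveguide array (tight-binding model with off-diagonal disorder), with
# intensity statistics gathered over a disorder ensemble.
import numpy as np

rng = np.random.default_rng(11)
N, z, realizations = 40, 30.0, 2000  # sites, propagation distance, ensemble size

def output_intensity(disorder=0.5):
    c = 1.0 + disorder * (rng.random(N - 1) - 0.5)  # random couplings
    H = np.diag(c, 1) + np.diag(c, -1)              # chiral-symmetric Hamiltonian
    vals, vecs = np.linalg.eigh(H)
    psi0 = np.zeros(N); psi0[N // 2] = 1.0          # single-site input
    psi = vecs @ (np.exp(1j * vals * z) * (vecs.T @ psi0))
    return np.abs(psi[N // 2])**2                   # intensity at the input site

I = np.array([output_intensity() for _ in range(realizations)])
# normalized second moment; values above 2 indicate super-thermal statistics
print("g2 =", round(float((I**2).mean() / I.mean()**2), 2))
```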
- Title
- STATISTICAL ANALYSIS OF DEPRESSION AND SOCIAL SUPPORT CHANGE IN ARAB IMMIGRANT WOMEN IN USA.
- Creator
-
Blbas, Hazhar, Uddin, Nizam, Nickerson, David, Aroian, Karen, University of Central Florida
- Abstract / Description
-
Arab Muslim immigrant women encounter many stressors and are at risk for depression. Social supports from husbands, family and friends are generally considered mitigating resources for depression. However, changes in social support over time and the effects of such supports on depression at a future time period have not been fully addressed in the literature This thesis investigated the relationship between demographic characteristics, changes in social support, and depression in Arab Muslim...
Show moreArab Muslim immigrant women encounter many stressors and are at risk for depression. Social supports from husbands, family and friends are generally considered mitigating resources for depression. However, changes in social support over time and the effects of such supports on depression at a future time period have not been fully addressed in the literature This thesis investigated the relationship between demographic characteristics, changes in social support, and depression in Arab Muslim immigrant women to the USA. A sample of 454 married Arab Muslim immigrant women provided demographic data, scores on social support variables and depression at three time periods approximately six months apart. Various statistical techniques at our disposal such as boxplots, response curves, descriptive statistics, ANOVA and ANCOVA, simple and multiple linear regressions have been used to see how various factors and variables are associated with changes in social support from husband, extended family and friend over time. Simple and multiple regression analyses are carried out to see if any variable observed at the time of first survey can be used to predict depression at a future time. Social support from husband and friend, husband's employment status and education, and depression at time one are found to be significantly associated with depression at time three. Finally, logistic regression analysis conducted for a binary depression outcome variable indicated that lower total social support and higher depression score of survey participants at the time of first survey increase their probability of being depressed at the time of third survey.
- Date Issued
- 2014
- Identifier
- CFE0005133, ucf:50676
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005133
- Title
- A Framework of Critical Success Factors for Business Organizations that Lead to Performance Excellence Based on a Financial and Quality Systems Assessment.
- Creator
-
Francisco, Melissa, Elshennawy, Ahmad, Karwowski, Waldemar, Rabelo, Luis, Xanthopoulos, Petros, Weheba, Gamal, University of Central Florida
- Abstract / Description
-
One of the most important tasks that business leaders undertake in order to achieve a superior market position is strategic planning. Beyond this obligation, business owners desire to maximize profit and maintain steady growth. To do so, resources must be invested as efficiently as possible in pursuit of performance excellence. Adjusting business operations quickly, however, especially in times of economic uncertainty, is extremely difficult. Business leaders therefore need insight into which elements of organizational improvement are most effective so they can strategically invest their resources to achieve superior performance as efficiently as possible. This research examines the results of companies with a demonstrated ability to achieve performance excellence as defined by the National Institute of Standards and Technology's Malcolm Baldrige Criteria for Performance Excellence. The research examined award-winning applications to determine common input factors, compared the business results of a subset of those award winners with the overall market over an 11-year time frame, and then investigated the profitability, liquidity, debt management, asset management, and per-share performance ratios of award winners compared with their industry peers over the same 11 years. The main focus of this research is to determine whether participation in performance excellence best practices has created value for shareholders and business owners. This objective is achieved through the analysis of the performance results of award-winning companies. The research demonstrates that the integration of efforts associated with performance excellence is in fact advantageous.
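The ratio comparison described above can take several statistical forms, and the abstract does not name one; the sketch below merely illustrates one plausible step, a Welch two-sample t-test comparing a single profitability ratio (return on assets) between award winners and industry peers on synthetic data. The group sizes, means, and spreads are assumptions made for the example.

# Hedged sketch, not the study's actual analysis: synthetic ROA samples for
# two groups over a hypothetical 11-year window, compared with a t-test.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
winners = rng.normal(0.08, 0.03, size=11 * 30)   # 30 hypothetical award winners
peers = rng.normal(0.06, 0.03, size=11 * 120)    # 120 hypothetical industry peers

t, p = ttest_ind(winners, peers, equal_var=False)  # Welch's two-sample t-test
print(f"winners mean ROA {winners.mean():.3f} vs peers {peers.mean():.3f}, p = {p:.3g}")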
- Date Issued
- 2014
- Identifier
- CFE0005331, ucf:50503
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005331
- Title
- The citrus industry and occupations in Florida.
- Creator
-
United States, Mead, Arthur Raymond, PALMM (Project)
- Abstract / Description
-
"Prepared to provide occupational information for youth in high schools, to a lesser degree in colleges and universities, and to out-of-school unemployed." -- Introduction. Gives descriptions of the specific jobs that are performed in the citrus industry, at all stages, from growing to packing or canning the fruit, and at various levels, from laborer to management. Includes statistics on production, trade, and characteristics of citrus industry workers, as well as a discussion of the citrus...
Show more"Prepared to provide occupational information for youth in high schools, to a lesser degree in colleges and universities, and to out-of-school unemployed." -- Introduction. Gives descriptions of the specific jobs that are performed in the citrus industry, at all stages, from growing to packing or canning the fruit, and at various levels, from laborer to management. Includes statistics on production, trade, and characteristics of citrus industry workers, as well as a discussion of the citrus market and excerpts from various documents relating to the industry. Original Date Field: 1938?
- Date Issued
- 1938
- Identifier
- 56815654, CF00001661, 2574528, ucf:24113
- Format
- E-book
- PURL
- http://purl.flvc.org/fcla/tc/fhp/CF00001661.jpg
- Title
- Sampling and Subspace Methods for Learning Sparse Group Structures in Computer Vision.
- Creator
-
Jaberi, Maryam, Foroosh, Hassan, Pensky, Marianna, Gong, Boqing, Qi, GuoJun, Pensky, Marianna, University of Central Florida
- Abstract / Description
-
The unprecedented growth of data in volume and dimension has led to an increased number of computationally demanding, data-driven decision-making methods in many disciplines, such as computer vision, genomics, and finance. Research on big data aims to understand and describe trends in massive volumes of high-dimensional data. High volume and dimension are the determining factors in both the computational and time complexity of algorithms. The challenge grows when the data are formed of the union of group-structures of different dimensions embedded in a high-dimensional ambient space. To address the problem of high volume, we propose a sampling method referred to as the Sparse Withdrawal of Inliers in a First Trial (SWIFT), which determines the smallest sample size, taken in one grab, such that all group-structures are adequately represented and discovered with high probability. The key features of SWIFT are: (i) sparsity, which is independent of the population size; (ii) no reliance on prior knowledge of the data distribution or the number of underlying group-structures; and (iii) robustness in the presence of an overwhelming number of outliers. We report a comprehensive study of the proposed sampling method in terms of accuracy, functionality, and effectiveness in reducing the computational cost in various applications of computer vision. In the second part of this dissertation, we study dimensionality reduction for multi-structural data. We propose a probabilistic subspace clustering method that unifies soft and hard clustering in a single framework. This is achieved by introducing a delayed association of uncertain points to subspaces of lower dimensions based on a confidence measure. Delayed association yields higher accuracy in clustering subspaces that have ambiguities, e.g., due to intersections and high levels of outliers/noise, and hence leads to a more accurate self-representation of the underlying subspaces. Altogether, this dissertation addresses the key theoretical and practical issues of size and dimension in big data analysis.
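The abstract does not spell out SWIFT's internals, but the one-grab sample-size question it poses can be stated concretely. The Python sketch below is an editor's illustration rather than the SWIFT algorithm itself: it uses Monte Carlo simulation to find the smallest single-draw sample size m such that every group-structure contributes at least s points with probability at least 1 - delta. The group proportions, outlier fraction, and thresholds are all assumed values.

# Hedged sketch of the one-grab sample-size question behind methods like SWIFT
# (not the SWIFT algorithm itself); all parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
props = np.array([0.05, 0.10, 0.15])       # assumed group-structure proportions
outliers = 1.0 - props.sum()               # remaining mass treated as outliers
s, delta, trials = 10, 0.01, 2000          # min hits per group, risk, MC runs

def success_prob(m):
    """Monte Carlo estimate of P(all groups receive >= s of m draws)."""
    counts = rng.multinomial(m, np.append(props, outliers), size=trials)
    return np.mean((counts[:, :-1] >= s).all(axis=1))

m = s
while success_prob(m) < 1.0 - delta:       # grow m until the guarantee holds
    m += 10
print("smallest one-grab sample size ~", m)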
- Date Issued
- 2018
- Identifier
- CFE0007017, ucf:52039
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007017
- Title
- University Students' Citizenship Shaped by Service-Learning, Community Service, and Peer-to-Peer Civic Discussions.
- Creator
-
Winston, Haley, Cintron Delgado, Rosa, Welch, Kerry, Malaret, Stacey, Bowdon, Melody, University of Central Florida
- Abstract / Description
-
Citizenship is often referred to as the forgotten outcome of colleges and universities. The present study examined the relationship between undergraduate students' perceived citizenship level and different types of civic experiences (service-learning, community service, and peer-to-peer civic discussions), as well as different demographic factors (gender, race/ethnicity, and parental level of education), at a public institution using the Personal and Social Responsibility Inventory. The study used structural equation modeling and multiple regression analysis, and it marks the first time these variables have been researched together. The study found significant correlations between citizenship level and both community service and peer-to-peer civic discussions, yet service-learning frequency was not found to be a significant factor. On the other hand, all three civic experiences together were found to be significantly correlated with citizenship aptitudes, leading the researcher to conclude that a holistic approach to student citizenship, both inside and outside the classroom, is valuable for student development. Also, only one significant relationship was found between citizenship levels and any demographic variable (a parental education level of doctorate or professional degree).
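As a small illustration of the first analysis step described above, the Python sketch below correlates each civic-experience frequency with a perceived-citizenship score on synthetic data. All variable names, scales, and effect sizes are hypothetical; the sketch shows only the shape of the correlation analysis, not the study's structural equation model.

# Illustrative sketch only: synthetic civic-experience frequencies (0-4 scales)
# correlated with a synthetic citizenship score.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
n = 400                                    # hypothetical sample size
experiences = {
    "service_learning": rng.integers(0, 5, n).astype(float),
    "community_service": rng.integers(0, 5, n).astype(float),
    "civic_discussions": rng.integers(0, 5, n).astype(float),
}
citizenship = (0.3 * experiences["community_service"]
               + 0.3 * experiences["civic_discussions"] + rng.normal(size=n))

for name, x in experiences.items():
    r, p = pearsonr(x, citizenship)        # strength and significance of each link
    print(f"{name}: r = {r:.2f}, p = {p:.3g}")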
- Date Issued
- 2017
- Identifier
- CFE0006927, ucf:51695
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006927