Current Search: maximum likelihood
- Title
- DATA-TRUE CHARACTERIZATION OF NEURONAL MODELS.
- Creator
- Suarez, Jose, Behal, Aman, University of Central Florida
- Abstract / Description
- In this thesis, a weighted least squares approach is initially presented to estimate the parameters of an adaptive quadratic neuronal model. By casting the discontinuities in the state variables at the spiking instants as an impulse train driving the system dynamics, the neuronal output is represented as a linearly parameterized model that depends on filtered versions of the input current and the output voltage at the cell membrane. A prediction-error-based weighted least squares method is formulated for the model. This method allows for rapid estimation of model parameters under a persistently exciting input current injection. Simulation results show the feasibility of this approach to predict multiple neuronal firing patterns. Results of the method using data from a detailed ion-channel-based model revealed issues that served as the basis for the more robust resonate-and-fire model presented next. A second method is proposed to overcome some of the issues found in the adaptive quadratic model. The original quadratic model is replaced by a linear resonate-and-fire model with stochastic threshold that is both computationally efficient and suitable for larger network simulations. The parameter estimation method presented here consists of different stages in which the set of parameters is divided into two. The first set of parameters is assumed to represent the subthreshold dynamics of the model and is estimated using a nonlinear least squares algorithm, while the second set, associated with the threshold and reset parameters, is estimated using maximum likelihood formulations. The validity of the estimation method is then tested using detailed Hodgkin-Huxley model data as well as experimental voltage recordings from rat motoneurons. (A minimal numerical sketch of the weighted least squares step follows this record.)
- Date Issued
- 2011
- Identifier
- CFE0003917, ucf:48724
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003917
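The weighted least squares step described in the abstract above lends itself to a compact illustration. The sketch below is a generic prediction-error weighted least squares fit of a linearly parameterized model; the regressor contents (filtered input current and membrane voltage) and the uniform weights are stand-in assumptions, not the thesis's actual filters or weighting rule.

```python
import numpy as np

def weighted_ls(Phi, y, w):
    """Weighted least squares: minimize sum_k w[k] * (y[k] - Phi[k] @ theta)**2."""
    sw = np.sqrt(w)
    # Scaling rows by sqrt(w) turns the weighted problem into an ordinary
    # least squares problem solved stably by lstsq.
    theta, *_ = np.linalg.lstsq(Phi * sw[:, None], y * sw, rcond=None)
    return theta

# Synthetic stand-in data: in the thesis the columns of Phi would be filtered
# versions of the injected current and the membrane voltage; here they are random.
rng = np.random.default_rng(0)
theta_true = np.array([1.5, -0.7, 0.3])
Phi = rng.standard_normal((500, 3))
y = Phi @ theta_true + 0.05 * rng.standard_normal(500)
w = np.ones(500)   # illustrative uniform weights; a prediction-error scheme
                   # would instead update these from past residuals
print(weighted_ls(Phi, y, w))   # recovers approximately theta_true
```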
- Title
- UNCERTAINTY, IDENTIFICATION, AND PRIVACY: EXPERIMENTS IN INDIVIDUAL DECISION-MAKING.
- Creator
- Rivenbark, David, Harrison, Glenn, University of Central Florida
- Abstract / Description
- The alleged privacy paradox states that individuals report high values for personal privacy, while at the same time they report behavior that contradicts a high privacy value. This is a misconception. Reported privacy behaviors are explained by asymmetric subjective beliefs. Beliefs may or may not be uncertain, and non-neutral attitudes towards uncertainty are not necessary to explain behavior. This research was conducted in three related parts. Part one presents an experiment in individual decision making under uncertainty. Ellsberg's canonical two-color choice problem was used to estimate attitudes towards uncertainty. Subjects believed bets on the color of the ball drawn from Ellsberg's ambiguous urn were equally likely to pay. Estimated attitudes towards uncertainty were insignificant. Subjective expected utility explained subjects' choices better than uncertainty aversion and the uncertain priors model. A second treatment tested Vernon Smith's conjecture that preferences in Ellsberg's problem would be unchanged when the ambiguous lottery is replaced by a compound objective lottery. The use of an objective compound lottery to induce uncertainty did not affect subjects' choices. The second part of this dissertation extended the concept of uncertainty to commodities where quality and the accuracy of a quality report were potentially ambiguous. The uncertain priors model is naturally extended to allow for potentially different attitudes towards these two sources of uncertainty, quality and accuracy. As they relate to privacy, quality and accuracy of a quality report are seen as metaphors for online security and consumer trust in e-commerce, respectively. The results of parametric structural tests were mixed. Subjects made choices consistent with neutral attitudes towards uncertainty in both the quality and accuracy domains. However, allowing for uncertainty aversion in the quality domain and not the accuracy domain outperformed the alternative, which only allowed for uncertainty aversion in the accuracy domain. Finally, part three integrated a public-goods game and punishment opportunities with the Becker-DeGroot-Marschak mechanism to elicit privacy values, replicating previously reported privacy behaviors. The procedures developed elicited punishment (consequence) beliefs and information confidentiality beliefs in the context of individual privacy decisions. Three contributions are made to the literature. First, by using cash rewards as a mechanism to map actions to consequences, the study eliminated hypothetical bias as a confounding behavioral factor, which is pervasive in the privacy literature. Econometric results support the "privacy paradox" at levels greater than 10 percent. Second, the roles of asymmetric beliefs and attitudes towards uncertainty were identified using parametric structural likelihood methods. Subjects were, in general, uncertainty neutral and believed "bad" events were more likely to occur when their private information was not confidential. A third contribution is a partial test to determine which uncertain process, loss of privacy or the resolution of consequences, is of primary importance to individual decision-makers. Choices were consistent with uncertainty neutral preferences in both the privacy and consequences domains. (A simplified sketch of structural likelihood estimation follows this record.)
- Date Issued
- 2010
- Identifier
- CFE0003251, ucf:48539
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003251
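The parametric structural likelihood estimation mentioned in the abstract can be illustrated with a deliberately simplified sketch: a single subjective-belief parameter estimated by maximum likelihood from binary Ellsberg-style choices under an assumed logistic choice rule. The choice model, the fixed sensitivity beta, and all data below are hypothetical, not the dissertation's specification.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Hypothetical trials: each offers a bet on the ambiguous urn against an
# objective urn whose win probability q is known; choice == True means the
# subject took the ambiguous bet. Choices follow a logistic rule driven by
# the subjective win belief p (sensitivity beta held fixed for brevity).
q = rng.uniform(0.2, 0.8, size=200)
p_true, beta = 0.5, 8.0   # p = 0.5: ambiguous bets judged equally likely to pay
prob_amb = 1.0 / (1.0 + np.exp(-beta * (p_true - q)))
choice = rng.random(200) < prob_amb

def neg_log_lik(p):
    """Negative Bernoulli log-likelihood of the observed choices given belief p."""
    pr = np.clip(1.0 / (1.0 + np.exp(-beta * (p - q))), 1e-12, 1 - 1e-12)
    return -np.sum(np.where(choice, np.log(pr), np.log(1.0 - pr)))

res = minimize_scalar(neg_log_lik, bounds=(0.01, 0.99), method="bounded")
print(f"ML estimate of subjective belief p: {res.x:.3f}")
```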
- Title
- Estimation for the Cox Model with Various Types of Censored Data.
- Creator
- Riddlesworth, Tonya, Ren, Joan, Mohapatra, Ram, Richardson, Gary, Ni, Liqiang, Schott, James, University of Central Florida
- Abstract / Description
- In survival analysis, the Cox model is one of the most widely used tools. However, up to now there has not been any published work on the Cox model with complicated types of censored data, such as doubly censored data, partly interval-censored data, etc., while these types of censored data have been encountered in important medical studies of cancer, heart disease, diabetes, and other conditions. In this dissertation, we first derive the bivariate nonparametric maximum likelihood estimator (BNPMLE) F_n(t,z) for the joint distribution function F_0(t,z) of survival time T and covariate Z, where T is subject to right censoring, noting that such a BNPMLE F_n has not been studied in the statistical literature. Then, based on this BNPMLE F_n, we derive an empirical likelihood-based (Owen, 1988) confidence interval for the conditional survival probabilities, which is an important and difficult problem in statistical analysis that also has not been studied in the literature. Finally, with this BNPMLE F_n as a starting point, we extend the weighted empirical likelihood method (Ren, 2001 and 2008a) to the multivariate case and obtain a weighted empirical likelihood-based estimation method for the Cox model. This estimation method is given in a unified form and is applicable to the various types of censored data mentioned above. (A univariate sketch of the nonparametric MLE under right censoring follows this record.)
- Date Issued
- 2011
- Identifier
- CFE0004158, ucf:49051
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004158
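As a small companion to the abstract above, here is the univariate special case of the nonparametric MLE under right censoring, the Kaplan-Meier product-limit estimator; the bivariate estimator F_n(t,z) derived in the dissertation generalizes this idea to a joint distribution with a covariate. The data below are a hypothetical toy example.

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit estimator: the univariate nonparametric MLE of the
    survival function under right censoring. event[i] is 1 for an observed
    failure and 0 for a right-censored observation."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    steps, S = [], 1.0
    for t in np.unique(time[event == 1]):            # distinct failure times
        at_risk = np.sum(time >= t)                  # size of the risk set at t
        deaths = np.sum((time == t) & (event == 1))  # failures observed at t
        S *= 1.0 - deaths / at_risk
        steps.append((t, S))
    return steps

# Toy data: 1 = failure observed, 0 = censored.
print(kaplan_meier([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1]))
```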
- Title
- APPLICATION OF THE EMPIRICAL LIKELIHOOD METHOD IN PROPORTIONAL HAZARDS MODEL.
- Creator
- He, Bin, Ren, Jian-Jian, University of Central Florida
- Abstract / Description
- In survival analysis, the proportional hazards model is the most commonly used, and the Cox model is the most popular. These models were developed to facilitate statistical analyses frequently encountered in medical research and reliability studies. In analyzing real data sets, checking the validity of the model assumptions is a key component. However, the presence of complicated types of censoring, such as double censoring and partly interval-censoring, in survival data makes model assessment difficult, and the existing tests for goodness-of-fit do not extend directly to these complicated types of censored data. In this work, we use the empirical likelihood (Owen, 1988) approach to construct goodness-of-fit tests and provide estimates for the Cox model with various types of censored data. Specifically, the problems under consideration are the two-sample Cox model and the stratified Cox model with right-censored data, doubly censored data, and partly interval-censored data. Related computational issues are discussed, and some simulation results are presented. The procedures developed in this work are applied to several real data sets with some discussion. (A minimal sketch of the empirical likelihood computation follows this record.)
- Date Issued
- 2006
- Identifier
- CFE0001099, ucf:46780
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001099
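Since both of the preceding records build on the empirical likelihood of Owen (1988), a minimal sketch of the core computation may help: the profile empirical likelihood ratio for a univariate mean, solved by the standard Lagrange-multiplier root-finding. This is the textbook special case, not the censored-data constructions of the dissertations above.

```python
import numpy as np
from scipy.optimize import brentq

def el_test_stat(x, mu):
    """-2 log empirical likelihood ratio for H0: E[X] = mu (Owen, 1988).
    Profiling the weights gives p_i = 1 / (n (1 + lam (x_i - mu))), with lam
    solving sum_i (x_i - mu) / (1 + lam (x_i - mu)) = 0; the statistic is
    asymptotically chi-square with 1 degree of freedom."""
    z = np.asarray(x, float) - mu
    if not (z.min() < 0.0 < z.max()):
        return np.inf                    # mu outside the data's convex hull
    # lam is bracketed by the positivity constraints 1 + lam * z_i > 0.
    lo = -1.0 / z.max() * (1.0 - 1e-10)
    hi = -1.0 / z.min() * (1.0 - 1e-10)
    lam = brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100)
print(el_test_stat(x, mu=2.0))   # compare to the chi2(1) critical value 3.84
```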
- Title
- Recognition of Complex Events in Open-source Web-scale Videos: Features, Intermediate Representations and their Temporal Interactions.
- Creator
- Bhattacharya, Subhabrata, Shah, Mubarak, Guha, Ratan, Laviola II, Joseph, Sukthankar, Rahul, Moore, Brian, University of Central Florida
- Abstract / Description
- Recognition of complex events in consumer-uploaded Internet videos, captured under real-world settings, has emerged as a challenging area of research across both the computer vision and multimedia communities. In this dissertation, we present a systematic decomposition of complex events into hierarchical components, make an in-depth analysis of how existing research is being used to cater to various levels of this hierarchy, and identify three key stages where we make novel contributions, keeping complex events in focus. These are listed as follows: (a) Extraction of novel semi-global features -- first, we introduce a Lie-algebra-based representation of the dominant camera motion present while capturing videos and show how this can be used as a complementary feature for video analysis. Second, we propose compact clip-level descriptors of a video based on the covariance of appearance and motion features, which we further use in a sparse coding framework to recognize realistic actions and gestures. (b) Construction of intermediate representations -- we propose an efficient probabilistic representation built from low-level features computed from videos, based on maximum likelihood estimates, which demonstrates state-of-the-art performance in large-scale visual concept detection. Finally, (c) Modeling temporal interactions between intermediate concepts -- using block Hankel matrices and harmonic analysis of slowly evolving linear dynamical systems, we propose two new discriminative feature spaces for complex event recognition and demonstrate significantly improved recognition rates over previously proposed approaches. (A minimal sketch of the block Hankel construction follows this record.)
- Date Issued
- 2013
- Identifier
- CFE0004817, ucf:49724
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004817
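The block Hankel construction named in contribution (c) is easy to sketch: stack time-shifted copies of a multivariate time series (e.g., per-frame concept detector scores) into a block Hankel matrix and summarize the slowly evolving linear dynamics, here by its leading singular values. The clip length, the number of concepts, and the use of singular values as the descriptor are illustrative assumptions, not the dissertation's exact feature space.

```python
import numpy as np

def block_hankel(Y, k):
    """Build a block Hankel matrix with k block rows from a time series
    Y of shape (T, d), e.g., d concept scores over T frames. Column j
    holds the stacked observations Y[j], Y[j+1], ..., Y[j+k-1]."""
    T, d = Y.shape
    cols = T - k + 1
    H = np.empty((k * d, cols))
    for i in range(k):
        H[i * d:(i + 1) * d, :] = Y[i:i + cols].T
    return H

# Hypothetical clip: 100 frames scored by 20 concept detectors.
rng = np.random.default_rng(1)
Y = rng.standard_normal((100, 20))
H = block_hankel(Y, k=5)
descriptor = np.linalg.svd(H, compute_uv=False)[:10]  # top singular values
print(descriptor)
```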