Current Search: Verification
- Title
- EXTENDING DISTRIBUTED TEMPORAL PROTOCOL LOGIC TO A PROOF BASED FRAMEWORK FOR AUTHENTICATION PROTOCOLS.
- Creator
-
Muhammad, Shahabuddin, Guha, Ratan, University of Central Florida
- Abstract / Description
-
Running critical applications, such as e-commerce, in a distributed environment requires assurance of the identities of the participants communicating with each other. Providing such assurance in a distributed environment is a difficult task. The goal of a security protocol is to overcome the vulnerabilities of a distributed environment by providing a secure way to disseminate critical information into the network. However, designing a security protocol is itself an error-prone process. In addition to employing an authentication protocol, one also needs to make sure that the protocol successfully achieves its authentication goals. The Distributed Temporal Protocol Logic (DTPL) provides a language for formalizing both local and global properties of distributed communicating processes. DTPL can be effectively applied to security protocol analysis as a model checker. Although a model checker can determine flaws in a security protocol, it cannot provide proof of the security properties of a protocol. In this research, we extend the DTPL language and construct a set of axioms by transforming the unified framework of SVO logic into DTPL. This results in a deductive-style, proof-based framework for the verification of authentication protocols. The proposed framework represents authentication protocols and concisely proves their security properties. We formalize various features essential for achieving authentication, such as message freshness, key association, and source association, in our framework. Since analyzing security protocols greatly depends upon associating a received message with its source, we separately analyze the source-association axioms, translate them into our framework, and extend the idea to public-key protocols. Developing a proof-based framework in temporal logic gives us another verification tool in addition to the existing model checker. A security property of a protocol can either be verified using our approach, or a design flaw can be identified using the model checker. In this way, we can analyze a security protocol from both perspectives while benefiting from the representation of distributed temporal protocol logic. A challenge-response strategy provides a higher level of abstraction for authentication protocols. Here, we also develop a set of formulae using the challenge-response strategy to analyze a protocol at an abstract level. This abstraction has been adapted from the authentication tests of the graph-theoretic strand space method. First, we represent a protocol in the logic and then use the challenge-response strategy to develop authentication tests. These tests help us find possible attacks on authentication protocols by investigating the originator of each received message. Identifying an unintended originator of a received message indicates the existence of possible flaws in the protocol. We have applied our strategy to several well-known protocols and have successfully identified the attacks.
- Date Issued
- 2007
- Identifier
- CFE0001799, ucf:47281
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001799
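The challenge-response abstraction described in the abstract above can be pictured with a schematic, outgoing-test-style implication. The notation below is purely illustrative (a generic temporal/epistemic rendering of a strand-space-style authentication test); it is not the dissertation's actual DTPL syntax or axioms.

\[
\mathrm{fresh}(N)\;\wedge\;\big(P\ \mathbf{sent}\ \{N\}_{K_Q}\big)\;\wedge\;\big(P\ \mathbf{received}\ \{N,\,Q\}_{K_P}\big)
\;\supset\;
\blacklozenge\,\big(Q\ \mathbf{sent}\ \{N,\,Q\}_{K_P}\big)
\]

Reading: if principal P issued a fresh nonce N protected by Q's key and later receives it back in a transformed form that only Q should be able to produce, then at some earlier time (past operator \(\blacklozenge\)) the intended peer Q originated that transformation. When such a formula cannot be established, the received message may have an unintended originator, which signals a possible attack.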
- Title
- The Development of Soil Compressibility Prediction Models and Application to Site Settlement.
- Creator
-
Kirts, Scott, Nam, Boo Hyun, Chopra, Manoj, Sallam, Amr, Xanthopoulos, Petros, University of Central Florida
- Abstract / Description
-
The magnitude of the overall settlement depends on several variables, such as the Compression Index, Cc, and the Recompression Index, Cr, which are determined by a consolidation test; however, the test is time-consuming and labor-intensive. Correlations have been developed to approximate these compressibility indexes. In this study, a data-driven approach has been employed in order to estimate Cc and Cr. Support Vector Machines classification is used to determine the number of distinct models to be developed. The statistical models are built through a forward-selection stepwise regression procedure. Ten variables were used, including the moisture content (w), initial void ratio (eo), dry unit weight (γdry), wet unit weight (γwet), automatic hammer SPT blow count (N), overburden stress (σ), fines content (-200), liquid limit (LL), plasticity index (PI), and specific gravity (Gs). The results confirm the need for separate models for three out of four soil types, these being Coarse Grained, Fine Grained, and Organic Peat. The models for each classification have varying degrees of accuracy. The correlations were tested through a series of field tests, settlement analyses, and comparison to known site settlement. The first analysis incorporates the developed correlations for Cr, and the second utilizes measured Cc and Cr for each soil layer. The predicted settlements from these two analyses were compared to measured settlement taken in close proximity. Upon conclusion of the analyses, the results indicate that settlement predictions applying a rule of thumb equating Cc to Cr, accounting for elastic settlement, and using a conventional influence zone of settlement compare more favorably with measured settlement than predictions using the measured compressibility indices. The accuracy of settlement predictions is contingent on a thorough field investigation.
- Date Issued
- 2018
- Identifier
- CFE0007208, ucf:52284
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007208
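The abstract above describes estimating Cc and Cr through a forward-selection stepwise regression over ten candidate soil variables. The Python sketch below shows that general procedure under stated assumptions: the column names, data file, and cross-validated stopping rule are hypothetical, and it does not reproduce the dissertation's models or coefficients.

```python
# Hedged sketch: forward-selection stepwise regression for the Compression
# Index (Cc). Column names, data, and the stopping rule are assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

CANDIDATES = ["w", "e0", "gamma_dry", "gamma_wet", "N_spt",
              "sigma_ov", "fines", "LL", "PI", "Gs"]

def forward_stepwise(df: pd.DataFrame, target: str = "Cc", tol: float = 1e-3):
    """Greedily add the predictor that most improves cross-validated R^2."""
    selected, remaining, best_score = [], list(CANDIDATES), -np.inf
    while remaining:
        scores = {var: cross_val_score(LinearRegression(),
                                       df[selected + [var]].values,
                                       df[target].values,
                                       cv=5, scoring="r2").mean()
                  for var in remaining}
        var, score = max(scores.items(), key=lambda kv: kv[1])
        if score - best_score < tol:      # no meaningful improvement: stop
            break
        selected.append(var)
        remaining.remove(var)
        best_score = score
    model = LinearRegression().fit(df[selected].values, df[target].values)
    return selected, model

# Usage, with a hypothetical CSV of consolidation-test results:
# df = pd.read_csv("consolidation_tests.csv")
# chosen, cc_model = forward_stepwise(df, target="Cc")
```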
- Title
- Correctness and Progress Verification of Non-Blocking Programs.
- Creator
-
Peterson, Christina, Dechev, Damian, Leavens, Gary, Bassiouni, Mostafa, Cash, Mason, University of Central Florida
- Abstract / Description
-
The progression of multi-core processors has inspired the development of concurrency libraries that guarantee safety and liveness properties of multiprocessor applications. The difficulty of reasoning about safety and liveness properties in a concurrent environment has led to the development of tools to verify that a concurrent data structure meets a correctness condition or progress guarantee. However, these tools possess shortcomings regarding the ability to verify a composition of data structure operations. Additionally, verification techniques for transactional memory evaluate correctness based on low-level read/write histories, which is not applicable to transactional data structures that use high-level semantic conflict detection.

In my dissertation, I present tools for checking the correctness of multiprocessor programs that overcome the limitations of previous correctness verification techniques. Correctness Condition Specification (CCSpec) is the first tool that automatically checks the correctness of a composition of concurrent multi-container operations performed in a non-atomic manner. Transactional Correctness tool for Abstract Data Types (TxC-ADT) is the first tool that can check the correctness of transactional data structures. TxC-ADT elevates the standard definitions of transactional correctness to be in terms of an abstract data type, an essential aspect for checking the correctness of transactions that synchronize only on high-level semantic conflicts. Many practical concurrent data structures, transactional data structures, and algorithms that facilitate non-blocking programming incorporate helping schemes to ensure that an operation comprising multiple atomic steps is completed according to the progress guarantee. The helping scheme introduces additional interference by the active threads in the system to achieve the designed progress guarantee. Previous progress verification techniques do not accommodate loops whose termination depends on complex behaviors of the interfering threads, making these approaches unsuitable. My dissertation presents the first progress verification technique for non-blocking algorithms that depend on descriptor-based helping mechanisms.
- Date Issued
- 2019
- Identifier
- CFE0007705, ucf:52433
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007705
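As a concrete picture of what checking a history against a correctness condition involves, here is a toy brute-force linearizability check. It only illustrates the underlying idea on a tiny FIFO-queue history; it is not CCSpec, TxC-ADT, or the dissertation's progress verification technique.

```python
# Toy linearizability check: does some total order of the completed operations
# respect real-time order and replay correctly against a sequential queue?
from itertools import permutations
from collections import deque

# Each event: (start_time, end_time, operation, argument, observed_result)
History = list[tuple[float, float, str, object, object]]

def sequentially_legal(order) -> bool:
    """Replay the operations against a sequential FIFO queue specification."""
    q = deque()
    for _, _, op, arg, result in order:
        if op == "enq":
            q.append(arg)
        elif op == "deq" and (q.popleft() if q else None) != result:
            return False
    return True

def linearizable(history: History) -> bool:
    """Try every total order that respects real-time (non-overlapping) order."""
    for order in permutations(history):
        respects_time = all(not (a[1] < b[0] and order.index(a) > order.index(b))
                            for a in history for b in history)
        if respects_time and sequentially_legal(order):
            return True
    return False

# T1 enqueues 1 over [0, 2]; T2's overlapping dequeue returns 1: linearizable.
print(linearizable([(0.0, 2.0, "enq", 1, None),
                    (1.0, 3.0, "deq", None, 1)]))   # True
```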
- Title
- Learning Hierarchical Representations for Video Analysis Using Deep Learning.
- Creator
-
Yang, Yang, Shah, Mubarak, Sukthankar, Gita, Da Vitoria Lobo, Niels, Stanley, Kenneth, Sukthankar, Rahul, University of Central Florida
- Abstract / Description
-
With the exponential growth of digital data, video content analysis (e.g., action and event recognition) has been drawing increasing attention from computer vision researchers. Effective modeling of the objects, scenes, and motions is critical for visual understanding. Recently there has been a growing interest in bio-inspired deep learning models, which have shown impressive results in speech and object recognition. Deep learning models are formed by the composition of multiple non-linear transformations of the data, with the goal of yielding more abstract and ultimately more useful representations. The advantages of deep models are threefold: 1) they learn features directly from the raw signal, in contrast to hand-designed features; 2) the learning can be unsupervised, which is suitable for large data where labeling all the data is expensive and impractical; 3) they learn a hierarchy of features one level at a time, and this layerwise stacking of feature extraction often yields better representations.

However, not many deep learning models have been proposed to solve the problems in video analysis, especially videos "in the wild". Most of them either deal with simple datasets or are limited to low-level local spatial-temporal feature descriptors for action recognition. Moreover, as the learning algorithms are unsupervised, the learned features preserve generative properties rather than the discriminative ones that are more favorable in classification tasks. In this context, the thesis makes two major contributions.

First, we propose several formulations and extensions of deep learning methods which learn hierarchical representations for three challenging video analysis tasks: complex event recognition, object detection in videos, and measuring action similarity. The proposed methods are extensively demonstrated on challenging state-of-the-art datasets. Besides learning low-level local features, higher-level representations are further designed to be learned in the context of the applications. Data-driven concept representations and sparse representations of the events are learned for complex event recognition; representations for object body parts and structures are learned for object detection in videos; and relational motion features and similarity metrics between video pairs are learned simultaneously for action verification.

Second, in order to learn discriminative and compact features, we propose a new feature learning method using a deep neural network based on autoencoders. It differs from existing unsupervised feature learning methods in two ways: first, it optimizes both discriminative and generative properties of the features simultaneously, which gives our features better discriminative ability; second, our learned features are more compact, whereas unsupervised feature learning methods usually learn a redundant set of over-complete features. Extensive experiments with quantitative and qualitative results on the tasks of human detection and action verification demonstrate the superiority of our proposed models.
- Date Issued
- 2013
- Identifier
- CFE0004964, ucf:49593
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004964
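The second contribution, jointly optimizing generative (reconstruction) and discriminative (classification) objectives in an autoencoder, can be sketched as follows. This PyTorch fragment is illustrative only: the architecture, layer sizes, and loss weighting are assumptions, not the thesis's actual model.

```python
# Hedged sketch of a discriminative autoencoder: one encoder feeds both a
# decoder (generative term) and a classifier (discriminative term).
import torch
import torch.nn as nn

class DiscriminativeAutoencoder(nn.Module):
    def __init__(self, in_dim=1024, code_dim=64, n_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, code_dim))
        self.decoder = nn.Sequential(nn.Linear(code_dim, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim))
        self.classifier = nn.Linear(code_dim, n_classes)

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), self.classifier(code)

def joint_loss(x, y, model, alpha=0.5):
    """alpha trades off the generative (MSE) and discriminative (CE) terms."""
    recon, logits = model(x)
    return (alpha * nn.functional.mse_loss(recon, x)
            + (1 - alpha) * nn.functional.cross_entropy(logits, y))

# One toy training step on random data:
model = DiscriminativeAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(32, 1024), torch.randint(0, 10, (32,))
opt.zero_grad(); joint_loss(x, y, model).backward(); opt.step()
```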
- Title
- CONNECTING SELF ENHANCEMENT AND SELF VERIFICATION MESSAGES IN FRIENDSHIPS.
- Creator
-
Bloch, Ann, Weger, Harry, University of Central Florida
- Abstract / Description
-
This study investigates the connection between self-enhancement and self-verification on one hand and confirmation and emotional support on the other. The hypotheses predicted a positive relationship between confirmation and both self-enhancement and self-verification: people feel good about themselves when confirmed by friends, and people feel that friends know them well when they are confirmed. The hypotheses also predicted a positive relationship between emotional support and both self-enhancement and self-verification: people feel good when friends provide emotional support, and people feel that friends know them well when provided emotional support. A research question was also posed: does family functioning have an effect on perceptions of self-enhancement and self-verification messages? To find the answers, a questionnaire was completed by 279 individuals. The results indicate two types of enhancement messages: a more specific and positive form of enhancement, and a more global (and negative) self-perception of rejection. The findings are interesting and unique to self-enhancement in communication research, which provides many avenues for continued research. Results also suggest that different elements of confirming communication influence perceptions of enhancement in different ways, and that emotional support predicts verification.
- Date Issued
- 2009
- Identifier
- CFE0002939, ucf:47975
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0002939
- Title
- Verification and Automated Synthesis of Memristor Crossbars.
- Creator
-
Pourtabatabaie, Arya, Jha, Sumit Kumar, Chatterjee, Mainak, Pattanaik, Sumanta, University of Central Florida
- Abstract / Description
-
The memristor is a newly synthesized circuit element correlating differences in electrical charge and magnetic flux, which effectively acts as a nonlinear resistor with memory. The small size of this element and its potential for passive state preservation have opened great opportunities for data-level parallel computation, since the functions of memory and processing can be realized on the same physical device.

In this research we present an in-depth study of memristor crossbars for combinational and sequential logic. We outline the structure of the formulas they are able to produce, and hence the inherent powers and limitations of memristive crossbar computing. As an improvement on previous methods of automated crossbar synthesis, a method for symbolically verifying crossbars is proposed, proved, and analyzed.
- Date Issued
- 2016
- Identifier
- CFE0006840, ucf:51765
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006840
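One common mental model for memristor crossbar logic is flow-based evaluation: a formula evaluates to true exactly when a conducting path exists between designated wordlines through crosspoints whose programmed literals are true. The Python toy below illustrates only that idea; the grid layout, literal encoding, and example formula are assumptions, not the synthesis or verification method developed in the dissertation.

```python
# Toy flow-based crossbar evaluation: each crosspoint holds a literal ('a',
# '!b') or a constant ('0'/'1'); the crossbar outputs TRUE iff a conducting
# path links the source row to the sink row through "on" crosspoints.
from collections import deque

def evaluate_crossbar(grid, assignment, source_row, sink_row):
    rows, cols = len(grid), len(grid[0])

    def on(lit):
        if lit in ("0", "1"):
            return lit == "1"
        return not assignment[lit[1:]] if lit.startswith("!") else assignment[lit]

    # BFS over the bipartite row/column graph induced by conducting devices.
    frontier, seen = deque([("r", source_row)]), {("r", source_row)}
    while frontier:
        kind, idx = frontier.popleft()
        if (kind, idx) == ("r", sink_row):
            return True
        nbrs = ([("c", j) for j in range(cols) if on(grid[idx][j])] if kind == "r"
                else [("r", i) for i in range(rows) if on(grid[i][idx])])
        for n in nbrs:
            if n not in seen:
                seen.add(n)
                frontier.append(n)
    return False

# A 2x2 crossbar wired so that row 0 reaches row 1 iff (a AND b):
grid = [["a", "0"],
        ["b", "0"]]
print(evaluate_crossbar(grid, {"a": True, "b": True}, 0, 1))   # True
print(evaluate_crossbar(grid, {"a": True, "b": False}, 0, 1))  # False
```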
- Title
- FORMALIZATION OF INPUT AND OUTPUT IN MODERN OPERATING SYSTEMS: THE HADLEY MODEL.
- Creator
-
Gerber, Matthew, Leeson, John, University of Central Florida
- Abstract / Description
-
We present the Hadley model, a formal descriptive model of input and output for modern computer operating systems. Our model is intentionally inspired by the Open Systems Interconnection model of networking; I/O as a process is defined as a set of translations between a set of computer-sensible forms, or layers, of information. To illustrate an initial application domain, we discuss the utility of the Hadley model and a potential associated I/O system as a tool for digital forensic investigators. To illustrate practical uses of the Hadley model, we present the Hadley Specification Language, an essentially functional language designed to allow the translations that comprise I/O to be written in a concise format that allows for relatively easy verifiability. To further illustrate the utility of the language, we present read/write Microsoft DOS FAT12 and read-only Linux ext2 file system specifications written in the new format. We prove the correctness of the read-only side of these descriptions. We present test results from operation of our HSL-driven system both in user mode on stored disk images and as part of a Linux kernel module that allows file systems to be read. We conclude by discussing future directions for the research.
- Date Issued
- 2005
- Identifier
- CFE0000392, ucf:46339
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000392
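The layered view of I/O described above, a chain of translations between computer-sensible forms of the same information, can be sketched as a composition of per-layer functions. The layer names and translations in this Python fragment are hypothetical; it is not the Hadley Specification Language.

```python
# Hedged sketch: a read path modeled as translations between layers
# (raw bytes -> sectors -> directory records -> file names). The record
# format is invented purely for illustration.
from functools import reduce

def bytes_to_sectors(raw: bytes, sector_size: int = 512):
    return [raw[i:i + sector_size] for i in range(0, len(raw), sector_size)]

def sectors_to_records(sectors):
    # Pretend each sector holds newline-separated "name,size" records.
    return [line for s in sectors
            for line in s.decode("ascii", "ignore").splitlines() if line]

def records_to_names(records):
    return [r.split(",")[0] for r in records]

READ_STACK = [bytes_to_sectors, sectors_to_records, records_to_names]

def read_path(raw: bytes):
    """Compose the per-layer translations, lowest layer first."""
    return reduce(lambda data, layer: layer(data), READ_STACK, raw)

print(read_path(b"README.TXT,1024\nKERNEL.IMG,65536\n"))
# ['README.TXT', 'KERNEL.IMG']
```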
- Title
- Advancing Practical Specification Techniques for Modern Software Systems.
- Creator
-
Singleton, John, Leavens, Gary, Jha, Sumit Kumar, Zou, Changchun, Hughes, Charles, Brennan, Joseph, University of Central Florida
- Abstract / Description
-
The pervasive nature of software (and the tendency for it to contain errors) has long been a concern of theoretical computer scientists. Many investigators have endeavored to produce theories, tools, and techniques for verifying the behavior of software systems. One of the most promising lines of research is that of formal specification, which is a subset of the larger field of formal methods. In formal specification, one composes a precise mathematical description of a software system and uses tools and techniques to ensure that the software that has been written conforms to this specification. Examples of such systems are Z notation, the Java Modeling Language, and many others. However, a fundamental problem that plagues this line of research is that the specifications themselves are often costly to produce and difficult to reuse. If the field of formal specification is to advance, we must develop sound techniques for reducing the cost of producing and reusing software specifications. The work presented in this dissertation lays out a path to producing sophisticated, automated tools for inferring specifications of large, complex code bases, tools for allowing engineers to share and reuse specifications, and specification languages for expressing information flow policies that can be written separately from program code.

This dissertation introduces three main lines of research. First, I discuss a system that facilitates the authoring, sharing, and reuse of software specifications. Next, I discuss a technique which aims to reduce the cost of producing specifications by automatically inferring them. Finally, I discuss a specification language called Evidently which aims to make information flow security policies easier to write, maintain, and enforce by untangling them from the code to which they are applied.
- Date Issued
- 2018
- Identifier
- CFE0007099, ucf:51953
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007099
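The second line of research, automatically inferring specifications, can be illustrated with a toy Daikon-style procedure: propose candidate postconditions and keep only those that no observed execution falsifies. The candidate pool, example function, and test inputs below are assumptions; this is not the dissertation's inference technique.

```python
# Toy specification inference: run a function over test inputs and retain the
# candidate postconditions that every observed execution satisfies.
CANDIDATES = {
    "result >= 0":         lambda args, r: r >= 0,
    "result >= max(args)": lambda args, r: r >= max(args),
    "result == sum(args)": lambda args, r: r == sum(args),
    "result <= sum(args)": lambda args, r: r <= sum(args),
}

def infer_postconditions(fn, test_inputs):
    surviving = dict(CANDIDATES)
    for args in test_inputs:
        result = fn(*args)
        for name, check in list(surviving.items()):
            if not check(args, result):
                del surviving[name]          # falsified by this execution
    return sorted(surviving)

def clamped_sum(*xs):
    return max(0, sum(xs))

print(infer_postconditions(clamped_sum, [(1, 2), (3, 4, 5), (-1, -2)]))
# ['result >= 0', 'result >= max(args)']
```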
- Title
- INCREMENTAL LIFECYCLE VALIDATION OF KNOWLEDGE-BASED SYSTEMS THROUGH COMMONKADS.
- Creator
-
Batarseh, Feras, Gonzalez, Avelino, University of Central Florida
- Abstract / Description
-
This dissertation introduces a novel validation method for knowledge-based systems (KBS). Validation is an essential phase in the development lifecycle of knowledge-based systems. Validation ensures that the system is valid and reliable, that it reflects the knowledge of the expert, and that it meets the specifications. Although many validation methods have been introduced for knowledge-based systems, there is still a need for an incremental validation method based on a lifecycle model. Lifecycle models provide a general framework for the developer and a mapping technique from the system into the validation process. They support reusability and modularity and offer guidelines for knowledge engineers to achieve high-quality systems. CommonKADS is a set of models that helps represent and analyze knowledge-based systems; it offers a de facto standard for building knowledge-based systems. Additionally, CommonKADS is a knowledge-representation-independent model with powerful models that can represent many domains. Defining an incremental validation method based on a conceptual lifecycle model (such as CommonKADS) has a number of advantages, such as reducing time and effort, ease of implementation when there is a template to follow, well-structured design, and better tracking of errors when they occur. Moreover, the validation method introduced in this dissertation is based on case testing and selecting an appropriate set of test cases to validate the system. The validation method makes use of the results of prior test cases in an incremental validation procedure. This facilitates defining a minimal set of test cases that provides complete and effective system coverage. CommonKADS does not define validation, verification, or testing in any of its models. This research seeks to establish a direct relation between validation and lifecycle models, and introduces a validation method for KBS embedded into CommonKADS.
- Date Issued
- 2011
- Identifier
- CFE0003621, ucf:48879
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0003621
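The idea of a minimal, incrementally selected set of test cases can be sketched as a greedy coverage loop: each new test case is chosen for the knowledge-base rules it exercises beyond those already covered by previously run cases. The rules, test cases, and coverage map below are hypothetical; this is only an illustration, not the dissertation's CommonKADS-based procedure.

```python
# Greedy incremental selection of test cases by marginal rule coverage.
def select_test_cases(rule_coverage: dict[str, set[str]], all_rules: set[str]):
    covered, selected = set(), []
    while covered != all_rules:
        best = max(rule_coverage, key=lambda t: len(rule_coverage[t] - covered))
        gained = rule_coverage[best] - covered
        if not gained:                       # remaining rules are unreachable
            break
        selected.append(best)
        covered |= gained
    return selected, covered

coverage = {"tc1": {"r1", "r2"}, "tc2": {"r2", "r3"}, "tc3": {"r4"}}
chosen, _ = select_test_cases(coverage, {"r1", "r2", "r3", "r4"})
print(chosen)   # ['tc1', 'tc2', 'tc3']
```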
- Title
- Reasoning Tradeoffs in Implicit Invocation and Aspect Oriented Languages.
- Creator
-
Sanchez Salazar, Jose, Leavens, Gary, Turgut, Damla, Jha, Sumit, Martin, Heath, University of Central Florida
- Abstract / Description
-
To reason about a program means to state or conclude, by logical means, some properties the program exhibits, such as its correctness with respect to certain expected behavior. The continuous need for more ambitious, more complex, and more dependable software systems demands better mechanisms to modularize them and reason about their correctness. The reasoning process is affected by the design decisions made by the developer of the program and by the features supported by the programming language used. Beyond object orientation, implicit invocation and aspect-oriented languages pose very hard reasoning challenges. Important tradeoffs must be considered while reasoning about a program: modular vs. non-modular reasoning, case-by-case analysis vs. abstraction, and explicitness vs. implicitness are some of them. By deciding a series of tradeoffs one can configure a reasoning scenario. For example, if one decides on modular reasoning and explicit invocation, a well-known object-oriented reasoning scenario can be used.

This dissertation identifies various important tradeoffs faced when reasoning about implicit invocation and aspect-oriented programs, characterizes scenarios derived from making choices regarding these tradeoffs, and provides sound proof rules for verification of programs covered by all these scenarios. Guidance for program developers and language designers is also given, so that reasoning about these types of programs becomes more tractable.
- Date Issued
- 2015
- Identifier
- CFE0005706, ucf:50133
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005706
- Title
- Security of Autonomous Systems under Physical Attacks: With application to Self-Driving Cars.
- Creator
-
Dutta, Raj, Jin, Yier, Sundaram, Kalpathy, DeMara, Ronald, Zhang, Shaojie, Zhang, Teng, University of Central Florida
- Abstract / Description
-
The drive to achieve trustworthy autonomous cyber-physical systems (CPS), which can attain goals independently in the presence of significant uncertainties and for long periods of time without any human intervention, has always been enticing. Significant progress has been made in the avenues of both software and hardware for fulfilling these objectives. However, technological challenges still exist, particularly in terms of decision making under uncertainty. In an autonomous system, uncertainties can arise from the operating environment, from adversarial attacks, and from within the system. As a result of these concerns, human beings lack trust in these systems and hesitate to use them for day-to-day tasks.

In this dissertation, we develop algorithms to enhance trust by mitigating physical attacks targeting the integrity and security of the sensing units of autonomous CPS. The sensors of these systems are responsible for gathering data about the physical processes. A lack of measures for securing their information can enable malicious attackers to cause life-threatening situations. This serves as a motivation for developing attack-resilient solutions.

Among various security solutions, attention has recently been paid to developing system-level countermeasures for CPS whose sensor measurements are corrupted by an attacker. Our methods are along this direction, as we develop one active and multiple passive algorithms to detect an attack and minimize its effect on the internal state estimates of the system. In the active approach, we leverage a challenge authentication technique for detection of two types of attacks: Denial of Service (DoS) and delay injection on active sensors of the system. Furthermore, we develop a recursive least-squares estimator for recovery of the system from attacks. The majority of the dissertation focuses on designing passive approaches for sensor attacks. In the first method, we focus on a linear stochastic system with multiple sensors, where measurements are fused in a central unit to estimate the state of the CPS. By leveraging a Bayesian interpretation of the Kalman filter and combining it with a Chi-Squared detector, we recursively estimate states within an error bound and detect DoS and False Data Injection attacks. We also analyze the asymptotic performance of the estimator and provide conditions for resilience of the state estimate.

Next, we propose a novel distributed estimator based on l1-norm optimization, which can recursively estimate states within an error bound without restricting the number of agents of the distributed system that can be compromised. We also extend this estimator to a vehicle platoon scenario that is subjected to sparse attacks. Furthermore, we analyze the resiliency and asymptotic properties of both estimators. Finally, we make an initial effort to formally verify the control system of the autonomous CPS using statistical model checking, to ensure that a real-time, resource-constrained system such as a self-driving car, with its controllers and security solutions, adheres to strict timing constraints.
- Date Issued
- 2018
- Identifier
- CFE0007174, ucf:52253
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007174
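The first passive method described above, a Kalman filter whose innovation is screened by a Chi-Squared detector, can be sketched on a scalar linear system. The dynamics, noise levels, detection threshold, and injected fault below are illustrative assumptions, not the dissertation's estimator or its resilience conditions.

```python
# Hedged sketch: Kalman filtering with a chi-squared test on the innovation
# to flag measurements inconsistent with the model (e.g. false data injection).
import numpy as np

A, C = 1.0, 1.0          # scalar dynamics x' = A x + w,  y = C x + v
Q, R = 0.01, 0.04        # process / measurement noise variances
CHI2_1DOF_99 = 6.63      # 99% threshold for a chi-squared test with 1 dof

def detect_attacks(measurements, x0=0.0, p0=1.0):
    x, p, flags = x0, p0, []
    for y in measurements:
        x_pred, p_pred = A * x, A * p * A + Q          # predict
        innov = y - C * x_pred                         # innovation
        s = C * p_pred * C + R                         # innovation variance
        flags.append(innov ** 2 / s > CHI2_1DOF_99)    # chi-squared detector
        if not flags[-1]:                              # update only if accepted
            k = p_pred * C / s
            x, p = x_pred + k * innov, (1 - k * C) * p_pred
        else:
            x, p = x_pred, p_pred
    return flags

rng = np.random.default_rng(0)
y_meas = rng.normal(0.0, np.sqrt(R), size=50)
y_meas[30] += 5.0                                      # injected false data
print([i for i, f in enumerate(detect_attacks(y_meas)) if f])   # roughly [30]
```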
- Title
- Techniques for automated parameter estimation in computational models of probabilistic systems.
- Creator
-
Hussain, Faraz, Jha, Sumit, Leavens, Gary, Turgut, Damla, Uddin, Nizam, University of Central Florida
- Abstract / Description
-
The main contribution of this dissertation is the design of two new algorithms for automatically synthesizing values of numerical parameters of computational models of complex stochastic systems such that the resultant model meets user-specified behavioral specifications. These algorithms are designed to operate on probabilistic systems: systems that, in general, behave differently under identical conditions. The algorithms work using an approach that combines formal verification and mathematical optimization to explore a model's parameter space.

The problem of determining whether a model instantiated with a given set of parameter values satisfies the desired specification is first defined using formal verification terminology, and then reformulated in terms of statistical hypothesis testing. Parameter space exploration involves determining the outcome of the hypothesis testing query for each parameter point and is guided using simulated annealing. The first algorithm uses the sequential probability ratio test (SPRT) to solve the hypothesis testing problems, whereas the second algorithm uses an approach based on Bayesian statistical model checking (BSMC).

The SPRT-based parameter synthesis algorithm was used to validate that a given model of glucose-insulin metabolism has the capability of representing diabetic behavior by synthesizing values of three parameters that ensure that the glucose-insulin subsystem spends at least 20 minutes in a diabetic scenario. The BSMC-based algorithm was used to discover the values of parameters in a physiological model of the acute inflammatory response that guarantee a set of desired clinical outcomes.

These two applications demonstrate how our algorithms use formal verification, statistical hypothesis testing, and mathematical optimization to automatically synthesize parameters of complex probabilistic models in order to meet user-specified behavioral properties.
- Date Issued
- 2016
- Identifier
- CFE0006117, ucf:51200
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0006117
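The SPRT-based synthesis loop, a sequential hypothesis test embedded in a simulated-annealing search over parameter values, can be sketched as follows. The toy simulator, property, and all tuning constants are assumptions; the dissertation applies this combination to glucose-insulin and acute-inflammation models rather than the stand-in used here.

```python
# Hedged sketch: Wald's SPRT decides whether a candidate parameter satisfies
# "property holds with probability >= theta"; simulated annealing proposes
# new candidates until the test accepts or the budget runs out.
import math, random

def simulate(param):
    """Toy stochastic model: one run satisfies the property w.p. ~= param."""
    return random.random() < min(1.0, max(0.0, param))

def sprt(param, theta=0.9, delta=0.05, alpha=0.01, beta=0.01, max_runs=2000):
    """Accept H1: p >= theta+delta against H0: p <= theta-delta."""
    p0, p1 = theta - delta, theta + delta
    low, high = math.log(beta / (1 - alpha)), math.log((1 - beta) / alpha)
    llr = 0.0
    for _ in range(max_runs):
        ok = simulate(param)
        llr += math.log(p1 / p0) if ok else math.log((1 - p1) / (1 - p0))
        if llr >= high:
            return True
        if llr <= low:
            return False
    return False                              # undecided: treat as not satisfied

def energy(param, runs=50):
    """Cheap annealing objective: fraction of runs violating the property."""
    return sum(not simulate(param) for _ in range(runs)) / runs

def synthesize(temp=1.0, cooling=0.95, steps=300):
    param, e = random.random(), 1.0
    for _ in range(steps):
        if sprt(param):                       # formal check of the candidate
            return param
        cand = min(1.0, max(0.0, param + random.gauss(0.0, 0.1)))
        e_cand = energy(cand)
        if e_cand < e or random.random() < math.exp((e - e_cand) / max(temp, 1e-9)):
            param, e = cand, e_cand           # Metropolis acceptance
        temp *= cooling
    return None

random.seed(1)
print(synthesize())   # typically a value near 0.95 or above; None if exhausted
```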
- Title
- VERIFICATION OF PILOT-SCALE IRON RELEASE MODELS.
- Creator
-
Glatthorn, Stephen, Taylor, James, University of Central Florida
- Abstract / Description
-
A model for the prediction of color release from a pilot distribution system was created in 2003 by Imran. This model allows prediction of the release of color from aged cast iron and galvanized steel pipes as a function of water quality and hydraulic residence time. Color was used as a surrogate measurement for iron, with which it exhibited a strong linear correlation. An anomaly of this model was the absence of a term to account for pH, due to the influent water being well stabilized. A new study was completed to evaluate the effectiveness of corrosion inhibitors against traditional adjustment. Two control lines were supplied with nearly the same water quality, one at a pH close to pHs and one at a pH well above pHs. The resulting data showed that effluent iron values were typically greater in the line with lower pH. The non-linear color model by Imran showed good agreement when the LSI was largely positive, but underpredicted the color release from the lower-LSI line. A modification to the Larson Ratio proposed by Imran was able to give reasonable agreement with the data at lower LSI values. LSI showed no definite relation to iron release, although a visual trend of higher LSI mitigating iron release can be seen. An iron flux model was also developed on the same pilot system by Mutoti. This model was based on a steady-state mass balance of iron in a pipe. The constants for the model were empirically derived from experiments at different hydraulic conditions with a constant water quality. Experiments were assumed to reach steady state at three pipe volumes due to the near-constant effluent turbidity achieved at this point. The model proposes that the iron flux under laminar flow conditions is constant, while under turbulent conditions the iron flux is linearly related to the Reynolds Number. This model incorporates the color release models developed by Imran to calculate flux values for different water qualities. A limited number of experiments were performed in the current study using desalinated and ground water sources at Reynolds Numbers ranging from 50 to 200. The results of these limited experiments showed that the iron flux for cast iron pipe was approximately one-half of the values predicted by Mutoti's model. This discrepancy may be caused by the more extensive flushing of the pipes performed in the current experiments, which allowed attainment of a true steady state. Model changes were proposed to distinguish between near-stagnant flow and the upper laminar region, with the upper laminar region showing a slight linear increase. Predictions using the galvanized flux model were not accurate due to an inferior color release model that was developed for galvanized pipes. The model exhibits a high dependence on sulfate concentrations, but sulfate concentrations in the current experiments were low. This led to low predicted flux values when the actual data showed otherwise. A new galvanized model was developed from a combination of data from the original and current experiments. The predicted flux values using the new model showed great improvement over the old model, but the new model's database was limited and the resulting model could not be independently tested.
- Date Issued
- 2007
- Identifier
- CFE0001704, ucf:47332
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0001704
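The structure of the flux model discussed above, a constant flux under laminar flow, a linear dependence on Reynolds Number under turbulent flow, and the proposed near-stagnant versus upper-laminar split, can be written as a small piecewise function. Every coefficient below is made up for illustration; the actual constants were fit to pilot-plant data and are not reproduced here.

```python
# Hedged numerical sketch of a piecewise iron-flux relationship in Reynolds
# number: constant when near stagnant, a slight linear rise in the upper
# laminar region, and a steeper linear rise under turbulent flow.
def iron_flux(re, base_flux=0.5, laminar_slope=5e-4,
              turbulent_slope=2e-3, re_stagnant=100, re_laminar=2100):
    """Return an illustrative iron flux (arbitrary units) for Reynolds number re."""
    if re <= re_stagnant:                    # near-stagnant: constant flux
        return base_flux
    if re <= re_laminar:                     # upper laminar: slight linear rise
        return base_flux + laminar_slope * (re - re_stagnant)
    laminar_end = base_flux + laminar_slope * (re_laminar - re_stagnant)
    return laminar_end + turbulent_slope * (re - re_laminar)   # turbulent

for re in (50, 200, 2100, 4000):
    print(re, round(iron_flux(re), 3))
```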
- Title
- Reasoning about Frame Properties in Object-Oriented Programs.
- Creator
-
Bao, Yuyan, Leavens, Gary, Dechev, Damian, Jha, Sumit Kumar, Cash, Mason, University of Central Florida
- Abstract / Description
-
Framing is important for specification and verification of object-oriented programs. This dissertation develops a local reasoning approach for framing in the presence of data structures with unrestricted sharing and subtyping. It can verify shared data structures specified in a concise way by unifying fine-grained region logic and separation logic. The fine-grained region logic is then extended to reason about subtyping.

First, fine-grained region logic is adapted from region logic to express regions at the granularity of individual fields. Conditional region expressions are introduced; not only does this allow one to specify more precise frame conditions, it also makes it possible to express the footprints of separation logic assertions.

Second, fine-grained region logic is generalized to a new logic, called unified fine-grained region logic, by allowing the logic to restrict the heap in which a program runs. This feature allows one to express specifications in separation logic.

Third, both fine-grained region logic and separation logic can be encoded into unified fine-grained region logic. This result allows the proof system to reason about programs specified in both styles.

Finally, fine-grained region logic is extended to reason about a programming language that is similar to Java. To reason about inheritance locally, a frame condition for behavioral subtyping is defined and proved sound.
- Date Issued
- 2017
- Identifier
- CFE0007279, ucf:52195
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007279
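The kind of local reasoning the dissertation unifies can be illustrated with the standard frame rule of separation logic together with a schematic region-logic-style counterpart. The separation-logic rule below is standard; the region-logic rendering is only schematic and is not the dissertation's unified fine-grained region logic.

\[
\frac{\{\,P\,\}\;C\;\{\,Q\,\}}{\{\,P \ast R\,\}\;C\;\{\,Q \ast R\,\}}
\qquad \text{(frame rule; no variable modified by } C \text{ occurs free in } R\text{)}
\]

A region-logic-style specification makes the footprint explicit instead: a command with a declared write effect preserves any predicate whose footprint is disjoint from that effect,
\[
\{\,P \wedge R\,\}\;C\;\{\,Q \wedge R\,\}
\qquad \text{provided } \mathit{writes}(C) \mathrel{\#} \mathit{footprint}(R),
\]
where \(\#\) denotes region disjointness.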