Security of Autonomous Systems under Physical Attacks: With application to Self-Driving Cars
Abstract/Description:

The drive to achieve trustworthy autonomous cyber-physical systems (CPS), which can attain goals independently in the presence of significant uncertainties and over long periods of time without any human intervention, has always been enticing. Significant progress has been made in both software and hardware toward fulfilling these objectives. However, technological challenges remain, particularly in decision making under uncertainty. In an autonomous system, uncertainties can arise from the operating environment, from adversarial attacks, and from within the system itself. Because of these concerns, people lack trust in these systems and hesitate to rely on them for everyday use.

In this dissertation, we develop algorithms that enhance trust by mitigating physical attacks targeting the integrity and security of the sensing units of autonomous CPS. The sensors of these systems are responsible for gathering data about the physical processes; a lack of measures for securing their information can enable malicious attackers to cause life-threatening situations, which motivates the development of attack-resilient solutions.

Among various security solutions, attention has recently been paid to system-level countermeasures for CPS whose sensor measurements are corrupted by an attacker. Our methods follow this direction: we develop one active and several passive algorithms to detect attacks and minimize their effect on the internal state estimates of the system. In the active approach, we leverage a challenge-authentication technique to detect two types of attacks on the active sensors of the system, Denial of Service (DoS) and delay injection. We also develop a recursive least-squares estimator to recover the system from attacks.

The majority of the dissertation focuses on passive approaches to sensor attacks. In the first method, we consider a linear stochastic system with multiple sensors whose measurements are fused in a central unit to estimate the state of the CPS. By leveraging the Bayesian interpretation of the Kalman filter and combining it with a chi-squared detector, we recursively estimate states within an error bound and detect DoS and False Data Injection attacks. We also analyze the asymptotic performance of the estimator and provide conditions for resilience of the state estimate.

Next, we propose a novel distributed estimator based on l1-norm optimization, which can recursively estimate states within an error bound without restricting the number of agents of the distributed system that may be compromised. We also extend this estimator to a vehicle-platoon scenario subject to sparse attacks, and we analyze the resiliency and asymptotic properties of both estimators. Finally, we make an initial effort to formally verify the control system of the autonomous CPS using statistical model checking, to ensure that a real-time, resource-constrained system such as a self-driving car, equipped with controllers and security solutions, adheres to strict timing constraints.
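The recovery step of the active approach relies on a recursive least-squares estimator. The sketch below is a generic textbook RLS update with a forgetting factor, shown only to illustrate the kind of estimator meant; the class name, parameters, and forgetting factor are assumptions for illustration, not the dissertation's design.

```python
import numpy as np

class RecursiveLeastSquares:
    """Plain recursive least-squares estimator (a generic textbook form,
    assumed here as a stand-in for the recovery estimator in the abstract)."""
    def __init__(self, n, lam=0.99, delta=100.0):
        self.theta = np.zeros(n)      # parameter/state estimate
        self.P = delta * np.eye(n)    # inverse-correlation matrix
        self.lam = lam                # forgetting factor

    def update(self, phi, y):
        """Incorporate one measurement y = phi . theta + noise."""
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)    # gain vector
        err = y - phi @ self.theta            # a-priori residual
        self.theta = self.theta + k * err
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta
```

A caller would feed each regressor/measurement pair to `update()` once the active challenge check has cleared the sensor.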
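The first passive method pairs a Kalman filter with a chi-squared test on the innovation. The following is a minimal single-sensor sketch of that idea, assuming an illustrative constant-velocity model; the matrices, the threshold level `alpha`, and the skip-the-update mitigation are placeholders rather than the dissertation's actual scheme.

```python
import numpy as np
from scipy.stats import chi2

# Illustrative system (assumed): x_{k+1} = A x_k + w_k,  y_k = C x_k + v_k
A = np.array([[1.0, 0.1], [0.0, 1.0]])   # constant-velocity model
C = np.array([[1.0, 0.0]])               # position measurement only
Q = 0.01 * np.eye(2)                     # process-noise covariance
R = np.array([[0.04]])                   # measurement-noise covariance

def kalman_step_with_chi2(x, P, y, alpha=0.01):
    """One predict/update step; flags y as suspect if the normalized
    innovation exceeds the chi-squared threshold at level alpha."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Innovation and its covariance
    nu = y - C @ x_pred
    S = C @ P_pred @ C.T + R
    # Chi-squared statistic (degrees of freedom = measurement dimension)
    g = float(nu.T @ np.linalg.inv(S) @ nu)
    attacked = g > chi2.ppf(1.0 - alpha, df=C.shape[0])
    if attacked:
        # Simple mitigation: skip the corrupted measurement update
        return x_pred, P_pred, True
    # Standard Kalman update
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_new = x_pred + K @ nu
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new, False

# Example: a spoofed measurement far from the prediction is flagged.
x0, P0 = np.zeros(2), np.eye(2)
x1, P1, flagged = kalman_step_with_chi2(x0, P0, np.array([5.0]))
```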
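The distributed l1-norm estimator is illustrated here only by a much simpler centralized analogue: least-absolute-deviations state estimation, which tolerates sparse false-data injection because the l1 objective is insensitive to a few grossly corrupted measurements. The linear-programming formulation, matrix sizes, and attack magnitude below are illustrative assumptions, not the dissertation's distributed algorithm.

```python
import numpy as np
from scipy.optimize import linprog

def l1_state_estimate(H, y):
    """Least-absolute-deviations estimate: argmin_x ||y - H x||_1,
    cast as a linear program.  Sparse attack entries in y behave like
    outliers, so the l1 objective largely ignores them."""
    m, n = H.shape
    # Decision variables z = [x (n), t (m)], minimize sum(t)
    c = np.concatenate([np.zeros(n), np.ones(m)])
    # Constraints:  H x - t <= y   and   -H x - t <= -y   (i.e. |y - Hx| <= t)
    A_ub = np.block([[H, -np.eye(m)], [-H, -np.eye(m)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * n + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n]

# Example: 6 sensors observe a 2-dimensional state; one sensor is attacked.
rng = np.random.default_rng(0)
H = rng.standard_normal((6, 2))
x_true = np.array([1.0, -2.0])
y = H @ x_true
y[3] += 10.0                      # sparse false-data injection on sensor 3
print(l1_state_estimate(H, y))    # typically close to x_true despite the attack
```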
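Statistical model checking estimates the probability that a property holds by sampling simulated runs rather than exploring the full state space. A minimal sketch follows, assuming a Hoeffding-style sample-size bound and an invented latency model for a 10 ms control-loop deadline; it is not the tool chain or timing model used in the dissertation.

```python
import math
import random

def smc_probability(run_trial, eps=0.05, delta=0.01):
    """Statistical model checking by Monte Carlo: estimate the probability
    that a property holds to within +/- eps with confidence 1 - delta,
    using the Hoeffding sample-size bound.  run_trial() must return True
    iff one simulated run satisfies the property."""
    n = math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))
    successes = sum(run_trial() for _ in range(n))
    return successes / n

# Hypothetical timing property: one control-loop iteration (sense +
# estimate + actuate) finishes within a 10 ms deadline.  The latency
# distribution below is invented purely for illustration.
def control_loop_meets_deadline():
    latency_ms = random.gauss(6.0, 1.5) + random.expovariate(2.0)
    return latency_ms <= 10.0

print(smc_probability(control_loop_meets_deadline))
```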
| Field | Value |
|---|---|
| Title | Security of Autonomous Systems under Physical Attacks: With application to Self-Driving Cars |
| Name(s) | Dutta, Raj (Author); Jin, Yier (Committee Chair); Sundaram, Kalpathy (Committee CoChair); DeMara, Ronald (Committee Member); Zhang, Shaojie (Committee Member); Zhang, Teng (Committee Member); University of Central Florida (Degree Grantor) |
| Type of Resource | text |
| Date Issued | 2018 |
| Publisher | University of Central Florida |
| Language(s) | English |
| Identifier | CFE0007174 (IID), ucf:52253 (fedora) |
| Note(s) | 2018-08-01; Ph.D.; Engineering and Computer Science, Electrical Engineering and Computer Engineering; Doctoral; This record was generated from author submitted information. |
| Subject(s) | Cyber-Physical Systems Security -- Estimation Theory -- Modeling -- Formal Verification -- Distributed Systems -- Self-Driving Cars -- Optimization |
| Persistent Link to This Record | http://purl.flvc.org/ucf/fd/CFE0007174 |
| Restrictions on Access | public 2018-08-15 |
| Host Institution | UCF |