VU SoS Lablet Quarterly Executive Summary - July 2019
A. Fundamental Research
The Science of Security for Cyber-Physical Systems (CPS) Lablet focuses on (1) Foundations of CPS Resilience, (2) Analytics for CPS Cybersecurity, (3) Development of a Multi-model Testbed for Simulation-based Evaluation of Resilience, and (4) Mixed Initiative and Collaborative Learning in Adversarial Environments.
- Load forecast systems play a fundamental role in the operation of power systems, because they deal with uncertainties in the system's components, e.g., renewable generation, electric vehicles, and smart appliances. However, untrusted data sources can introduce vulnerabilities into the system. In this work, we assess the vulnerabilities of load forecast systems and propose a defense mechanism to construct resilient forecasters. We model the strategic interaction between a defender and an attacker as a Stackelberg game, where the defender first chooses the prediction scheme and the attacker then selects its attack strategy. We evaluate our defense approach by training forecasters on data from an electric distribution system simulated in GridLAB-D.
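The leader-follower structure above can be sketched in a toy minimax computation. This is a minimal illustration, not the project's actual forecaster: the synthetic data, the ridge-regularized linear forecaster, and the bounded feature-perturbation attacker are all assumptions chosen so the defender's anticipation of the attacker's best response fits in a few lines.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy load data: forecast y from a single feature x (hypothetical stand-in
# for GridLAB-D simulation traces).
x = rng.uniform(0, 1, (200, 1))
y = 3.0 * x[:, 0] + rng.normal(0, 0.1, 200)

def fit_ridge(x, y, lam):
    """Least-squares forecaster with ridge penalty lam (the defender's choice)."""
    X = np.hstack([x, np.ones((len(x), 1))])
    return np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

def worst_case_error(w, x, y, eps):
    """Attacker's best response: shift the feature within [-eps, eps] to
    maximize squared forecast error (closed form for a linear model)."""
    X = np.hstack([x, np.ones((len(x), 1))])
    pred = X @ w
    return np.mean((np.abs(pred - y) + eps * np.abs(w[0])) ** 2)

# Stackelberg play: the defender (leader) commits to a prediction scheme
# anticipating the attacker's (follower's) best response.
candidates = [0.0, 0.1, 1.0, 10.0]
best = min(candidates,
           key=lambda lam: worst_case_error(fit_ridge(x, y, lam), x, y, eps=0.2))
print("defender's regularization choice:", best)
```

The design point is the order of play: the inner `worst_case_error` is solved for each candidate scheme before the defender commits, which is what distinguishes a Stackelberg defense from training against a fixed attack.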
- We undertook a pre-test on an independent text-as-data corpus extracted from the Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations to validate the method, and to verify the internal consistency of the data extraction and linkages that support metrics and generate output of the process from policy texts to text-as-data. This pre-test covers a "simple case," one less complex than the combination of NIST documents we use. Nonetheless, it helped identify steps to avoid for operational efficiency and provided an opportunity to explore the automation of data extraction and transformation.
- During the reporting period, our testbed efforts focused primarily on developing a fully integrated workflow targeting the smart grid CPS domain. This work had two major goals. (1) A complete set of prediction, attack, and detection models has been developed for load forecasting applications. (2) Several building blocks of these models - most notably for gradient-based deep neural network attacks - have been generalized to form the basis of a future library of reusable components for creating SoS experiments involving learning-enabled components.
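A gradient-based attack building block of the kind mentioned above can be sketched as follows. This is a toy fast-gradient-sign (FGSM-style) perturbation against a logistic-regression "detector" standing in for a deep network; the weights, input, and epsilon are all assumed values for illustration, not components of the actual library.

```python
import numpy as np

# Assumed trained detector weights (for illustration only).
w = np.array([2.0, -1.5])
b = 0.1

def predict(x):
    """Sigmoid confidence that x belongs to class 1."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

def fgsm(x, y, eps):
    """Perturb x by eps in the sign direction of the input gradient of the
    cross-entropy loss; for logistic regression that gradient is (p - y) * w."""
    p = predict(x)
    grad_x = (p - y) * w
    return x + eps * np.sign(grad_x)

x = np.array([0.8, -0.4])
y = 1.0                      # true label
x_adv = fgsm(x, y, eps=0.5)
print(predict(x), predict(x_adv))  # detector confidence drops after the attack
```

Because the perturbation depends only on the sign of the loss gradient with respect to the input, the same building block generalizes across model architectures, which is what makes it a reusable component.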
- Our primary work on learning in adversarial environments has concerned the brittleness of machine learning (especially deep learning) algorithms when used for intrusion detection or for the detection of Advanced Persistent Threats. We have addressed the following concerns: (1) Step Size Matters in Deep Learning: We showed that the choice of step size is critical in determining whether deep learning algorithms converge or fall into limit cycles. This has implications for their use in intrusion detection. (2) Cross-Entropy Loss and Low-Rank Features Have Responsibility for Adversarial Examples: State-of-the-art neural networks (especially those using cross-entropy loss) are vulnerable to adversarial examples; they can easily misclassify inputs that are imperceptibly different from their training and test data. We establish that the use of the cross-entropy loss function and the low-rank features of the training data are responsible for this misclassification.
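The convergence-versus-limit-cycle behavior in item (1) can be seen on the simplest possible objective. This toy sketch (our assumption of the setup, not the paper's analysis) runs gradient descent on f(x) = x^2: a small step size contracts toward the minimizer, while a step size at the stability boundary makes the iterates flip sign forever, i.e., a period-2 limit cycle.

```python
def gd(step, x0=1.0, iters=50):
    """Run gradient descent on f(x) = x**2; the gradient is 2*x."""
    x = x0
    for _ in range(iters):
        x = x - step * 2.0 * x
    return x

# Small step: the update is x <- (1 - 2*step) * x, a contraction for step < 1.
print(gd(0.4))   # shrinks toward 0

# Boundary step: x <- -x, so the iterate oscillates between +1 and -1.
print(gd(1.0))   # limit cycle: |x| never decreases
```

The same dichotomy, scaled up to nonconvex deep networks, is why step-size choice matters for whether a trained intrusion detector settles on a model at all.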
B. Community Engagement
- Our research was presented at HotSoS 2019 and at CPSWeek 2019.
C. Educational Advances
- We offered two summer camps for high-school students and one summer camp for teachers on CPS security based on Roboscape, a collaborative, networked robotics environment that makes key ideas in computer science accessible to groups of learners in informal learning spaces and K12 classrooms.
Groups:
- Architectures
- Modeling
- Resilient Systems
- Simulation
- Approved by NSA
- Human Behavior
- Metrics
- Policy-Governed Secure Collaboration
- Resilient Architectures
- VU
- Analytics for Cyber-Physical System Cybersecurity
- Foundations of CPS Resilience
- Mixed Initiative and Collaborative Learning in Adversarial Environments
- Multi-model Test Bed for the Simulation-based Evaluation of Resilience
- 2019: July