Policy Analytics for Cybersecurity of Cyber-Physical Systems: October 2022 (Y5, Q2)
Funding Type: Full proposal
Start Date: March 01, 2018
Expected Completion Date: April 30, 2023
Principal Investigator: Nazli Choucri
Accomplishments
Accomplishments during this reporting period, July 2022 - October 2022 (Year 5, Quarter 2), are summarized in this report.
Table of Contents
1. Coordinates of Design
2. Policy-Data for Cyber-Physical Systems
3. Anchors for Analytics Applications
4. Challenges - Unexpected
This Project focuses on Policy Analytics for Cybersecurity of Cyber-Physical Systems. The research design requires attention to policy, to cyber-physical systems, and to the connections between them, direct or indirect, as required.
- The hard problem is policy-governed secure collaboration.
- The purpose is to develop empirical methods to reduce barriers to implementation of cybersecurity policies.
- The focus is on analytics and applications.
- The approach is data-based and multimethod. The broader context is the diverse, complex, and dense ecosystem of cybersecurity-related policies and issuances for the US Department of Defense, as signaled in the previous Quarterly Report and earlier.
The Y5 - Q2 Report highlights core research practice and procedure. We report on our audit-of-analytics for the entire Project to this point, at each stage of process and all aspects of product.
The purpose is to provide transparency and full disclosure of process for all parts of the research design related to data compilation, computation, implementation, validation, and evaluation. We view the phases of this audit as imperatives.
By way of providing a familiar context for the audit-of-analytics in the Y5 - Q2 Report, we turn to Figure 1 below (shown in earlier Quarterly Reports) to show the alignment of the Hard Problems across the phases of the Project research design.
FIGURE 1: ALIGNING HARD PROBLEMS and RESEARCH PHASES
The top segment of Figure 1 shows the research design at a high level of aggregation. The bottom segment situates the hard problem of Policy-Governed Secure Collaboration along the top grey bar. The other grey bars align the other Hard Problems across different phases of the research design.
The audit-of-analytics system consists of four key segments, or imperatives, for the conduct of robust and replicable research as follows:
- Coordinates of Design
- Policy-Data for Cyber-Physical Systems
- Anchors for Analytics Applications
- Challenges - Unexpected
The Y5 - Q2 Report addresses each of these imperatives and their constituent elements.
1. Coordinates of Design
The coordinates of the Project research design consist of four general properties plus one that carries some particular dilemmas. Each property is framed in a comparative, binary idiom, that is, (a) vs. (b). The purpose here is to clarify the relative salience of (a) vs. (b).
1.1 Theory vs. Data
This Project is framed almost entirely by data. In fact, “data” itself is a central, if not core, focus. At the same time, we found it necessary, even essential, to focus on theory in two distinct ways, namely, (a) theory for database construction and (b) theory for analysis of the constructed database.
1.2 Data vs. Metrics
This Project addresses a range of challenges in the process of identifying and collecting the relevant “raw” policy materials in printed text. The challenge is to transform text data into an organized structure that can be converted into metrics (a minimal sketch follows the list below), while at the same time:
- Retaining the integrity of the source content, and
- Providing means for metric verification and validation.
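The following sketch illustrates one way such a transformation could preserve source integrity and support verification. It is a hypothetical illustration, not the Project's actual pipeline; the record fields, document identifier, and sample text are assumptions introduced here.

```python
# Minimal sketch: turning raw policy text into a structured record that
# (a) retains the verbatim source content and (b) carries a checksum so
# any metric derived later can be verified against its source.
# All field names and sample values are hypothetical.
import hashlib
from dataclasses import dataclass

@dataclass
class PolicyRecord:
    doc_id: str    # identifier of the source document
    section: str   # location of the text within that document
    text: str      # verbatim source content, retained unmodified
    checksum: str  # fingerprint used for later verification

def to_record(doc_id: str, section: str, text: str) -> PolicyRecord:
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    return PolicyRecord(doc_id, section, text, digest)

record = to_record("NIST 7628", "3.2", "Hypothetical guideline text.")
print(record.checksum[:12])  # metrics derived later can cite this fingerprint
```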
The Quarterly Reports cover in considerable detail both the logic and the practice at each phase of the Project.
1.3 Metrics vs. Methods
Metrics are based on the organized structure derived from the “raw data”. Effective use of metrics depends on methods, applications, replications, and validation, all on the assumption of metric accuracy. In this Project, we differentiate among methods by context, as follows:
- (i) Method in the research design,
- (ii) Method for collection of data,
- (iii) Method in generation of metrics, and
- (iv) Method for analysis of metrics.
Each of (i) – (iv) is based on different criteria.
1.4 Methods vs. Models
At this point, methods refer to the “tools” used to generate metrics (per the above). Once completed, metrics are basic inputs for the construction and use of models. Overall, this Project draws largely on four modes of method: (i) design structure matrix, (ii) inputs for network analysis, (iii) “by hand” deep-description linkages of content across distributed policy documents, and (iv) deep-description linkages of policy elements to properties of the cyber-physical system.
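To make modes (i) and (ii) concrete, the sketch below represents a design structure matrix as an adjacency matrix and converts it into a directed graph that standard network methods can analyze. The component names and dependencies are hypothetical placeholders, not entries from NIST 7628.

```python
# Minimal sketch, under the assumptions stated above: a design structure
# matrix (DSM) encoded as an adjacency matrix, then converted into a
# directed graph as input for network analysis.
import numpy as np
import networkx as nx

components = ["meter", "gateway", "controller", "historian"]  # hypothetical
# dsm[i][j] = 1 means component i depends on (or sends information to) j.
dsm = np.array([
    [0, 1, 0, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
    [0, 0, 0, 0],
])

graph = nx.from_numpy_array(dsm, create_using=nx.DiGraph)
graph = nx.relabel_nodes(graph, dict(enumerate(components)))

# Which components are most depended upon is one immediate network metric.
print(nx.in_degree_centrality(graph))
```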
1.5 Process vs. Products
The dilemma here is that, for some purposes, process and products can be merged, but for other purposes they must remain separate. This distinction was neither articulated nor anticipated in the research design; rather, it evolved as a pragmatic matter.
2. Policy-Data for Cyber-Physical Systems
The second audit imperative pertains to the text form of the “raw data” and identifies five operational properties that define the research initiative.
2.1 Policy System vs. Cyber-Physical System
The data alignment challenge is about establishing connections between (a) policy directives, (b) the cyber-physical system, and (c) the linkages between (a) and (b). Given that the challenge is to bring policy to bear on cyber-physical systems, we recognize that we have devoted considerably more attention to representing the properties of the cyber-physical system than to identifying and articulating the logic of policy. We recognized, much later in the Project than we should have, that we ought to have given considerably more attention to the complexity of the policy domain and considerably less to the cyber-physical system test case.
2.2 Conceptual vs. Empirical
A related feature of the second imperative is the distinction between the conceptual and the empirical sources of data, as reported in NIST 7628. The research challenge is to proceed with the understanding that the conceptual is created at the source and serves as “raw data” to represent the cyber-physical system. In this context, what is considered fundamental to system operations has already been defined by the NIST 7628 source data.
2.3 Empirics vs. Structure
The value-added at this point is defined by the successful construction of structure for the cyber-physical system, based on its properties framed in empirical terms.
2.4 Structure vs. Implementation
Implementation here refers to the value added, or net utility, of system structure for the overall research design. On the one hand, we have constructed data and metrics based on information in the NIST 7628 conceptual representation of the smart grid cyber-physical system. On the other hand, the full utility of such a representation is contingent on applications of direct policy interventions to the system itself. We have reported full results for one part of this logic, but not yet for the second.
2.5 Implementation vs. Validation
The implementation and validation in our research design carry a two-fold challenge:
- (i) implementation for the cyber-physical system and its logics, where validation can be achieved by effective application of methods to other independent databases; and
- (ii) implementation for policy and directives, where, by contrast, validation remains contingent on completion of the policy-to-cyber-physical-system linkages.
We have validated item (i) and cannot yet implement or validate item (ii).
3. Anchors for Analytics Applications
The third segment of the audit system focuses on a set of research components. These are termed “anchors” because they can be seen as “stand-alone” segments designed to support both the logic and the process of the overall Project.
3.1 System of Policies vs. System of Operations
The most pervasive feature of the Project research design is captured by the differences between (a) policies and directives, (b) cyber-physical operations, and (c) the linkage mechanisms that relate (a) and (b), to enable application, implementation, and validation. This issue was signaled repeatedly in previous Quarterly Reports. The implications of this difference for implementation and validation are addressed above.
3.2 Whole vs. Partial Design
Here we define “whole” as a stand-alone segment of the research design, and “partial” as a segment within a part of the “whole”. This distinction helps to identify and situate the functions of aggregation and disaggregation used in the research design.
3.3 Parsing "Pieces"
Connected to, but not identical with, the above is parsing, defined here in the context of the first audit imperative, namely the coordinates of design. This element is pervasive throughout and greatly assists researchers in identifying or determining key features of each coordinate's situation and articulation.
3.4 System(s) vs. Network(s)
While these terms are commonly seen as mirror images, identical, or overlapping, here we draw a distinction that is a defining feature of the overall Project and its design. As reported in earlier Quarterly Reports, network representations and analyses of the cyber-physical system test case provided results that help situate vulnerabilities, salience, and impacts.
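As an illustration of how a network representation can situate vulnerabilities, the sketch below ranks nodes by betweenness centrality, a standard measure of how many paths a node brokers. The graph is the same hypothetical toy system as above, not the Project's test case.

```python
# Minimal sketch, using an assumed toy graph: nodes that broker many
# paths are candidate chokepoints where a vulnerability would have
# system-wide impact.
import networkx as nx

graph = nx.DiGraph([
    ("meter", "gateway"), ("gateway", "controller"),
    ("controller", "historian"), ("controller", "meter"),
])

scores = nx.betweenness_centrality(graph)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```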
3.5 Metrics vs. Statistics
Early in this audit, we addressed the matter of metrics. Here we highlight our use of the common understanding of statistics as a method applied to metrics. The metrics signaled here represent the cyber-physical system test case, and the statistical analysis focuses on the system-wide properties of the test case.
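A small sketch of this division of labor, on the same hypothetical graph: node-level metrics (degrees) are computed first, then ordinary statistics summarize them into system-wide properties. The numbers carry no Project results.

```python
# Minimal sketch: statistics applied to metrics to characterize
# system-wide properties of an assumed toy graph.
import statistics
import networkx as nx

graph = nx.DiGraph([
    ("meter", "gateway"), ("gateway", "controller"),
    ("controller", "historian"), ("controller", "meter"),
])

degrees = [d for _, d in graph.degree()]  # node-level metric
print("mean degree:", statistics.mean(degrees))
print("degree spread (stdev):", statistics.stdev(degrees))
print("density:", nx.density(graph))  # a system-wide property
```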
3.6 Statics vs. Dynamics
The Project research design focuses on policies and systems as-is. While we recognize embedded dynamics, one of the cases we used to validate our network analysis demonstrates “feedback dynamics”, a hidden feature embedded in system properties and evident only with deep network analysis.
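One way such feedback can surface in a static representation is as directed cycles, which can be enumerated directly; the sketch below does so on the same hypothetical graph. This illustrates the idea only, not the validation case reported earlier.

```python
# Minimal sketch: feedback loops hidden in static structure appear as
# directed cycles of the assumed toy graph.
import networkx as nx

graph = nx.DiGraph([
    ("meter", "gateway"), ("gateway", "controller"),
    ("controller", "historian"), ("controller", "meter"),
])

for cycle in nx.simple_cycles(graph):
    print(" -> ".join(cycle))  # each cycle is a feedback loop
```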
4. Challenges - Unexpected
The last part of this audit-of-analytics focuses on unexpected issues or dilemmas that were not anticipated in the initial research design or in the first round of investigations. It turned out that these were actually defining features of the very “reality” that we set out to examine. For this reason, they became added challenges to the overall investigations.
4.1 Policy vs. Policy Ecosystem
Our initial focus on NIST 7628 took into account the wide range of system features included therein. As we dug deeper, it became clear that this document was relevant mostly for the system “as-is”, not for the overall analytics for cybersecurity policy of a cyber-physical system, the test case. The transition from policy to policy ecosystem was the result of an unexpected challenge that was essential to address. In earlier Quarterly Reports we reported the results of our work to build the policy ecosystem that captures all directives and guidelines required to connect cybersecurity policies to the cyber-physical test-case system.
4.2 Centralized vs. Distributed Directives
By necessity, a corollary of the above is the need to deal with, and manage, a rather vexing feature of the policy ecosystem. Guidelines and directives are distributed across different documents with no apparent or explicit form of linkage for integration. This means that a massive data compilation and analysis effort must be undertaken to identify policies and policy instruments, and an equally massive effort must be made to connect policy to target points in the cyber-physical system. That, too, was not planned for at the research design phase.
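To illustrate the integration problem, the sketch below extracts explicit cross-references from plain-text directives so that distributed documents can be linked into one graph. The citation pattern, document identifiers, and snippets are hypothetical assumptions; real directives cite each other far less uniformly.

```python
# Minimal sketch: harvesting cross-references between distributed
# directives as edges of a linkage graph. All inputs are hypothetical.
import re

CITATION = re.compile(r"\b(?:DoDI?\s+\d{4}\.\d{2}|NIST\s+SP\s+\d{3}-\d+)\b")

documents = {  # hypothetical snippets, not actual directive text
    "DoDI 8500.01": "Implements policy consistent with NIST SP 800-53.",
    "NIST SP 800-53": "Provides the control catalog.",
}

links = []
for doc_id, text in documents.items():
    for cited in CITATION.findall(text):
        if cited != doc_id:
            links.append((doc_id, cited))

print(links)  # edges of the directive-linkage graph
```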
4.3 Results vs. Reporting
Throughout the Project period, the practice has been to report accomplishments for each quarter. In this process, the distinction between reporting accomplishments and highlighting results is often blurred or difficult to make. This is especially the case when “accomplishments” are part of the process that leads to “results”. As noted earlier, accomplishments as reported are processes or products that are necessary to generate results.
4.4 "The Devil is in the Details"
As a data-based and method-driven Project, we rapidly encountered the experience of “drowning in data” in the policy ecosystem. The sheer volume of micro-level detail surrounding each individual guideline, for any one “actor” or discrete point in the cyber-physical system, is daunting. In the absence of an effective theoretical or conceptual logic, we resorted to “sorting out” by “deep description”.