Vulnerability and Resilience Prediction Models - July 2016

Public Audience
Purpose: To highlight project progress. Information is generally at a higher level that is accessible to the interested public. All information contained in the report (regions 1-3) is a Government Deliverable/CDRL.

PI(s):  Mladen Vouk, Laurie Williams
Researchers:  Donghoon Kim, Akond Rahman

HARD PROBLEM(S) ADDRESSED

  • Security Metrics and Models
  • Resilient Architectures
  • Scalability and Composability

Resilience of software to attacks is an open problem. Resilience depends on the science behind the approach used, as well as on our engineering abilities. The scope includes recognition of attacks through metrics and models we use to describe and identify software vulnerabilities, and the models we use to predict resilience to attacks in the field (Security Metrics and Models). It also depends on the software (and system) architecture(s) used (Resilient Architectures), and their scalability (Scalability and Composability). For example, if one has a number of highly attack-resilient components and appropriate attack sensors, is it possible to compose a resilient system from these parts, and how does that solution scale and age?

Additionally, vulnerability prediction models can be used to prioritize security-related validation and verification efforts toward the riskiest parts of a project.  Prior studies have shown how software metrics can be used to predict software defects. We draw inspiration from these studies and identify the possibility of applying data mining techniques to predict vulnerabilities.
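The idea above can be sketched in a few lines. This is an illustrative example only, not the project's actual model: the metric names (churn, complexity, fan_in), the file names, and the fixed weights are all hypothetical. In practice the weights would be learned from historical vulnerability data (e.g., by logistic regression), and the ranking would drive where V&V effort goes first.

```python
# Hypothetical sketch: rank source files for security review using
# software metrics, as a stand-in for a learned vulnerability predictor.

def risk_score(metrics, weights):
    """Weighted sum of normalized metrics as a simple risk proxy."""
    return sum(weights[k] * metrics.get(k, 0.0) for k in weights)

# Hypothetical per-file metrics, normalized to [0, 1].
files = {
    "auth/login.c": {"churn": 0.9, "complexity": 0.7, "fan_in": 0.8},
    "ui/render.c":  {"churn": 0.2, "complexity": 0.3, "fan_in": 0.1},
    "net/parser.c": {"churn": 0.6, "complexity": 0.9, "fan_in": 0.7},
}

# Fixed illustrative weights; a real model would learn these from
# past vulnerability data.
weights = {"churn": 0.5, "complexity": 0.3, "fan_in": 0.2}

# Files ordered from most to least risky; review the top of the list first.
ranked = sorted(files, key=lambda f: risk_score(files[f], weights),
                reverse=True)
print(ranked)
```

Running this ranks auth/login.c first, since it is high on every metric, so it would receive V&V attention before the others.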

PUBLICATIONS

ACCOMPLISHMENT HIGHLIGHTS

  • We show that well-understood data-flow, security-awareness, and provenance information models can be used to cost-effectively model and manage the resilience of cloud-based application networks. Depending on the security priorities (e.g., confidentiality vs. integrity vs. availability) of a system composed of a network of components, we construct a data-flow model of these interacting components, "learn" its normal operational profile using a provenance-collecting engine, and then identify vulnerable externally facing interfaces. This model then serves to position and implement appropriate countermeasures (e.g., access controls, reputation firewalls, and data integrity checks). The modeling and assessment platform (with GUIs) is based on an augmented version of Kepler (a scientific workflow modeling tool).
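The workflow in the highlight above can be illustrated with a minimal sketch. This is not the Kepler-based platform itself: the component names, the set of externally facing components, and the provenance records are hypothetical, and the "normal profile" here is just the set of data flows seen during normal operation.

```python
# Illustrative sketch: model components as a data-flow graph, learn a
# normal profile of observed flows from provenance records, then flag
# previously unseen flows that touch an externally facing interface.

from collections import Counter

# Hypothetical externally facing components (assumption for this sketch).
EXTERNAL = {"web_gateway"}

def learn_profile(provenance):
    """Count observed (source, destination) flows; the set of seen
    edges is taken as the normal operational profile."""
    return Counter(provenance)

def flag_anomalies(profile, new_flows):
    """Return flows never seen in the profile that involve an
    externally facing component -- candidates for countermeasures."""
    return [f for f in new_flows
            if f not in profile and (f[0] in EXTERNAL or f[1] in EXTERNAL)]

# Provenance collected during normal operation (hypothetical).
history = [("web_gateway", "app"), ("app", "db"), ("web_gateway", "app")]
profile = learn_profile(history)

# A direct external flow to the database was never observed before,
# so it is flagged; the internal app->db flow is part of the profile.
suspect = flag_anomalies(profile, [("web_gateway", "db"), ("app", "db")])
print(suspect)
```

In the real platform this role is played by the provenance-collecting engine and the augmented Kepler data-flow model; the flagged interfaces are where countermeasures such as access controls or data integrity checks would be positioned.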