A Cross-Domain Comparable Measurement Framework to Quantify Intrusion Detection Effectiveness
Title | A Cross-Domain Comparable Measurement Framework to Quantify Intrusion Detection Effectiveness |
Publication Type | Conference Paper |
Year of Publication | 2016 |
Authors | Strasburg, Chris, Basu, Samik, Wong, Johnny |
Conference Name | Proceedings of the 11th Annual Cyber and Information Security Research Conference |
Publisher | ACM |
Conference Location | New York, NY, USA |
ISBN Number | 978-1-4503-3752-6 |
Keywords | Automation, composability, Formal Security Models, Intrusion Detection Systems, Measurement, Metrics, network intrusion detection, pubcrawl, Resiliency |
Abstract | As the frequency, severity, and sophistication of cyber attacks increase, along with our dependence on reliable computing infrastructure, the role of Intrusion Detection Systems (IDS) is gaining importance. One of the challenges in deploying an IDS stems from selecting a combination of detectors that are relevant and accurate for the environment where security is being considered. In this work, we propose a new measurement approach to address two key obstacles: the base-rate fallacy and the unit of analysis problem. Our key contribution is to utilize the notion of a 'signal', an indicator of an event that is observable to an IDS, as the measurement target, and to apply the multiple instance paradigm (from machine learning) to enable cross-comparable measures regardless of the unit of analysis. To support our approach, we present a detailed case study and provide empirical examples of the effectiveness of both the model and the measure by demonstrating the automated construction, optimization, and correlation of signals from different domains of observation (e.g., network based, host based, application based) and using different IDS techniques (signature based, anomaly based). |
URL | http://doi.acm.org/10.1145/2897795.2897816 |
DOI | 10.1145/2897795.2897816 |
Citation Key | strasburg_cross-domain_2016 |