Software-defined networking (SDN) is an emerging technology for controlling flows through networks. Used in the context of industrial control systems, one objective is to design configurations with built-in protection against hardware failures, in the sense that the configuration has "baked-in" back-up routes. The aim is to leave the configuration static as long as possible, minimizing the need for the controller to push in new routing and filtering rules. We have designed and implemented a tool that determines the complete connectivity map from an analysis of all switch configurations in the network. We can use this tool to explore the impact of a link failure, in particular to determine whether the failure causes a flow to become undeliverable even after the built-in back-up routes are used. A measure of the original configuration's resilience to link failure is the mean number of link failures required to induce the first such loss of service. The computational cost of each link failure and the subsequent analysis is large, so there is much to be gained by reducing the overall cost of obtaining a statistically valid estimate of resiliency. This paper shows that when analysis of a network state can identify all as-yet-unfailed links whose individual failure would induce the loss of a flow, we can use the technique of importance sampling to estimate the mean number of links that must fail before some flow is lost, and we analyze the potential for reducing the variance of the sample statistic. We provide both theoretical and empirical evidence for significant variance reduction.
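The abstract omits the estimator's details, so the following is a minimal, hypothetical sketch of the kind of importance-sampling scheme it describes, assuming a toy graph model in which BFS reachability stands in for the full switch-configuration analysis; every name here (connected, critical_links, is_estimate, naive_estimate) is invented for illustration, not taken from the authors' tool.

import random

# Hypothetical sketch, not the authors' tool: BFS reachability over a
# toy undirected graph stands in for the full configuration analysis.

def connected(links, src, dst):
    # Reachability over the surviving links.
    frontier, seen = [src], {src}
    while frontier:
        u = frontier.pop()
        if u == dst:
            return True
        for a, b in links:
            v = b if a == u else a if b == u else None
            if v is not None and v not in seen:
                seen.add(v)
                frontier.append(v)
    return False

def critical_links(links, flows):
    # All surviving links whose individual failure would lose some flow.
    return {l for l in links
            if any(not connected(links - {l}, s, d) for s, d in flows)}

def is_estimate(links, flows, rng):
    # One importance-sampling path: failures are forced onto non-critical
    # links, and the exact per-step survival probabilities serve as
    # likelihood-ratio weights, so the path returns sum_k P(N > k)
    # evaluated along the sampled trajectory -- an unbiased estimate of
    # E[N], the mean number of failures until the first loss of a flow.
    alive = set(links)
    survival = 1.0   # running product of (1 - |critical| / |alive|)
    total = 0.0      # running sum of the P(N > k) terms
    while alive:
        total += survival
        safe = list(alive - critical_links(alive, flows))
        if not safe:                 # every remaining link is critical
            break
        survival *= len(safe) / len(alive)
        alive.discard(rng.choice(safe))
    return total

def naive_estimate(links, flows, rng):
    # Crude Monte Carlo baseline: fail uniformly chosen links until lost.
    alive, n = set(links), 0
    while alive:
        alive.discard(rng.choice(sorted(alive)))
        n += 1
        if any(not connected(alive, s, d) for s, d in flows):
            return n
    return n

if __name__ == "__main__":
    rng = random.Random(1)
    links = {(0, 1), (1, 2), (3, 4), (4, 5), (0, 3), (1, 4), (2, 5)}
    flows = [(0, 2), (3, 5)]   # each flow has one baked-in back-up path
    for name, est in [("importance sampling", is_estimate),
                      ("naive Monte Carlo", naive_estimate)]:
        xs = [est(links, flows, rng) for _ in range(500)]
        m = sum(xs) / len(xs)
        v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        print(f"{name}: mean ~ {m:.2f}, sample variance ~ {v:.2f}")

The sketch illustrates the mechanism named in the abstract: knowing the critical-link set at each step lets the sampler replace a randomly timed termination with an exact per-step loss probability, which is the source of the variance reduction.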
Trust is a necessary component of cybersecurity. It is a common task for a system to decide whether or not to trust the credential of an entity from another domain, issued by a third party. Generally, in cyberspace, connected and interacting systems rely heavily on each other with respect to security, privacy, and performance. In these interactions, one entity or system needs to trust others, and this "trust" frequently becomes a vulnerability of that system. Aiming to mitigate this vulnerability, we are developing a computational theory of trust as part of our efforts toward a Science of Security. Previously, we developed a formal-semantics-based calculus of trust [3, 2], in which trust can be calculated based on a trustor's direct observation of the trustee's performance, or based on a trust network. In this paper, we construct a framework for reasoning about trust based on observed evidence. We take privacy in cloud computing as a driving application case [5].
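As a purely illustrative example of evidence-based trust computation, the sketch below uses a standard beta-distribution estimate from counted positive and negative observations, plus naive multiplicative propagation along a trust-network path; this is a textbook construction, not necessarily the calculus of [3, 2], and all names are hypothetical.

from dataclasses import dataclass

# Illustrative only; not the formal calculus of [3, 2].

@dataclass
class Evidence:
    positive: int = 0   # observed compliant interactions with the trustee
    negative: int = 0   # observed violations

def direct_trust(ev: Evidence) -> float:
    # Beta-posterior mean (Laplace rule of succession): the expected
    # probability of good behavior given the direct observations.
    return (ev.positive + 1) / (ev.positive + ev.negative + 2)

def network_trust(hops):
    # Naive propagation along a trust-network path: multiply per-hop
    # trust values, so trust decays with each intermediary.
    t = 1.0
    for hop in hops:
        t *= hop
    return t

if __name__ == "__main__":
    provider = Evidence(positive=47, negative=3)   # e.g., a cloud provider
    print(f"direct trust  ~ {direct_trust(provider):.3f}")
    # Trusting a credential issued by a third party, reached through one
    # intermediary we trust at 0.9:
    print(f"network trust ~ {network_trust([0.9, direct_trust(provider)]):.3f}")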
The Symposium and Bootcamp on the Science of Security (HotSoS) is a research event centered on the Science of Security (SoS). Following a successful invitational SoS Community Meeting in December 2012, HotSoS 2014 was the first open research event in what we expect will be a continuing series of such events. The key motivation behind developing a Science of Security is to address the fundamental problems of cybersecurity in a principled manner. Security has been intensively studied, but much previous research emphasizes the engineering of specific solutions without first developing a scientific understanding of the problem domain. All too often, security research conveys the flavor of identifying specific threats and removing them in an apparently ad hoc manner. The motivation behind the nascent Science of Security is to understand how computing systems are architected, built, used, and maintained, with a view to understanding and addressing security challenges systematically across their life cycle. In particular, two features distinguish the Science of Security from previous research programs on cybersecurity.

Scope. The Science of Security considers not just computational artifacts but also incorporates the human, social, and organizational aspects of computing within its purview.

Approach. The Science of Security takes a decidedly scientific approach, based on the understanding of empirical evaluation and theoretical foundations as developed in the natural and social sciences, but adapted as appropriate for the "artificial science" (paraphrasing Herb Simon's term) that is computing.