Bibliography
We address the need for security requirements to take into account risks arising from the complex supply chains underpinning cyber-physical infrastructures such as industrial control systems (ICS). We present SEISMiC (SEcurity Industrial control SysteM supply Chains), a framework that covers the whole spectrum of security risks - from technical aspects through to human and organizational issues - across an ICS supply chain. We demonstrate the effectiveness of SEISMiC through a supply chain risk assessment of Natanz, the Iranian nuclear facility that was the target of the Stuxnet attack.
Motivation: The security of any system is a direct consequence of stakeholders' decisions regarding security requirements. Such decisions are taken with varying degrees of expertise, and little is currently understood about how different demographics - security experts, general computer scientists, managers - approach security decisions and the strategies that underpin those decisions. What are the typical decision patterns, what are the consequences of such patterns, and what is their impact on the security of the system in question? Nor is there any substantial understanding of how the strategies and decision patterns of these different groups contrast. Is security expertise necessarily an advantage when making security decisions in a given context? Answers to these questions are key to understanding the "how" and "why" behind security decision processes.

The Game: In this talk, we present a tabletop game, Decisions and Disruptions (D-D), that tasks a group of players with managing the security of a small utility company while facing a variety of threats. The game is kept short - 2 hours - and simple enough to be played without prior training. A cyber-physical infrastructure, depicted through a Lego® board, makes the game easy to understand and accessible to players from varying backgrounds and levels of security expertise, without being too trivial a setting for security experts.

Key insights: We played D-D with 43 players divided into homogeneous groups: 4 groups of security experts, 4 groups of non-technical managers and 4 groups of general computer scientists.

• Strategies: Security experts had a strong interest in advanced technological solutions and tended to neglect intelligence gathering, to their own detriment. Managers, too, were technology-driven and focused on data protection while neglecting human factors more than the other groups. Computer scientists tended to balance human factors and intelligence gathering with technical solutions, and achieved the best results of the three demographics.

• Decision Processes: Technical experience significantly changes the way players think. Teams with little technical experience had shallow, intuition-driven discussions with few concrete arguments. Technical teams, and the most experienced in particular, had much richer debates, driven by concrete scenarios, anecdotes from experience, and procedural thinking. Security experts showed high confidence in their decisions - despite some of those decisions having bad consequences - while the other groups tended to doubt their own skills, even when they were playing good games.

• Patterns: A number of characteristic plays were identified, some good (balance between priorities, open-mindedness, and adapting strategies based on inputs that challenge one's preconceptions), some bad (excessive focus on particular issues, confidence in charismatic leaders), and some ugly ("tunnel vision" by over-confident players). These patterns are documented in the full paper - showing the virtue of the positive ones, discouraging the negative ones, and inviting readers to do their own introspection.

Conclusion: Beyond the analysis of the security decisions of the three demographics, there is a definite educational and awareness-raising aspect to D-D (as noted consistently by players in all our subject groups). Game boxes will be brought to the conference for demonstration purposes, and the audience will be invited to experiment with D-D themselves, make their own decisions, and reflect on their own perception of security.
In this vision paper, we focus on a key aspect of modern software developers' potential to write secure software: their (lack of) success in securely using cryptography APIs. In particular, we note that most ongoing research focuses on identifying the concrete problems software developers experience and on providing workable solutions, but that such solutions still require developers to identify the appropriate API calls to make and, worse, to be familiar with and correctly configure the sometimes obscure parameters of those calls. In contrast, we envision identifying and employing targeted visual metaphors that allow developers to simply select the most appropriate cryptographic functionality they need.
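To make the hurdle concrete, the following sketch (our illustration, not an example taken from the paper) shows a routine symmetric-encryption task using the Python cryptography package. Even here, the key length, cipher mode, IV handling and padding block size are all low-level parameters the developer must know about and configure correctly, and nothing in the API signals that CBC on its own provides no integrity protection.

    # Illustrative only: a "simple" encrypt-a-message task still exposes several
    # obscure, easy-to-misconfigure parameters to the developer.
    import os
    from cryptography.hazmat.primitives import padding
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)                  # 32 bytes == AES-256; choosing the size is on the developer
    iv = os.urandom(16)                   # must be fresh per message; reuse weakens CBC
    padder = padding.PKCS7(128).padder()  # block size is given in bits, not bytes
    padded = padder.update(b"attack at dawn") + padder.finalize()

    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = encryptor.update(padded) + encryptor.finalize()
    # Nothing here warns that CBC without a MAC gives confidentiality only.

It is exactly this kind of incidental complexity that a well-chosen visual metaphor of the kind we envision would hide from the developer.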
Securing cyber-physical systems is hard. They are complex infrastructures comprising multiple technological artefacts, designers, operators and users. Existing research has established the security challenges in such systems, as well as the role of usable security in supporting humans in effective security decisions and actions. In this paper we focus on smart cyber-physical systems, such as those based on the Internet of Things (IoT). Such smart systems aim to intelligently automate a variety of functions, with the goal of hiding that complexity from the user. Furthermore, the user's interactions with such systems are more often implicit than explicit: for instance, a pedestrian with wearables walking through a smart city environment will most likely interact with that environment implicitly, through a variety of preferences inferred from previously provided or automatically collected data. The key question we explore is how to empower software engineers to pragmatically take into account how users make informed security choices about their data and information in such a pervasive environment. We discuss a range of existing frameworks that consider the impact of automation on user behaviour and argue for the need for a shift from usability to security ergonomics as a key requirement when designing and implementing security features in smart cyber-physical environments. Of course, these considerations apply more broadly than security, but in this paper we focus only on security as a key concern.
Inadvertent exposure of sensitive data is a major concern for potential cloud customers. Much attention has been paid to other data leakage vectors, such as side-channel attacks, while issues of data disposal and assured deletion have received comparatively little attention to date. However, data that is not properly destroyed may lead to unintended disclosures, in turn resulting in heavy financial penalties and reputational damage. In non-cloud contexts, the problems of incomplete deletion are well understood. To the best of our knowledge, however, there has been no systematic analysis of assured deletion challenges in public clouds to date. In this paper, we aim to address this gap by analysing assured deletion requirements for the cloud, identifying cloud features that pose a threat to assured deletion, and describing various assured deletion challenges. Based on this discussion, we identify open challenges for future research in this area and propose an initial assured deletion architecture for cloud settings. Altogether, our work offers a systematization of the requirements and challenges of assured deletion in the cloud, and a well-founded reference point for future research on new assured deletion solutions.
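As one concrete illustration of a building block such an architecture might use (our sketch, not a technique proposed in the paper), cryptographic erasure stores each object encrypted under its own key and implements deletion by destroying only that key, after which any lingering ciphertext copies in backups or replicas are computationally unrecoverable:

    # Minimal sketch of cryptographic erasure (hypothetical names; not the paper's architecture).
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    class CryptoEraseStore:
        def __init__(self):
            self._keys = {}   # object_id -> key (in practice held in a key manager / HSM)
            self._blobs = {}  # object_id -> (nonce, ciphertext) held by the cloud provider

        def put(self, object_id: str, data: bytes) -> None:
            key = AESGCM.generate_key(bit_length=256)
            nonce = os.urandom(12)
            self._keys[object_id] = key
            self._blobs[object_id] = (nonce, AESGCM(key).encrypt(nonce, data, None))

        def get(self, object_id: str) -> bytes:
            nonce, ct = self._blobs[object_id]
            return AESGCM(self._keys[object_id]).decrypt(nonce, ct, None)

        def assured_delete(self, object_id: str) -> None:
            # Destroying the key suffices; stray ciphertext replicas stay unreadable.
            del self._keys[object_id]

Even this simple sketch shows why assured deletion is hard in cloud settings: it merely shifts the problem to reliably destroying, and never leaking, the per-object keys.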
As cyber-physical systems (CPS) become prevalent in everyday life, it is critical to understand the factors that may impact the security of such systems. In this paper, we present insights from an initial study of historical security incidents, analysing such factors for a particular class of CPS: industrial control systems (ICS). Our study challenges the usual tendency to blame human fallibility or resort to simple explanations for what are often complex issues leading to a security incident. We highlight that (i) perception errors are key in such incidents, and (ii) latent design conditions – e.g., improper specifications of a system's borders and capabilities – play a fundamental role in shaping perceptions, leading to security issues. Such design-time considerations are particularly critical for ICS, whose life-cycle is usually measured in decades. Based on this analysis, we discuss how key characteristics of future smart CPS in such industrial settings can pose further challenges with regard to tackling latent design flaws.