Biblio

Filters: Keyword is security decisions
2021-04-08
Jin, R., He, X., Dai, H.  2019.  On the Security-Privacy Tradeoff in Collaborative Security: A Quantitative Information Flow Game Perspective. IEEE Transactions on Information Forensics and Security. 14:3273–3286.
To counter rapidly evolving cyber-attacks, numerous collaborative security schemes, in which multiple security entities exchange their observations and other relevant data to reach more effective security decisions, have been proposed in the literature. However, the security-related information shared among these entities may contain sensitive information, and such exchange can raise privacy concerns, especially when the entities belong to different organizations. With this in mind, the interplay between the attacker and the collaborative entities is formulated as Quantitative Information Flow (QIF) games, in which QIF theory is adapted to measure both the collaboration gain and the privacy loss of the entities in the information-sharing process. In particular, three games are considered, each corresponding to a scenario of practical interest. Based on the game-theoretic analysis, the expected behaviors of both the attacker and the security entities are obtained. Simulation results are presented to validate the analysis.
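For readers unfamiliar with QIF, the sketch below illustrates one standard QIF measure, min-entropy leakage, which quantifies how much an observed output reveals about a secret input. It is not the paper's game formulation; the prior and channel values are illustrative assumptions only.

```python
import numpy as np

def min_entropy_leakage(prior, channel):
    """Min-entropy leakage (in bits) of a channel C[x, y] = P(y | x) under
    prior pi(x): L = log2(V(X|Y)) - log2(V(X)), where V is vulnerability,
    i.e. the probability that an adversary guesses the secret in one try."""
    prior = np.asarray(prior, dtype=float)
    channel = np.asarray(channel, dtype=float)
    v_prior = prior.max()                 # best guess before observing Y
    joint = prior[:, None] * channel      # J[x, y] = pi(x) * P(y | x)
    v_post = joint.max(axis=0).sum()      # best guess after observing Y
    return np.log2(v_post) - np.log2(v_prior)

# Hypothetical example: a shared observation Y partially reveals a
# sensitive internal state X of a collaborating security entity.
prior = [0.5, 0.5]                        # two equally likely secret states
channel = [[0.9, 0.1],                    # P(Y | X): sharing is fairly revealing
           [0.2, 0.8]]
print(f"privacy loss: {min_entropy_leakage(prior, channel):.3f} bits")
```

Under these assumed numbers the leakage is about 0.77 bits; a noisier channel (less revealing sharing) would lower the privacy loss at the cost of collaboration gain, which is the tradeoff the paper studies.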
2019-06-17
Frey, Sylvain, Rashid, Awais, Anthonysamy, Pauline, Pinto-Albuquerque, Maria, Naqvi, Syed Asad.  2018.  The Good, the Bad and the Ugly: A Study of Security Decisions in a Cyber-Physical Systems Game. Proceedings of the 40th International Conference on Software Engineering. :496–496.

Motivation: The security of any system is a direct consequence of stakeholders' decisions regarding security requirements. Such decisions are taken with varying degrees of expertise, and little is currently understood about how various demographics - security experts, general computer scientists, managers - approach security decisions and the strategies that underpin those decisions. What are the typical decision patterns, the consequences of such patterns, and their impact on the security of the system in question? Nor is there any substantial understanding of how the strategies and decision patterns of these different groups contrast. Is security expertise necessarily an advantage when making security decisions in a given context? Answers to these questions are key to understanding the "how" and "why" behind security decision processes.

The Game: In this talk, we present a tabletop game, Decisions and Disruptions (D-D), that tasks a group of players with managing the security of a small utility company while facing a variety of threats. The game is kept short - 2 hours - and simple enough to be played without prior training. A cyber-physical infrastructure, depicted through a Lego® board, makes the game easy to understand and accessible to players from varying backgrounds and security expertise, without being too trivial a setting for security experts.

Key insights: We played D-D with 43 players divided into homogeneous groups: 4 groups of security experts, 4 groups of non-technical managers, and 4 groups of general computer scientists.

• Strategies: Security experts had a strong interest in advanced technological solutions and tended to neglect intelligence gathering, to their own detriment. Managers, too, were technology-driven and focused on data protection while neglecting human factors more than other groups. Computer scientists tended to balance human factors and intelligence gathering with technical solutions, and achieved the best results of the three demographics.

• Decision Processes: Technical experience significantly changes the way players think. Teams with little technical experience had shallow, intuition-driven discussions with few concrete arguments. Technical teams, and the most experienced in particular, had much richer debates, driven by concrete scenarios, anecdotes from experience, and procedural thinking. Security experts showed high confidence in their decisions - despite some of them having bad consequences - while the other groups tended to doubt their own skills - even when they were playing good games.

• Patterns: A number of characteristic plays were identified, some good (balance between priorities, open-mindedness, and adapting strategies based on inputs that challenge one's preconceptions), some bad (excessive focus on particular issues, confidence in charismatic leaders), some ugly ("tunnel vision" syndrome by over-confident players). These patterns are documented in the full paper - showing the virtue of the positive ones, discouraging the negative ones, and inviting the readers to do their own introspection.

Conclusion: Beyond the analysis of the security decisions of the three demographics, there is a definite educational and awareness-raising aspect to D-D (as noted consistently by players in all our subject groups). Game boxes will be brought to the conference for demonstration purposes, and the audience will be invited to experiment with D-D themselves, make their own decisions, and reflect on their own perception of security.

2014-09-26
Howe, A. E., Ray, I., Roberts, M., Urbanska, M., Byrne, Z.  2012.  The Psychology of Security for the Home Computer User. 2012 IEEE Symposium on Security and Privacy (SP). :209–223.

The home computer user is often said to be the weakest link in computer security. Users do not always follow security advice, and they take actions, as when falling for phishing, that compromise themselves. In general, we do not understand why users do not always behave safely, which would seem to be in their best interest. This paper reviews the literature of surveys and studies of factors that influence security decisions for home computer users. We organize the review into four sections: understanding of threats, perceptions of risky behavior, efforts to avoid security breaches, and attitudes toward security interventions. We find that these studies reveal many reasons why current security measures may not match the needs or abilities of home computer users, and we suggest future work needed to inform how security is delivered to this user group.