Biblio

Filters: Keyword is Usable Security and Privacy
2022-08-01
Wiefling, Stephan, Tolsdorf, Jan, Iacono, Luigi Lo.  2021.  Privacy Considerations for Risk-Based Authentication Systems. 2021 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW). :320–327.
Risk-based authentication (RBA) extends authentication mechanisms to make them more robust against account takeover attacks, such as those using stolen passwords. RBA is recommended by NIST and NCSC to strengthen password-based authentication, and is already used by major online services. Also, users consider RBA to be more usable than two-factor authentication and just as secure. However, users currently obtain RBA’s high security and usability benefits at the cost of exposing potentially sensitive personal data (e.g., IP address or browser information). This conflicts with user privacy and requires considering user rights regarding the processing of personal data. We outline potential privacy challenges regarding different attacker models and propose improvements to balance privacy in RBA systems. To estimate the properties of the privacy-preserving RBA enhancements in practical environments, we evaluated a subset of them with long-term data from 780 users of a real-world online service. Our results show the potential to increase privacy in RBA solutions. However, this potential is limited to certain parameters, which should guide RBA design to protect privacy. We outline research directions that need to be considered to achieve widespread adoption of privacy-preserving RBA with high user acceptance.
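For readers unfamiliar with how RBA scores a login attempt, the following minimal sketch illustrates the general idea of comparing the current login's features (e.g., IP address, user agent) against a user's stored history. The feature weights, threshold, and class design are hypothetical illustrations, not the scoring model or privacy enhancements proposed in the paper; the sketch also makes the abstract's privacy point concrete, since it is exactly this retained IP and browser history that constitutes personal data.

```python
# Illustrative sketch of risk-based authentication (RBA) scoring.
# Feature weights, the threshold, and class design are hypothetical, not the paper's method.
from collections import Counter

class RiskScorer:
    def __init__(self):
        # Per-user history of previously seen login features (this is the personal data RBA retains).
        self.history = {"ip": Counter(), "user_agent": Counter()}
        self.logins = 0

    def record_login(self, ip, user_agent):
        self.history["ip"][ip] += 1
        self.history["user_agent"][user_agent] += 1
        self.logins += 1

    def risk(self, ip, user_agent):
        # Risk grows when the current features were rarely (or never) seen before.
        if self.logins == 0:
            return 1.0
        p_ip = self.history["ip"][ip] / self.logins
        p_ua = self.history["user_agent"][user_agent] / self.logins
        return 1.0 - (0.6 * p_ip + 0.4 * p_ua)

scorer = RiskScorer()
scorer.record_login("203.0.113.7", "Firefox/102")
scorer.record_login("203.0.113.7", "Firefox/102")

THRESHOLD = 0.5  # hypothetical cut-off for requesting additional verification
for attempt in [("203.0.113.7", "Firefox/102"), ("198.51.100.23", "curl/8.0")]:
    score = scorer.risk(*attempt)
    action = "allow" if score < THRESHOLD else "ask for additional verification"
    print(attempt, round(score, 2), action)
```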
2021-06-24
Gamagedara Arachchilage, Nalin Asanka, Hameed, Mumtaz Abdul.  2020.  Designing a Serious Game: Teaching Developers to Embed Privacy into Software Systems. 2020 35th IEEE/ACM International Conference on Automated Software Engineering Workshops (ASEW). :7–12.
Software applications continue to challenge user privacy when users interact with them. Privacy practices (e.g., Data Minimisation (DM), Privacy by Design (PbD), or the General Data Protection Regulation (GDPR)) and related “privacy engineering” methodologies exist and provide clear instructions for developers to embed privacy into the software systems they develop, so that user privacy is preserved. However, those practices and methodologies are not yet common practice in the software development community. There has been no previous research focused on developing “educational” interventions such as serious games to enhance software developers' coding behaviour. Therefore, this research proposes a game design framework as an educational tool for software developers to improve (secure) coding behaviour, so they can develop privacy-preserving software applications that people can use. The elements of the proposed framework were incorporated into a gaming application scenario that enhances the software developers' coding behaviour through their motivation. The proposed work not only enables the development of privacy-preserving software systems but also helps the software development community put privacy guidelines and engineering methodologies into practice.
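As a concrete picture of the data-minimisation practice the abstract mentions, the short sketch below keeps only the fields a feature actually needs before anything is stored. The field names and allow-list are hypothetical examples, not material from the paper or its game design framework.

```python
# Illustrative sketch of Data Minimisation (DM): store only the fields a feature needs.
# The field names and the allow-list are hypothetical examples.
REQUIRED_FIELDS = {"email", "display_name"}  # what the sign-up feature actually needs

def minimise(submitted_form: dict) -> dict:
    """Drop everything the application has no stated purpose for keeping."""
    return {k: v for k, v in submitted_form.items() if k in REQUIRED_FIELDS}

form = {
    "email": "alice@example.org",
    "display_name": "Alice",
    "date_of_birth": "1990-01-01",  # not needed for sign-up -> discarded
    "phone": "+352 123 456",        # not needed for sign-up -> discarded
}
print(minimise(form))  # {'email': 'alice@example.org', 'display_name': 'Alice'}
```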
2021-03-29
Distler, V., Lallemand, C., Koenig, V.  2020.  Making Encryption Feel Secure: Investigating how Descriptions of Encryption Impact Perceived Security. 2020 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW). :220–229.
When communication about security to end users is ineffective, people frequently misinterpret the protection offered by a system. The discrepancy between the security users perceive a system to have and the actual system state can lead to potentially risky behaviors. It is thus crucial to understand how security perceptions are shaped by interface elements such as text-based descriptions of encryption. This article addresses the question of how encryption should be described to non-experts in a way that enhances perceived security. We tested the following within-subject variables in an online experiment (N=309): a) how to best word encryption; b) whether encryption should be described with a focus on the process, the outcome, or both; c) whether the objective of encryption should be mentioned; d) when mentioning the objective of encryption, how to best describe it; and e) whether a hash should be displayed to the user. We also investigated the role of context (between subjects). The verbs "encrypt" and "secure" performed comparatively well at enhancing perceived security. Overall, participants stated that they felt more secure not knowing about the objective of encryption. When it is necessary to state the objective, positive wording of the objective of encryption worked best. We discuss the implications and why using these results to deliberately design for a perceived lack of security might also be of interest. This leads us to discuss ethical concerns, and we give guidelines for the design of user interfaces where encryption should be communicated to end users.
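One of the tested interface elements, displaying a hash to the user, can be pictured with the short sketch below. The paper does not prescribe an implementation; the use of SHA-256 and the truncated, grouped "fingerprint" format here are assumptions chosen purely for illustration.

```python
# Illustrative sketch of showing users a short "fingerprint" (hash) of an encryption key.
# SHA-256 and the grouping into blocks of four hex characters are illustrative choices.
import hashlib

def key_fingerprint(public_key_bytes: bytes, groups: int = 4) -> str:
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    chunks = [digest[i:i + 4] for i in range(0, groups * 4, 4)]
    return " ".join(chunks).upper()

print("This chat is encrypted. Fingerprint:", key_fingerprint(b"example public key"))
```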

Anell, S., Gröber, L., Krombholz, K.  2020.  End User and Expert Perceptions of Threats and Potential Countermeasures. 2020 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW). :230–239.
Experts often design security and privacy technology with specific use cases and threat models in mind. In practice, however, end users are not aware of these threats and potential countermeasures. Furthermore, misconceptions about the benefits and limitations of security and privacy technology inhibit large-scale adoption by end users. In this paper, we address this challenge and contribute a qualitative study on end users' and security experts' perceptions of threat models and potential countermeasures. We follow an inductive research approach to explore perceptions and mental models of both security experts and end users. We conducted semi-structured interviews with 8 security experts and 13 end users. Our results suggest that, in contrast to security experts, end users neglect acquaintances and friends as attackers in their threat models. Our findings highlight that experts value technical countermeasures, whereas end users try to implement trust-based defensive methods.

2020-04-13
Dechand, Sergej, Naiakshina, Alena, Danilova, Anastasia, Smith, Matthew.  2019.  In Encryption We Don’t Trust: The Effect of End-to-End Encryption to the Masses on User Perception. 2019 IEEE European Symposium on Security and Privacy (EuroS&P). :401–415.
With WhatsApp's adoption of the Signal Protocol as its default, end-to-end encryption by the masses happened almost overnight. Unlike iMessage, WhatsApp notifies users that encryption is enabled, explicitly informing users about improved privacy. This rare feature gives us an opportunity to study people's understandings and perceptions of secure messaging pre- and post-mass messenger encryption (pre/post-MME). To study changes in perceptions, we compared the results of two mental models studies: one conducted in 2015 pre-MME and one in 2017 post-MME. Our primary finding is that users do not trust encryption as currently offered. When asked about encryption in the study, most stated that they had heard of encryption, but only a few understood the implications, even at a high level. Their consensus view was that no technical solution exists to stop skilled attackers from getting their data. Even with a major development, such as WhatsApp rolling out end-to-end encryption, people still do not feel well protected by their technology. Surprisingly, despite WhatsApp's end-to-end security info messages and the high media attention, the majority of the participants were not even aware of encryption. Most participants had an almost correct threat model but did not believe that there is a technical solution to stop knowledgeable attackers from reading their messages. Using technology made them feel vulnerable.