Operationalizing Contextual Integrity - October 2021

PI(s), Co-PI(s), Researchers:

  • Serge Egelman (ICSI)
  • Primal Wijesekera (ICSI)
  • Nathan Malkin (UCB)
  • Julia Bernd (ICSI)
  • Helen Nissenbaum (Cornell Tech)

HARD PROBLEM(S) ADDRESSED
Human Behavior, Metrics, Policy-Governed Secure Collaboration, and Scalability and Composability.

PUBLICATIONS

  • Nothing to report this quarter

KEY HIGHLIGHTS

  • We performed a study that collected people's perceptions of passive listening, their privacy preferences for it, their reactions to different modalities of permission requests, and their suggestions for other privacy controls. Based on the results, we created a set of recommendations for how privacy decisions should be presented to users of these and other future in-home data-capture devices. We are in the midst of editing two papers on this work and expect to have them published soon.
  • Our study of passive-listening devices used an interactive app-store experience, which provided a unique means of measuring consumer sentiment in a scenario modeling real life. Using both quantitative and qualitative analysis, we determined people's views of privacy models for always-listening voice assistants; these views ranged from outright rejection of the voice assistant described in our survey to preferring one model for its increased privacy protections. Only three participants (1.4%) believed that both of the models they were assigned offered sufficient privacy protections, indicating that neither model on its own is enough for most people to be comfortable with it. The results of this study demonstrate that, as a whole, people are generally concerned about the privacy protections, or lack thereof, offered by always-listening voice assistants. This holds true even though the models may be too simplistic or incomplete; users may take the view that something is better than nothing. Our findings also show that consumers do seek to make choices that protect their privacy when considering new technologies, as demonstrated by the number of apps they installed, a finding reinforced by explicitly asking participants about their considerations after browsing the store. Prevailing sentiments in our qualitative analysis reflect concerns about malicious third parties gaining access to sensitive data. We wrote up this work and submitted it to CHI, and are now awaiting reviews.

  • These studies led to Nathan Malkin's dissertation, which he successfully defended this quarter. Nathan is now a postdoc at UMD, working with Michelle Mazurek, and is continuing to edit the final two chapters into the publications described above.

COMMUNITY ENGAGEMENTS

  • Nothing to report this period

EDUCATIONAL ADVANCES:

  • This project forms Nathan Malkin's Ph.D. thesis, which was defended in August 2021.