Operationalizing Contextual Integrity - October 2022

PI(s), Co-PI(s), Researchers:

  • Serge Egelman (ICSI)
  • Primal Wijesekera (ICSI)
  • Julia Bernd (ICSI)
  • Helen Nissenbaum (Cornell Tech)

HARD PROBLEM(S) ADDRESSED
Human Behavior, Metrics, Policy-Governed Secure Collaboration, and Scalability and Composability.

PUBLICATIONS

  • Presented:
    Nathan Malkin, David Wagner, and Serge Egelman. Runtime Permissions for Privacy in Proactive Intelligent Assistants. In Proceedings of the 18th Symposium on Usable Privacy and Security (SOUPS '22). USENIX Assoc., Berkeley, CA, USA. 2022.
  • Presented:
    Nathan Malkin, David Wagner, and Serge Egelman. 2022. Can Humans Detect Malicious Always-Listening
    Assistants? A Framework for Crowdsourcing Test Drives. Proc. ACM Hum.-Comput. Interact. 6, CSCW2,
    Article 500 (November 2022), 44 pages.
  • Presented:
    Julia Bernd, Ruba Abu-Salma, Junghyun Choy, and Alisa Frik. Balancing Power Dynamics in Smart Homes: Nannies' Perspectives on How Cameras Reflect and Affect Relationships. In Proceedings of the 18th Symposium on Usable Privacy and Security (SOUPS '22). USENIX Assoc., Berkeley, CA, USA. 2022.
  • Accepted:
    C. Gilsenan, F. Shakir, N. Alomar, and S. Egelman. Security and Privacy Failures in Popular 2FA Apps. In Proceedings of the 2023 USENIX Security Symposium.
  • Submitted:
    Nikita Samarin, Shayna Kothari, Zaina Siyed, Oscar Bjorkman, Reena Yuan, Primal Wijesekera, Noura Alomar, Jordan Fischer, Chris Hoofnagle, and Serge Egelman. Measuring the Compliance of Android App Developers with the California Consumer Privacy Act (CCPA). Submitted to PETS 2023.

KEY HIGHLIGHTS

  • We have been collecting data from ~30 ad networks to analyze bid data under different contexts, to examine whether privacy controls are functioning as expected. So far, our experiments have focused on whether pricing changes as a function of "opt-out" flags (which would normally be sent by apps). We are conducting several related experiments and expect to submit a publication this winter.
  • Our team presented two papers at the Eighteenth Symposium on Usable Privacy and Security (SOUPS 2022). SOUPS brings together an interdisciplinary group of researchers and practitioners in human-computer interaction, security, and privacy.

    • Balancing Power Dynamics in Smart Homes: Nannies' Perspectives on How Cameras Reflect and Affect Relationships. Abstract: Smart home cameras raise privacy concerns in part because they frequently collect data not only about the primary users who deployed them but also other parties, who may be targets of intentional surveillance or incidental bystanders. Domestic employees working in smart homes must navigate a complex situation that blends privacy and social norms for homes, workplaces, and caregiving. This paper presents findings from 25 semi-structured interviews with domestic childcare workers in the U.S. about smart home cameras, focusing on how privacy considerations interact with the dynamics of their employer-employee relationships. We show how participants' views on camera data collection, and their desire and ability to set conditions on data use and sharing, were affected by power differentials and norms about who should control information flows in a given context. Participants' attitudes about employers' cameras often hinged on how employers used the data; whether participants viewed camera use as likely to reinforce negative tendencies in the employer-employee relationship; and how camera use and disclosure might reflect existing relationship tendencies. We also suggest technical and social interventions to mitigate the adverse effects of power imbalances on domestic employees' privacy and individual agency.

    • Runtime Permissions for Privacy in Proactive Intelligent Assistants. Abstract: Intelligent voice assistants may soon become proactive, offering suggestions without being directly invoked. Such behavior increases privacy risks since proactive operation requires continuous monitoring of conversations. To mitigate this problem, our study proposes and evaluates one potential privacy control, in which the assistant requests permission for the information it wishes to use immediately after hearing it. To find out how people would react to runtime permission requests, we recruited 23 pairs of participants to hold conversations while receiving ambient suggestions from a proactive assistant, which we simulated in real time using the Wizard of Oz technique. The interactive sessions featured different modes and designs of runtime permission requests and were followed by in-depth interviews about people's preferences and concerns. Most participants were excited about the devices despite their continuous listening but wanted control over the assistant's actions and their own data. They generally prioritized an interruption-free experience above more fine-grained control over what the device would hear.

  • We are putting the final touches on our paper that will be presented in November at the 25th ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW). CSCW brings together top researchers and practitioners to explore the technical, social, material, and theoretical challenges of designing technology to support collaborative work and life activities.

    • Can Humans Detect Malicious Always-Listening Assistants? A Framework for Crowdsourcing Test Drives. Abstract: Intelligent voice assistants are growing in popularity and functionality. Continuous listening is one feature on the horizon. With this capability, malicious actors could train assistants to listen to audio outside their purview, harming users' privacy and security. How can this misbehavior be detected? In many cases, identification may rely on human abilities. But how good are humans at this task? To investigate, we developed a Wizard of Oz interface that allowed users to perform real-time "Test Drives" of three different always-listening services. We then conducted a study with 200 participants, seeing whether they could detect one of four types of malicious apps. We studied the behavior of individuals, as well as groups working collaboratively, also investigating the effects of task framing on performance. Our paper reports on people's effectiveness and their experiences with this novel transparency mechanism.

  • California Consumer Privacy Act (CCPA) compliance. In this work, we investigated CCPA compliance for 160 top-ranked Android mobile app developers from the U.S. Google Play Store. The CCPA requires developers to provide accurate privacy notices and to respond to "right to know" requests by disclosing personal information that they have collected, used, or shared about consumers for a business or commercial purpose. We found that at least 39% of the apps we studied shared device-specific identifiers, and at least 26% shared geolocation information, with third parties without disclosing it in response to our request. This work was submitted to the 23rd Privacy Enhancing Technologies Symposium (PETS 2023). PETS brings together privacy experts from around the world to discuss recent advances and new perspectives on research in privacy technologies.

    • Measuring the Compliance of Android App Developers with the California Consumer Privacy Act (CCPA). Abstract: The California Consumer Privacy Act (CCPA) provides California residents with a range of enhanced privacy protections and rights. Our research investigated the extent to which Android app developers comply with the provisions of the CCPA that require them to provide consumers with accurate privacy notices and respond to "right to know" requests by disclosing personal information that they have collected, used, or shared about consumers for a business or commercial purpose. We compared the actual network traffic of 109 apps that we believe must comply with the CCPA to the data that apps state they collect in their privacy policies and the data contained in responses to "right to know" requests that we submitted to the app's developers. Of the 69 app developers who substantively replied to our requests, all but one provided specific pieces of personal data (as opposed to only categorical information). However, a significant percentage of apps collected information that was not disclosed, including identifiers (64 apps, 93%), geolocation data (25 apps, 36%), and customer records information (13 apps, 19%) among other categories. We discuss improvements to the CCPA that could help app developers comply with "right to know" and related regulations.

COMMUNITY ENGAGEMENTS

The SOUPS 2022 full proceedings, presentation slides, and videos of the presentations are free and open to the public on the technical sessions page (https://www.usenix.org/conference/soups2022/technical-sessions).

EDUCATIONAL ADVANCES

  • Several undergraduate and graduate students assisted with this research.