Governance for Big Data - October 2022

PI(s), Co-PI(s), Researchers:

  • Serge Egelman (ICSI)
  • Julia Bernd (ICSI)

HARD PROBLEM(S) ADDRESSED
Human Behavior, Policy-Governed Secure Collaboration

PUBLICATIONS

  • Presented:
    Mohammad Tahaei, Julia Bernd, and Awais Rashid. 2022. Privacy, Permissions, and the Health App Ecosystem: A Stack Overflow Exploration. In Proceedings of the 2022 European Symposium on Usable Security (EuroUSEC '22). Association for Computing Machinery, New York, NY, USA, 117-130. https://doi.org/10.1145/3549015.3555669

KEY HIGHLIGHTS

  • Our paper with our colleagues Mohammad Tahaei and Awais Rashid at the University of Bristol, "Privacy, Permissions, and the Health App Ecosystem: A Stack Overflow Exploration," described in our January 2022 report, analyzes Stack Overflow posts by health app developers to illuminate how the data governance mechanisms of various stakeholders, especially the major mobile platforms, shape those developers' approach to health app privacy. The paper was presented at EuroUSEC '22, which brings together an interdisciplinary group of researchers and practitioners in human-computer interaction, security, and privacy.
  • We are continuing our survey study, first described in our April 2022 report, that examines the relationship between U.S. consumers' expectations about how different types of apps will handle user data and their assumptions about sector-specific laws regulating the handling of health data. The study examines users' expectations and preferences about who is, and who should be, responsible for regulating data collection and handling, and whether those expectations and preferences differ for health data versus other types of data.
    • We revised the survey based on insights from user-testing walkthroughs and a small pilot.
    • We collected data from 300 participants. We showed each participant one of six possible app descriptions (the made-up apps had different purposes, some medical or health-related and some not) and asked them to make some guesses about the app's data practices. We then asked a series of questions about whether those practices were legal and allowed by app stores (and whether they should be), along with questions designed to confirm whether participants viewed the apps or data practices as medical/health-related.
    • We are beginning quantitative analysis of the data; an illustrative sketch of the kind of between-condition comparison involved appears after this list.
  • We are also contributing to the design of a separate survey (led by others in our research group) that will compare users' expectations about the data practices and privacy policies of actual health apps with those apps' observed behavior, based on network traffic analysis.
  • We have started a new study that will evaluate the claims data brokers make about consent for data collection. We plan to ask data brokers whether we can purchase or obtain free samples of anonymized data sets that were collected with consent. We will then deanonymize the data and survey the people from whom it was collected to find out what they recall about granting consent. We are in the process of identifying data brokers and are consulting with lawyers on the legal aspects of deanonymizing the data.
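
As a concrete illustration of the between-condition comparison mentioned above, the following is a minimal, hypothetical sketch in Python (pandas and SciPy). The file name and column names (responses.csv, condition, believes_regulated) and the health/non-health condition labels are placeholders invented for this example, not our actual survey instrument or analysis plan.

    # Hypothetical sketch: test whether participants who saw a health-related
    # app description are more likely to believe the described data practices
    # are legally regulated than participants who saw a non-health description.
    # All file and column names below are placeholders.
    import pandas as pd
    from scipy.stats import chi2_contingency

    # responses.csv: one row per participant (~300 rows), recording the assigned
    # app-description condition and coded answers to the regulation questions.
    df = pd.read_csv("responses.csv")

    # Collapse the six conditions into health vs. non-health app descriptions.
    health_conditions = {"fitness_tracker", "symptom_checker", "medication_log"}  # placeholder labels
    df["is_health_app"] = df["condition"].isin(health_conditions)

    # 2x2 contingency table: condition type vs. belief that the practice is regulated.
    table = pd.crosstab(df["is_health_app"], df["believes_regulated"])
    chi2, p, dof, _ = chi2_contingency(table)
    print(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

A chi-square test of independence is just one simple way to check whether beliefs about regulation differ between participants shown health-related and non-health app descriptions; the actual analysis may use different variables and statistical tests.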

COMMUNITY ENGAGEMENTS

  • Research presentations, as described above.
  • Egelman was asked by Senate oversight committee staff to comment on recent FTC actions surrounding data brokers, and is also working on comments for the FTC's Advance Notice of Proposed Rulemaking (ANPR) concerning governance for big data and commercial surveillance.

EDUCATIONAL ADVANCES

  • Our user study is being led by a UC Berkeley computer science graduate student.