Operationalizing Contextual Integrity - July 2018
PI(s), Co-PI(s), Researchers: Serge Egelman, Primal Wijesekera, Irwin Reyes, Julia Bernd, and Maritza Johnson (ICSI); Helen Nissenbaum (Cornell Tech)
HARD PROBLEM(S) ADDRESSED
Human Behavior: We are designing human subjects studies to examine how privacy perceptions change as a function of contextual privacy norms. Our goal is to design and develop future privacy controls that are highly usable because their designs are informed by empirical research.
Metrics: We seek to build models of human behavior by studying it in both the laboratory and the field. These models will inform the design of future privacy controls.
Policy-Governed Secure Collaboration: One goal of this project is to examine how policies surrounding the acceptable use of personal data can be adapted to support the theory of contextual integrity.
Scalability and Composability: Ultimately, our goal is to be able to design systems that function on contextual integrity's principles, by automatically inferring privacy norms from one context and applying them to future contexts.
PUBLICATIONS
Irwin Reyes, Primal Wijesekera, Joel Reardon, Amit Elazari Bar On, Abbas Razaghpanah, Narseo
Vallina-Rodriguez, and Serge Egelman. "Won't Somebody Think of the Children?" Examining COPPA Compliance at Scale. Proceedings on Privacy Enhancing Technologies (PoPETs), 2018(3):63-83.
Primal Wijesekera, Joel Reardon, Irwin Reyes, Lynn Tsai, Jung-Wei Chen, Nathan Good, David Wagner, Konstantin Beznosov, and Serge Egelman. "Contextual Permission Models for Better Privacy Protection." Symposium on Applications of Contextual Integrity, 2018.
Julia Bernd, Serge Egelman, Maritza Johnson, Nathan Malkin, Franziska Roesner, Madiha Tabassum, and Primal Wijesekera. "Studying User Expectations about Data Collection and Use by In-Home Smart Devices." Symposium on Applications of Contextual Integrity, 2018.
Nathan Malkin, Primal Wijesekera, Serge Egelman, and David Wagner. "Use Case: Passively Listening Personal Assistants." Symposium on Applications of Contextual Integrity, 2018.
KEY HIGHLIGHTS
The main effort this quarter fell into three categories: improving infrastructure to allow us to study privacy behaviors in situ, long-term project planning to examine new ways of applying the theory of contextual integrity to privacy controls for emergent technologies (e.g., in-home IoT devices), and constructing educational materials based on our research findings for use in the classroom.
This quarter we submitted the final version of our paper on COPPA compliance at scale to the Privacy Enhancing Technologies Symposium (PETS), and then presented the work in June. This was one of only two papers that were accepted for publication without mandatory revisions. This work documents the implementation of our dynamic analysis platform, which allows us to examine the privacy behaviors of Android apps under realistic conditions. As a proof of concept, we applied our infrastructure to detecting violations of the Children's Online Privacy Protection Act (COPPA), finding that a majority of Android apps in the Google Play Store directed at children appear to be violating federal law. In May, we presented a version of this work for feedback at the Privacy Law Scholars Conference (PLSC). We are in the process of adapting and improving this infrastructure to support future research activities, such as:
- Examining compliance with other privacy regulations, including the General Data Protection Regulation (GDPR) in the EU, as well as various state laws (e.g., CalOPPA in California).
- Using natural language processing to compare observed app privacy behaviors with stated practices in privacy policies.
- Creating an API to allow others to use our infrastructure, including regulatory agencies with which we are already collaborating.
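At its core, the policy-comparison activity above amounts to checking observed app behavior against disclosed behavior. The following minimal sketch illustrates the idea; the function, data-type labels, and example values are all invented for illustration and do not reflect our actual infrastructure or its API.

```python
# Hypothetical sketch: flag data types an app was observed transmitting
# that its privacy policy does not disclose. All names and values below
# are invented for illustration.

def undisclosed_flows(observed, disclosed):
    """Return observed data types that are missing from the policy's disclosures."""
    return sorted(set(observed) - set(disclosed))

# Example inputs: one list from dynamic analysis of network traffic,
# one extracted (e.g., via NLP) from the app's privacy policy text.
observed = ["android_id", "email", "location"]
disclosed = ["email"]

print(undisclosed_flows(observed, disclosed))  # -> ['android_id', 'location']
```

In practice the hard problems are upstream of this set difference: attributing network transmissions to specific data types, and reliably extracting disclosed practices from free-text policies.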
We are currently planning future studies in the domain of in-home IoT devices, to explore users' current privacy needs, the capabilities of current devices, and the design of future privacy controls. Our ultimate goal is to design new privacy controls that are grounded in the theory of contextual integrity so that they can automatically infer contextual norms and handle data-sharing and disclosure on a per-use basis. Toward this end, we have designed several studies surrounding both current commercially available in-home personal assistants (e.g., Google Home, Amazon Echo, etc.) and prototypes of future devices that we expect to see. As part of this, we submitted three papers to the Symposium on Applications of Contextual Integrity, to be hosted by Co-PI Helen Nissenbaum in September at Princeton. All three were accepted, and we look forward to both disseminating our current results and brainstorming our future directions with participants.
Finally, we have also begun work to develop educational materials based on the above research. Maritza Johnson and Julia Bernd (ICSI) have been developing a privacy and security curriculum suitable for K-12 classroom use. As part of this, Johnson has made connections with GenCyber programs in Virginia (where she is based), and hopes to pilot these materials with high school teachers over the coming year.
COMMUNITY ENGAGEMENTS
We presented our research at the Privacy Enhancing Technologies Symposium (PETS) this month, and had three papers accepted for presentation at the Symposium on Applications of Contextual Integrity in September. We also presented results at the Privacy Law Scholars Conference (PLSC) in May.
EDUCATIONAL ADVANCES
We have been developing a privacy and security curriculum suitable for K-12 classrooms, and expect to pilot it later this year.