Contextual Integrity for Computer Systems - April 2022

PI(s), Co-PI(s), Researchers: Michael Tschantz (ICSI), Helen Nissenbaum (Cornell Tech)

HARD PROBLEM(S) ADDRESSED
Scalability and Composability, Policy-Governed Secure Collaboration

- PUBLICATIONS

- KEY HIGHLIGHTS

Firstly, we have continued our work on privacy risk assessments and their role in systems design, as well as, more generally, on privacy management within organizations. Subjecting privacy design to a risk-assessment approach within organizations is prone to produce unexpected outcomes, namely design choices and strategies that do not contribute to the protection of privacy. In our previous analysis we relied on contextual integrity as a theoretical framework to reveal critical flaws in these approaches: vague definitions and a lack of actuarial models, an inadequate understanding of social and network effects, and a disregard for the conflicting interests of the organization, on the one hand, and broader society, on the other. These findings suggest that risk assessment, while useful, should play a narrower role in privacy engineering and in organizational management at large. In addition to these earlier findings, our more recent results indicate that contextual integrity is useful not only for revealing the shortcomings of privacy risk assessments, but also for practically addressing them.

Privacy, as a concept that largely eludes the realm of empirical science, is hard to pin down through actuarial models or empirical analyses. Scholars of risk and public policy have long noted that concepts that cannot be empirically and intersubjectively tested are poor candidates for risk-assessment approaches. Instead, theoretical models grounded in tried and tested societal processes often provide a more reliable set of guiding principles. Contextual integrity, as one such theoretical model that relies on established norms and values, is therefore a better candidate to guide the design of privacy-respecting systems. Hence, in our current research we aim to understand how contextual integrity may complement (or altogether replace) risk assessment as a design and policy instrument, and to understand the interplay between the two in the context of privacy design and policy.
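To make the structure of the framework concrete, the sketch below encodes contextual integrity's five-parameter norms (sender, subject, recipient, information type, transmission principle) and checks hypothetical information flows against them. This is a minimal illustration under simplifying assumptions, not the project's actual tooling; the class names, the wildcard convention, and the healthcare example are all invented for the purpose of this example.

```python
from dataclasses import dataclass

# Illustrative encoding only: contextual integrity constrains information
# flows along five parameters. Real contexts involve far richer norms.

@dataclass(frozen=True)
class Flow:
    sender: str
    subject: str
    recipient: str
    info_type: str
    transmission_principle: str

@dataclass(frozen=True)
class Norm:
    sender: str
    subject: str
    recipient: str
    info_type: str
    transmission_principle: str

    def permits(self, flow: Flow) -> bool:
        # "*" is a wildcard matching any value for that parameter (an
        # assumption of this sketch, not part of the theory itself).
        return all(
            pattern in ("*", value)
            for pattern, value in [
                (self.sender, flow.sender),
                (self.subject, flow.subject),
                (self.recipient, flow.recipient),
                (self.info_type, flow.info_type),
                (self.transmission_principle, flow.transmission_principle),
            ]
        )

def conforms(flow: Flow, norms: list[Norm]) -> bool:
    """In this toy model, a flow preserves contextual integrity if at
    least one entrenched norm of the context permits it."""
    return any(norm.permits(flow) for norm in norms)

# Hypothetical healthcare context: patients share diagnoses with
# physicians under confidentiality, but not with advertisers for sale.
norms = [Norm("patient", "patient", "physician", "diagnosis", "confidentiality")]
print(conforms(Flow("patient", "patient", "physician", "diagnosis", "confidentiality"), norms))  # True
print(conforms(Flow("clinic", "patient", "advertiser", "diagnosis", "sale"), norms))             # False
```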

Secondly, we have continued our research on the relationship between differential privacy and contextual integrity. Despite its appealing formal properties and promises, differential privacy remains hard to interpret and to accommodate within existing data analysis practices and processes. In our work we attempt to determine how differential privacy and contextual integrity may complement each other. Differential privacy makes no assumptions about the legitimacy of data collection and analysis, whereas contextual integrity provides a conceptual framework for making such a determination. Similarly, differential privacy provides a mechanism to trade off data utility against potentially privacy-invasive information disclosure, yet it is oblivious to what degree of such a trade-off is acceptable; contextual integrity may help us determine whether and to what extent the trade-off is desirable or acceptable. Moreover, differential privacy encodes a partial, narrower angle on the more capacious and general notion of privacy that contextual integrity captures. We draw on contextual integrity to better understand the hidden assumptions and limitations that underlie differential privacy as a mechanism, such as distributional effects and impacts that arise after data collection and analysis. Lastly, our research more generally probes several points of friction between differential privacy advocates and theorists, on the one hand, and the larger research community, particularly privacy scholars and data analysts, on the other.
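As a concrete illustration of the utility/privacy trade-off discussed above, the sketch below applies the standard Laplace mechanism to a counting query at several privacy budgets and reports the resulting average error. This is a generic textbook example, not the project's analysis; the statistic, the epsilon values, and the sample sizes are arbitrary assumptions made for the example.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release a noisy statistic satisfying epsilon-differential privacy
    by adding Laplace noise with scale sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# A counting query has sensitivity 1: adding or removing one individual
# changes the count by at most 1.
true_count = 1234  # hypothetical value

mean_abs_error = {}
for epsilon in (0.1, 1.0, 10.0):
    noisy = np.array([
        laplace_mechanism(true_count, sensitivity=1.0, epsilon=epsilon)
        for _ in range(10_000)
    ])
    mean_abs_error[epsilon] = float(np.mean(np.abs(noisy - true_count)))

# Smaller epsilon gives a stronger privacy guarantee but larger expected
# error. Differential privacy itself does not say which point on this
# curve is acceptable; that is the kind of judgment contextual integrity
# can inform.
print(mean_abs_error)
```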

- COMMUNITY ENGAGEMENTS

"Defining and Applying Privacy as Contextual Integrity," Data Privacy Speaker Series, School of Information and EU Center, University of Illinois, Urbana Champagne, March 22, 2022

"Contextual Integrity Up and Down the Data Food Chain," Philosophy Department Spring Colloquium, University of Michigan, March 19, 2022

"Privacy as Contextual Integrity," Keystone Strategy, (Remote) March 11, 2022

"Contextual Integrity Up and Down the Data (food) Chain", Workshop on Privacy in Machine Learning, NeurIPS, December 2021.

- EDUCATIONAL ADVANCES