The Semantics of Privacy: From Privacy Policy Analysis to Code-Level Enforcement

Presented as part of the 2017 HCSS conference.

ABSTRACT

Security mechanisms, such as access control, provide confidentiality by restricting access to sensitive data. Unlike confidentiality, however, privacy is a quality that extends beyond the traditional software boundary and includes the kinds of decisions that can be made about an individual and the effects those decisions have on the individual's lifestyle. In addition, the incentives that often conflict with privacy are frequently tied to business models that depend on data access. In this respect, recent guidance¹ from the U.S. government has shifted the emphasis toward a context-based conception of privacy, in which the original purposes for which data is collected are preserved and propagated to verify that subsequent uses are compliant, and to account for the privacy risk to the individual. This new emphasis highlights the need for an empirical privacy semantics, which organizations can use to reliably predict how their data collection, use and sharing practices affect personal privacy.

To address this challenge, we designed a domain-specific language, called Eddy, that has a formal semantics expressed in Description Logic and enables reasoning over privacy practices commonly found in online privacy policies. This includes checking whether a policy violates the OECD collection or use limitation principles, which have been an international standard for over 35 years [1]. Using Eddy, data users can express their needs in the context of a larger privacy policy framework maintained by their organization. The framework supports sharing information with third parties and allows users to check the OECD properties across third-party data flows and within third-party policies. This research has surfaced an unbounded semantics problem [2, 3], wherein each party uses slightly different terminology to describe and regulate personal data through policies, a potential source of policy ambiguity and inconsistency and an obstacle to formal analysis. To align policy analysis with system analysis, we extended our framework to check mobile app source code for privacy policy violations using static and dynamic analysis [4], and to measure privacy risk to individuals in order to inform developers about how to prioritize privacy controls as data sensitivity increases [5].
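
To make the kind of reasoning described above concrete, the following Python sketch illustrates, under simplifying assumptions, how a repurposing check over collection and use statements might work: a toy subsumption hierarchy stands in for the Description Logic ontology, and a use is flagged when its purpose is not covered by any collection of that data (a simplified reading of the OECD use limitation principle). The statement format, hierarchy, and function names here are invented for illustration and are not Eddy's actual syntax or implementation.

```python
# Illustrative sketch only: a simplified, hypothetical stand-in for the kind
# of policy reasoning Eddy performs. The real semantics are expressed in
# Description Logic; here a small child -> parent hierarchy approximates the
# subsumption relation over data types and purposes.
SUBSUMES = {
    "email-address": "contact-information",
    "phone-number": "contact-information",
    "contact-information": "personal-information",
    "targeted-advertising": "advertising",
    "advertising": "any-purpose",
    "order-fulfillment": "any-purpose",
}

def is_subsumed(term, ancestor):
    """True if `term` equals `ancestor` or is a descendant of it."""
    while term is not None:
        if term == ancestor:
            return True
        term = SUBSUMES.get(term)
    return False

# Hypothetical policy statements: (action, data type, purpose).
collections = [("COLLECT", "email-address", "order-fulfillment")]
uses = [("USE", "email-address", "targeted-advertising")]

def detect_repurposing(collections, uses):
    """Flag uses whose data and purpose are not covered by any collection."""
    violations = []
    for _, data, use_purpose in uses:
        covered = any(
            is_subsumed(data, c_data) and is_subsumed(use_purpose, c_purpose)
            for _, c_data, c_purpose in collections
        )
        if not covered:
            violations.append((data, use_purpose))
    return violations

print(detect_repurposing(collections, uses))
# [('email-address', 'targeted-advertising')]: the email address was collected
# for order fulfillment but used for targeted advertising, a potential
# repurposing violation.
```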

--

Travis D. Breaux is an Associate Professor of Computer Science, appointed in the Institute for Software Research of the School of Computer Science at Carnegie Mellon University. Dr. Breaux's research program develops new methods and tools for writing correct software specifications and for ensuring that software systems conform to those specifications in a transparent, reliable and trustworthy manner. This includes demonstrating compliance with U.S. and international privacy and security laws, policies and standards. He is the Director of the Requirements Engineering Laboratory at Carnegie Mellon University, has published extensively in ACM- and IEEE-sponsored journals and conference proceedings, and is a member of the ACM SIGSOFT, the IEEE Computer Society and the USACM Public Policy Committee.

References:

[1] Travis D. Breaux, Daniel Smullen, Hanan Hibshi. "Detecting Repurposing and Over-collection in Multi-Party Privacy Requirements Specifications." IEEE 23rd International Requirements Engineering Conference (RE'15), pp. 166-175, Sep. 2015.

[2] Mitra Bokaei Hosseini, Travis D. Breaux, Jianwei Niu. "Inferring Ontology Fragments from Semantic Typing of Lexical Variants." In Submission: 25th IEEE International Requirements Engineering Conference (RE'17), 2017.

[3] Morgan Evans, Jaspreet Bhatia, Sudarshan Wadkar, Travis D. Breaux. "An Evaluation of Constituency-based Hyponymy Extraction from Privacy Policies." In Submission: 25th IEEE International Requirements Engineering Conference (RE'17), 2017.

[4] R. Slavin, X. Wang, M.B. Hosseini, W. Hester, R. Krishnan, J. Bhatia, T.D. Breaux, J. Niu. "Toward a Framework for Detecting Privacy Policy Violations in Android Application Code." ACM/IEEE 38th International Conference on Software Engineering (ICSE'16), pp. 25-36, 2016.

[5] Jaspreet Bhatia, Travis D. Breaux, Joel R. Reidenberg, Thomas B. Norton. "A Theory of Vagueness and Privacy Risk Perception," 24th IEEE International Requirements Engineering Conference (RE'16), pp. 26-35, 2016.

¹ See the 2016 NITRD National Privacy Research Strategy, the NISTIR 8062 Introduction to Privacy Engineering and Risk Management in Federal Systems, the 2014 White House Big Data Report, and the 2010 FTC Protecting Consumer Privacy in an Era of Rapid Change Report.

License: Creative Commons 2.5
