Scalable Privacy Analysis

Project Details

Lead PI

Performance Period

Jan 01, 2018 - Jan 01, 2018

Institution(s)

International Computer Science Institute


One major shortcoming of the current "notice and consent" privacy framework is that the constraints on data usage stated in policies (be they stated privacy practices, regulations, or laws) cannot easily be compared against the technologies that they govern. To that end, we are developing a framework to automatically compare policy against practice. Broadly, this involves identifying the relevant data usage policies and practices in a given domain, then measuring the real-world exchanges of data restricted by those rules. The results of this method will then be used to measure and predict the harms to the data's subjects and holders in the event of its unauthorized use. In doing so, we will be able to infer which specific protected pieces of information, which prohibited operations on that data, and which aggregations thereof pose the highest risks relative to other items covered by the policy. This will shed light on the relationship between the unwanted collection of data, its usage and dissemination, and the resulting negative consequences.
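The policy-versus-practice comparison described above can be sketched as a simple rule check. The rule schema, field names, and example flows below are hypothetical illustrations for exposition, not the project's actual data model:

```python
# Hypothetical sketch: compare observed data flows against stated policy rules.
# The PolicyRule/ObservedFlow schema is an illustrative assumption.

from dataclasses import dataclass


@dataclass(frozen=True)
class PolicyRule:
    data_type: str               # e.g., "location", "device_id"
    allowed_parties: frozenset   # destinations the policy permits


@dataclass(frozen=True)
class ObservedFlow:
    data_type: str
    destination: str             # where the data was actually sent


def find_violations(rules, flows):
    """Return flows that send restricted data to a party the policy forbids."""
    allowed_by_type = {r.data_type: r.allowed_parties for r in rules}
    return [
        f for f in flows
        if f.data_type in allowed_by_type
        and f.destination not in allowed_by_type[f.data_type]
    ]


rules = [
    PolicyRule("location", frozenset({"first-party"})),
    PolicyRule("device_id", frozenset({"first-party", "analytics.example"})),
]
flows = [
    ObservedFlow("location", "ads.example"),         # not permitted by policy
    ObservedFlow("device_id", "analytics.example"),  # permitted by policy
]

print([(v.data_type, v.destination) for v in find_violations(rules, flows)])
```

In a real system the rules would be extracted from policy text or regulation and the flows from instrumentation logs; the point here is only the shape of the comparison.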

We have built infrastructure into the Android operating system, heavily instrumenting the permission-checking APIs and adding network-monitoring functionality. This allows us to observe when an application attempts to access protected data (e.g., PII, persistent identifiers, etc.) and what it does with that data. Unlike static analysis techniques, which only detect the potential for certain behaviors (e.g., data exfiltration), executing applications with our instrumentation yields real-time observations of actual privacy violations. The drawback, however, is that applications must be executed, and broad code coverage is desired. To date, we have demonstrated that many privacy violations are detectable when application user interfaces are "fuzzed" using random input. However, there are many open research questions about how to achieve better code coverage so as to detect a wider range of privacy-related events, and how to do so in a scalable manner. Toward that end, we plan to virtualize our privacy testbed and integrate crowd-sourcing. In doing so, we will develop new methods for performing privacy experiments that are repeatable, rigorous, and generalizable. The results of these experiments can then be used to implement data-driven privacy controls, address gaps in regulation, and enforce existing regulations.
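The core of the dynamic-analysis pipeline described above is correlating logged accesses to permission-protected data with subsequent network transmissions that carry the same values. A minimal sketch follows; the log record format is a hypothetical simplification of what instrumented-OS logs might contain, not the actual instrumentation output:

```python
# Hypothetical sketch: flag potential exfiltration by checking whether values
# read via permission-protected APIs later appear in outbound network traffic.
# The log record format below is an illustrative assumption.


def detect_exfiltration(access_log, network_log):
    """Return (data_type, destination) pairs where a value obtained from a
    permission-protected API appears in an outbound payload."""
    # Map each accessed value to the type of protected data it represents.
    tainted = {rec["value"]: rec["data_type"] for rec in access_log}
    hits = []
    for pkt in network_log:
        for value, data_type in tainted.items():
            if value in pkt["payload"]:
                hits.append((data_type, pkt["destination"]))
    return hits


access_log = [
    {"data_type": "IMEI", "value": "356938035643809"},
    {"data_type": "location", "value": "37.8716,-122.2727"},
]
network_log = [
    {"destination": "tracker.example", "payload": "id=356938035643809&v=2"},
    {"destination": "cdn.example", "payload": "GET /logo.png"},
]

print(detect_exfiltration(access_log, network_log))
```

Substring matching is only the simplest case; real transmissions are often hashed or encoded, which is one reason runtime instrumentation at the API boundary is valuable.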

Serge Egelman is Research Director of the Usable Security & Privacy Group at the International Computer Science Institute (ICSI) and also holds an appointment in the Department of Electrical Engineering and Computer Sciences (EECS) at the University of California, Berkeley. He leads the Berkeley Laboratory for Usable and Experimental Security (BLUES), which is the amalgamation of his ICSI and UCB research groups. Serge's research focuses on the intersection of privacy, computer security, and human-computer interaction, with the specific aim of better understanding how people make decisions surrounding their privacy and security, and then creating data-driven improvements to systems and interfaces. This has included human subjects research on social networking privacy, access controls, authentication mechanisms, web browser security warnings, and privacy-enhancing technologies. His work has received multiple best paper awards, including seven ACM CHI Honorable Mentions, the 2012 Symposium on Usable Privacy and Security (SOUPS) Distinguished Paper Award for his work on smartphone application permissions, the 2017 SOUPS Impact Award, and the 2012 Information Systems Research Best Published Paper Award for his work on consumers' willingness to pay for online privacy. He received his PhD from Carnegie Mellon University and prior to that was an undergraduate at the University of Virginia. He has also performed research at NIST, Brown University, Microsoft Research, and Xerox PARC.