Privacy, theory

NSFSaTC-BSF: TWC: Small: Practical Plausibly Deniable Encryption through Low-Level Storage Device Behavior

Plausibly deniable encryption hides the fact that given data is on a device: that the ability to decrypt it exists, or even that the data exists at all. Plausible deniability is a powerful property for protecting data on devices the user has lost physical control over, such as protecting consumers from accidental mass disclosure of private data through misplaced devices. This issue is of particular concern for anyone who travels internationally with sensitive data, including human rights workers, diplomats, military personnel, or even business travelers.
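A minimal sketch of the hidden-volume idea behind many plausibly deniable storage designs (the toy stream cipher, key-derivation parameters, and layout below are illustrative assumptions, not this project's actual scheme): the device is pre-filled with uniformly random bytes, and data is written at a password-derived offset under a password-derived key, so without the password the hidden data is indistinguishable from unused free space.

```python
import hashlib
import hmac
import os

DEVICE_SIZE = 1 << 20   # 1 MiB toy "device"
RESERVED = 4096         # keep writes clear of the device's end

def keystream(key: bytes, length: int) -> bytes:
    # Counter-mode keystream from HMAC-SHA256 (toy cipher, not for real use).
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def derive(password: str, purpose: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), purpose, 100_000)

def write_hidden(device: bytearray, password: str, data: bytes) -> None:
    key = derive(password, b"key")
    offset = int.from_bytes(derive(password, b"offset"), "big") % (DEVICE_SIZE - RESERVED)
    ct = bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))
    device[offset:offset + len(ct)] = ct

def read_hidden(device: bytearray, password: str, length: int) -> bytes:
    key = derive(password, b"key")
    offset = int.from_bytes(derive(password, b"offset"), "big") % (DEVICE_SIZE - RESERVED)
    return bytes(a ^ b for a, b in zip(device[offset:offset + length], keystream(key, length)))

# The whole device starts as random bytes, so ciphertext blends into free space.
device = bytearray(os.urandom(DEVICE_SIZE))
write_hidden(device, "correct horse", b"sensitive notes")
assert read_hidden(device, "correct horse", 15) == b"sensitive notes"
```

Real designs must also withstand an adversary who can snapshot the device at multiple points in time, which is where low-level storage device behavior becomes relevant.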

SBE: Small: Protecting Privacy in Cyberspace: From Neuroscience Investigations to Behavioral Interventions

A key characteristic of cyberspace is the collection of large amounts of data; given cyberspace's hyper-connectivity and the ease of accessing data, people's privacy is increasingly vulnerable. This project aims to enhance the safety and trustworthiness of cyberspace by designing choice-architecture interventions, informed by the neural processes underlying privacy, that help people make better decisions about their privacy in cyberspace.

EAGER: Bridging The Gap between Theory and Practice in Data Privacy

This project aims to bridge the gap between theory and practice in privacy-preserving data sharing and analysis. Data collected by organizations and agencies are a key resource in today's information age. However, the disclosure of those data poses serious threats to individual privacy. While differential privacy provides a solid foundation for techniques that balance privacy and utility in data sharing, a significant gap remains between that theory and its deployment in practice.
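To make the theory side concrete, the classic Laplace mechanism achieves epsilon-differential privacy for a counting query by adding noise scaled to the query's sensitivity. A minimal sketch (the dataset and parameter values are invented for illustration):

```python
import numpy as np

def laplace_count(data, predicate, epsilon: float) -> float:
    # A counting query changes by at most 1 when one record changes
    # (sensitivity 1), so Laplace noise with scale 1/epsilon gives epsilon-DP.
    true_count = sum(1 for x in data if predicate(x))
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

ages = [23, 35, 41, 29, 62, 55]  # invented example data
print(laplace_count(ages, lambda a: a > 40, epsilon=0.5))   # noisy "how many over 40?"
print(laplace_count(ages, lambda a: a > 40, epsilon=0.05))  # stronger privacy, more noise
```

The tension the project targets is visible even here: a smaller epsilon means stronger privacy but noisier, less useful answers.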

TWC: Medium: Collaborative: Re[DP]: Realistic Data Mining Under Differential Privacy

The collection and analysis of individuals' personal data has revolutionized information systems and fueled the US and global economies. But privacy concerns regarding the use of such data loom large. Differential privacy has emerged as a gold standard for mathematically characterizing the privacy risks of algorithms that use personal data. Yet adoption of differentially private algorithms by industry and government agencies has been startlingly rare.
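For reference, the definition behind that gold standard: a randomized algorithm $M$ is $\varepsilon$-differentially private if, for every pair of datasets $D$ and $D'$ differing in one individual's record and every set $S$ of possible outputs,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
```

Intuitively, the output distribution barely changes whether or not any one person's data is included, which bounds what an observer can learn about that person.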

TWC: Medium: Collaborative: Privacy-Preserving Distributed Storage and Computation

This project aims to develop efficient methods for protecting the privacy of computations on outsourced data in distributed settings. The project addresses the design of an outsourced storage framework where the access pattern observed by the storage server gives no information about the actual data accessed by the client and cannot be correlated with external events. For example, the server cannot determine whether a certain item was previously accessed by the client or whether a certain algorithm is being executed.
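A deliberately inefficient way to see how an access pattern can carry no information: in a linear-scan scheme, every logical access reads and re-encrypts every block, so the server observes the same pattern no matter which item the client wanted. A toy sketch under assumed parameters (the XOR "cipher" and block layout are illustrative; practical schemes such as tree-based ORAMs achieve the same hiding with polylogarithmic overhead):

```python
import os

NUM_BLOCKS, BLOCK_SIZE = 8, 16

def xor(key: bytes, block: bytes) -> bytes:
    # Toy cipher: XOR with a per-scan key; its only job here is to make
    # rewritten blocks look fresh to the server on every access.
    return bytes(a ^ b for a, b in zip(block, key))

class LinearScanORAM:
    """Every access touches and re-encrypts all blocks, so the pattern the
    server sees is identical regardless of which index the client wanted."""

    def __init__(self):
        self.key = os.urandom(BLOCK_SIZE)
        self.server = [xor(self.key, bytes(BLOCK_SIZE)) for _ in range(NUM_BLOCKS)]

    def access(self, index: int, new_value: bytes | None = None) -> bytes:
        fresh_key = os.urandom(BLOCK_SIZE)
        result = b""
        for i in range(NUM_BLOCKS):            # always a full scan
            plain = xor(self.key, self.server[i])
            if i == index:
                result = plain
                if new_value is not None:
                    plain = new_value
            self.server[i] = xor(fresh_key, plain)
        self.key = fresh_key
        return result

oram = LinearScanORAM()
oram.access(3, b"secret".ljust(BLOCK_SIZE, b"\x00"))
assert oram.access(3).rstrip(b"\x00") == b"secret"
```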

TWC: Small: Provably Enforcing Practical Multi-Layer Policies in Today's Extensible Software Platforms

A defining characteristic of modern personal computing is the trend towards extensible platforms (e.g., smartphones and tablets) that run a large number of specialized applications, many of uncertain quality or provenance. The common security mechanisms available on these platforms are application isolation and permission systems. Unfortunately, it has been repeatedly shown that these mechanisms fail to prevent a range of misbehaviors, including privilege-escalation attacks and information-flow leakage.
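To make "information-flow leakage" concrete, here is a minimal taint-tracking sketch: values carry security labels, labels propagate through computation, and a sink refuses any value labeled above its clearance. The two-level lattice and function names below are generic illustrations, not this project's enforcement mechanism:

```python
from dataclasses import dataclass

PUBLIC, SECRET = 0, 1  # a two-level label lattice

@dataclass(frozen=True)
class Labeled:
    value: object
    label: int

def combine(a: Labeled, b: Labeled) -> Labeled:
    # Any value derived from two inputs carries the join (max) of their labels.
    return Labeled((a.value, b.value), max(a.label, b.label))

def send_to_network(x: Labeled, sink_clearance: int = PUBLIC) -> None:
    # The sink enforces the policy: nothing labeled above its clearance leaves.
    if x.label > sink_clearance:
        raise PermissionError("information-flow policy violation")
    print("sent:", x.value)

contacts = Labeled("alice@example.com", SECRET)
ad_id = Labeled("ad-42", PUBLIC)

send_to_network(ad_id)  # allowed: public data to a public sink
try:
    send_to_network(combine(ad_id, contacts))  # blocked: SECRET would leak
except PermissionError as err:
    print("blocked:", err)
```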

Design, Perception, and Action - Engineering Information Give-Away

The design of social media interfaces greatly shapes how much, and when, people decide to reveal private information. For example, a designer can highlight a new system feature (e.g., your travel history displayed on a map) and show which friends are using this new addition. By making it seem as if sharing is the norm -- after all, your friends are doing it -- the designer signals to end-users that they can and should participate and share information.