differential privacy

TWC SBES: Medium: Utility for Private Data Sharing in Social Science

One of the keys to scientific progress is the sharing of research data. When the data contain information about human subjects, however, there are strong incentives not to share. The biggest concern is privacy: specific information about individuals must be protected at all times. Recent advances in mathematical notions of privacy have raised the hope that data can be properly sanitized and distributed to other research groups without revealing information about any individual. For this effort to be worthwhile, the sanitized data must remain useful for statistical analysis.
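The kind of sanitization alluded to here can be illustrated with the Laplace mechanism, a standard differentially private primitive for releasing aggregate statistics. The sketch below is a minimal illustration, not code from this project; the function names and the age data are assumptions made for the example. Each value is clipped to a known range so the sum has bounded sensitivity, and calibrated Laplace noise is added before release.

```python
import math
import random

def laplace_noise(scale):
    # Sample from a Laplace(0, scale) distribution via inverse CDF.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_mean(values, lower, upper, epsilon):
    """Release a differentially private mean of bounded values.

    Clipping each value to [lower, upper] bounds the sensitivity of the
    sum by (upper - lower), so adding Laplace noise with scale
    sensitivity / epsilon to the sum satisfies epsilon-DP.
    """
    clipped = [min(max(v, lower), upper) for v in values]
    sensitivity = upper - lower
    noisy_sum = sum(clipped) + laplace_noise(sensitivity / epsilon)
    return noisy_sum / len(values)

# Hypothetical survey data: respondent ages, clipped to [0, 100].
ages = [34, 45, 29, 61, 52, 38, 47, 55]
print(private_mean(ages, lower=0.0, upper=100.0, epsilon=1.0))
```

Smaller epsilon means stronger privacy but noisier output, which is exactly the privacy/utility tension the project describes.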

TWC: Medium: Collaborative: Re[DP]: Realistic Data Mining Under Differential Privacy

The collection and analysis of personal data about individuals has revolutionized information systems and fueled the US and global economies. But privacy concerns regarding the use of such data loom large. Differential privacy has emerged as a gold standard for mathematically characterizing the privacy risks of algorithms that use personal data. Yet adoption of differentially private algorithms by industry and government agencies has been startlingly rare.

CAREER: PROTEUS: A Practical and Rigorous Toolkit for Privacy

Statistical privacy, or the problem of disclosing aggregate statistics about data collected from individuals while ensuring the privacy of individual-level sensitive properties, is an important problem in today's age of big data. The key challenge in statistical privacy is that applications for data collection and analysis operate on varied kinds of data and have diverse requirements for both the information that must be kept secret and the adversaries they must tolerate.

Differentially Private Average Consensus: Obstructions, Trade-Offs, and Optimal Algorithm Design

This paper studies the multi-agent average consensus problem under the requirement of differential privacy of the agents' initial states against an adversary that has access to all the messages. We first establish that a differentially private consensus algorithm cannot guarantee convergence of the agents' states to the exact average in distribution, which in turn implies the same impossibility for other stronger notions of convergence.
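The trade-off behind this impossibility result can be seen in a small simulation. The sketch below is illustrative only and is not the paper's algorithm: four agents on a ring run linear average consensus, but every broadcast message is perturbed with geometrically decaying Laplace noise (all parameter values are assumptions made for the example). The agents still reach agreement, but the agreed value is a random perturbation of the true average rather than the exact average.

```python
import math
import random

def laplace_noise(scale):
    # Sample from a Laplace(0, scale) distribution via inverse CDF.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_ring_consensus(x0, steps=200, step_size=0.3,
                         noise_scale=1.0, decay=0.9):
    """Average consensus on a ring where each broadcast message is
    perturbed with Laplace noise whose scale decays geometrically."""
    n = len(x0)
    x = list(x0)
    for k in range(steps):
        # Each agent broadcasts a noisy version of its current state.
        msgs = [xi + laplace_noise(noise_scale * decay ** k) for xi in x]
        # Each agent moves toward the average of its two neighbours' messages.
        x = [xi + step_size * ((msgs[i - 1] + msgs[(i + 1) % n]) / 2.0 - xi)
             for i, xi in enumerate(x)]
    return x

x0 = [1.0, 5.0, 3.0, 7.0]  # exact average is 4.0
final = noisy_ring_consensus(x0)
spread = max(final) - min(final)            # agents agree (spread is tiny)...
error = abs(sum(final) / len(final) - 4.0)  # ...but typically not on 4.0 exactly
print(spread, error)
```

Because the noise decays, the agents converge to a common value; because noise was injected at all, that value is random, matching the result that exact convergence to the average cannot be guaranteed even in distribution.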