Reduce Privacy Risks

Breakthrough: Collaborative: Secure Algorithms for Cyber-Physical Systems

Modern systems such as the electric smart grid consist of both cyber and physical components that must work together; these are called cyber-physical systems, or CPS. Securing such systems goes beyond cyber security or physical security alone into cyber-physical security. While the threats multiply within a CPS, physical aspects can also reduce the threat space. Unlike purely cyber systems, such as the Internet, CPS are grounded in physical reality.

CAREER: Differentially-Private Machine Learning with Applications to Biomedical Informatics

Machine learning on large-scale patient medical records can lead to the discovery of novel population-wide patterns, enabling advances in genetics, disease mechanisms, drug discovery, healthcare policy, and public health. However, concerns over patient privacy prevent biomedical researchers from running their algorithms on large volumes of patient data, creating a barrier to important new discoveries through machine learning. The goal of this project is to address this barrier by developing privacy-preserving tools to query, cluster, classify, and analyze medical databases.
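
The abstract does not detail the project's specific mechanisms. As a toy illustration of the kind of primitive such tools build on, the sketch below applies the standard Laplace mechanism to a counting query over hypothetical patient records; the field names and the epsilon value are illustrative assumptions, not part of the project.

```python
import numpy as np

def dp_count(records, predicate, epsilon=0.5, rng=None):
    """Differentially private count: true count plus Laplace noise.

    A counting query has sensitivity 1 (adding or removing one patient
    changes the count by at most 1), so noise drawn from
    Laplace(scale=1/epsilon) gives epsilon-differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    true_count = sum(1 for r in records if predicate(r))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical patient records; the field names are illustrative only.
patients = [{"age": 54, "diagnosis": "T2D"},
            {"age": 61, "diagnosis": "T2D"},
            {"age": 47, "diagnosis": "CAD"}]

noisy = dp_count(patients, lambda r: r["diagnosis"] == "T2D", epsilon=0.5)
print(f"noisy count of T2D patients: {noisy:.2f}")
```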

CAREER: Privacy-preserving learning for distributed data

Medical technologies such as imaging and sequencing make it possible to gather massive amounts of information at ever-lower cost. Sharing data from studies can advance scientific understanding and improve healthcare outcomes. Concern about patient privacy, however, can preclude open data sharing, thus hampering progress in understanding stigmatized conditions such as mental health disorders.

CAREER: Privacy-Guaranteed Distributed Interactions in Critical Infrastructure Networks

Information sharing between operators (agents) in critical infrastructure systems such as the Smart Grid is fundamental to reliable and sustained operation. The contention, however, between sharing data for system stability and reliability (utility) and withholding data for competitive advantage (privacy) has stymied data sharing in such systems, sometimes with catastrophic consequences. This motivates a data sharing framework that addresses the competitive interests and information leakage concerns of agents and enables timely and controlled information exchange.

CAREER: Privacy Analytics for Users in a Big Data World

Increasing amounts of data are being collected about users, and increasingly sophisticated analytics are being applied to this data for various purposes. Privacy analytics are machine learning and data mining algorithms applied by end users to their own data to help them manage both their private information and their self-presentation.

TWC: Frontier: Privacy Tools for Sharing Research Data

Information technology, advances in statistical computing, and the deluge of data available through the Internet are transforming computational social science. However, a major challenge is maintaining the privacy of human subjects. This project is a broad, multidisciplinary effort to help enable the collection, analysis, and sharing of sensitive data while providing privacy for individual subjects.

TC: Large: Collaborative Research: Privacy-Enhanced Secure Data Provenance

Data provenance refers to the history of the contents of an object and its successive transformations. Knowledge of data provenance is beneficial for many purposes, such as enhancing data trustworthiness, facilitating accountability, verifying compliance, aiding forensics, and enabling more effective access and usage controls. At a minimum, provenance data needs integrity assurance to realize these benefits.
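
As one minimal illustration of integrity assurance for provenance (not the project's actual design), the sketch below hash-chains each provenance record to its predecessor so that later tampering with the recorded history is detectable; the record fields and actors are assumptions made for the example.

```python
import hashlib
import json
import time

def append_provenance(chain, actor, action, obj_digest):
    """Append a record whose hash covers the previous record's hash,
    so altering any earlier record breaks the chain (integrity only;
    this toy sketch does not address confidentiality or signatures)."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {
        "actor": actor,          # who performed the transformation
        "action": action,        # what was done to the object
        "object": obj_digest,    # digest of the object's contents
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)
    return chain

def verify(chain):
    """Recompute each record's hash and check the back-links."""
    prev = "0" * 64
    for rec in chain:
        body = {k: v for k, v in rec.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != rec["hash"] or rec["prev_hash"] != prev:
            return False
        prev = rec["hash"]
    return True

chain = []
append_provenance(chain, "alice", "created", hashlib.sha256(b"v1").hexdigest())
append_provenance(chain, "bob", "anonymized", hashlib.sha256(b"v2").hexdigest())
print(verify(chain))  # True unless a record is altered after the fact
```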

TC: Large: Collaborative Research: Practical Secure Two-Party Computation: Techniques, Tools, and Applications

Many compelling applications involve computations that require sensitive data from two or more individuals. For example, as the cost of personal genome sequencing rapidly plummets, many genetics applications will soon be within reach of individuals, such as comparing one's genome with the genomes of different groups of study participants to determine which treatment is likely to be most effective. Such comparisons could have tremendous value, but they are currently infeasible because of privacy concerns for both the individual and the study participants.
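
The project's specific techniques are not described in this abstract. As a toy illustration of the general idea of computing on data that is never revealed, the sketch below uses additive secret sharing, one basic building block of secure computation, to let two parties learn only the sum of their private values; the modulus and the example inputs are assumptions for the demo.

```python
import secrets

Q = 2**61 - 1  # public prime modulus; all arithmetic is done mod Q

def share(x):
    """Split a private value into two additive shares mod Q.
    Each share alone is uniformly random and reveals nothing about x."""
    r = secrets.randbelow(Q)
    return r, (x - r) % Q

def reconstruct(s0, s1):
    return (s0 + s1) % Q

# Party A and Party B each hold a private count (hypothetical values).
a, b = 17, 29
a0, a1 = share(a)   # A keeps a0 and sends a1 to B
b0, b1 = share(b)   # B keeps b1 and sends b0 to A

# Addition is local: each party adds the shares it holds.
sum0 = (a0 + b0) % Q   # held by A
sum1 = (a1 + b1) % Q   # held by B

# Only the final result is opened; the individual inputs are never exchanged.
print(reconstruct(sum0, sum1))  # 46
```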

TWC SBE: Small: Towards an Economic Foundation of Privacy-Preserving Data Analytics: Incentive Mechanisms and Fundamental Limits

The commoditization of private data has been trending upward as big data analytics plays an increasingly critical role in advertising, scientific research, and other domains. It is becoming increasingly difficult to know how data may be used, or to retain control over data about oneself. One common practice for collecting private data is based on "informed consent": data subjects (individuals) decide whether or not to report their data based on who is collecting the data, what data is collected, and how the data will be used.