Biblio

Filters: Author is Miklau, Gerome
2020-01-06
Ghayyur, Sameera, Chen, Yan, Yus, Roberto, Machanavajjhala, Ashwin, Hay, Michael, Miklau, Gerome, Mehrotra, Sharad.  2018.  IoT-Detective: Analyzing IoT Data Under Differential Privacy. Proceedings of the 2018 International Conference on Management of Data. :1725–1728.
Emerging IoT technologies promise to bring revolutionary changes to many domains including health, transportation, and building management. However, continuous monitoring of individuals threatens privacy. The success of IoT thus depends on integrating privacy protections into IoT infrastructures. This demonstration adapts a recently-proposed system, PeGaSus, which releases streaming data under the formal guarantee of differential privacy, with a state-of-the-art IoT testbed (TIPPERS) located at UC Irvine. PeGaSus protects individuals' data by introducing distortion into the output stream. While PeGaSus has been shown to offer lower numerical error compared to competing methods, assessing the usefulness of the output is application dependent. The goal of the demonstration is to assess the usefulness of private streaming data in a real-world IoT application setting. The demo consists of a game, IoT-Detective, in which participants carry out visual data analysis tasks on private data streams, earning points when they achieve results similar to those on the true data stream. The demo will educate participants about the impact of privacy mechanisms on IoT data while at the same time generating insights into privacy-utility trade-offs in IoT applications.
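The abstract describes releasing a stream of counts under differential privacy by introducing distortion into the output. A minimal sketch of that general idea, assuming per-timestep Laplace noise on a sensitivity-1 count stream (this is an illustrative baseline, not the PeGaSus algorithm itself, whose partition-and-smooth mechanism is described in the paper; all function names here are hypothetical):

```python
import math
import random

def laplace_noise(scale):
    """Sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def release_stream(true_counts, epsilon):
    """Release a stream of counts under event-level differential privacy.

    Each individual's event changes at most one count by at most 1
    (sensitivity 1), so adding independent Laplace(1/epsilon) noise to
    each count gives epsilon-DP for the whole released stream.
    """
    scale = 1.0 / epsilon
    return [c + laplace_noise(scale) for c in true_counts]
```

The privacy-utility trade-off the demo explores is visible here directly: smaller epsilon means larger noise scale and more distortion in the released stream.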
2017-05-22
Hay, Michael, Machanavajjhala, Ashwin, Miklau, Gerome, Chen, Yan, Zhang, Dan.  2016.  Principled Evaluation of Differentially Private Algorithms Using DPBench. Proceedings of the 2016 International Conference on Management of Data. :139–154.
Differential privacy has become the dominant standard in the research community for strong privacy protection. There has been a flood of research into query answering algorithms that meet this standard. Algorithms are becoming increasingly complex, and in particular, the performance of many emerging algorithms is data dependent, meaning the distribution of the noise added to query answers may change depending on the input data. Theoretical analysis typically only considers the worst case, making empirical study of average case performance increasingly important. In this paper we propose a set of evaluation principles which we argue are essential for sound evaluation. Based on these principles we propose DPBench, a novel evaluation framework for standardized evaluation of privacy algorithms. We then apply our benchmark to evaluate algorithms for answering 1- and 2-dimensional range queries. The result is a thorough empirical study of 15 published algorithms on a total of 27 datasets that offers new insights into algorithm behavior, in particular the influence of dataset scale and shape, and a more complete characterization of the state of the art. Our methodology is able to resolve inconsistencies in prior empirical studies and place algorithm performance in context through comparison to simple baselines. Finally, we pose open research questions which we hope will guide future algorithm design.
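The abstract advocates placing algorithm performance in context via comparison to simple baselines. A hedged sketch of that style of empirical evaluation, assuming the simplest "identity" baseline for 1-D range queries (add Laplace noise to each histogram cell, then sum noisy cells) and a plain mean-absolute-error metric; this is not DPBench's actual code, and the function names are illustrative:

```python
import math
import random

def laplace_noise(scale):
    """Sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_histogram(hist, epsilon):
    """Identity baseline: Laplace(1/epsilon) noise per cell (sensitivity 1)."""
    return [c + laplace_noise(1.0 / epsilon) for c in hist]

def mean_abs_range_error(hist, epsilon, trials=100):
    """Average |noisy answer - true answer| over all 1-D ranges and trials.

    A simple error metric of the kind a benchmark can report per dataset,
    enabling like-for-like comparison of algorithms against this baseline.
    """
    n = len(hist)
    total, count = 0.0, 0
    for _ in range(trials):
        noisy = noisy_histogram(hist, epsilon)
        for lo in range(n):
            for hi in range(lo + 1, n + 1):
                total += abs(sum(noisy[lo:hi]) - sum(hist[lo:hi]))
                count += 1
    return total / count
```

Because noise is added per cell, the error of this baseline grows with range length, which is exactly the kind of data- and workload-dependent behavior a standardized benchmark is meant to surface.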