Bibliography
Distributed data aggregation via summation (counting) helps reveal the insights behind raw data. However, such computation carries a high privacy risk from malicious collusion attacks, in which colluding adversaries infer a victim's private data from the gap between the aggregation outputs and their own source data. Among the defenses against such collusion attacks, Distributed Differential Privacy (DDP) is particularly effective. Specifically, a DDP scheme guarantees global differential privacy (the presence or absence of any single data curator barely affects the aggregation outputs) by enforcing local differential privacy at each data curator. To guarantee the overall privacy of a distributed data aggregation system against malicious collusion attacks, part of the existing work on DDP schemes aims to provide an estimated lower bound on the privacy budget required for global differential privacy. However, two main problems remain: low data utility caused by a large global function sensitivity, and an unknown privacy guarantee when the aggregation sensitivity of the whole system is less than the sum of the individual curators' aggregation sensitivities. To address these problems while preserving distributed differential privacy, we derive a new lower bound on the privacy budget that works with an unconditional aggregation sensitivity of the whole distributed system. Moreover, we study the performance of our privacy bound under different data-update scenarios. Both theoretical and experimental evaluations show that our bound offers better global privacy than the existing work.
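As a minimal illustration of the distributed-noise principle behind such DDP schemes (not the specific bound proposed in this paper), the sketch below uses a standard construction from the DDP literature: the Laplace distribution is infinitely divisible, so each curator can add a small gamma-distributed noise share whose sum across all curators is exactly one Laplace sample calibrated to the global sensitivity and budget. All names and parameter values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def curator_noise_share(n_curators, scale, rng):
    """One curator's local noise share. Across n curators these shares
    sum to a single Laplace(scale) sample, because the difference of two
    Gamma(1/n, scale) variables divides the Laplace distribution into
    exactly n independent pieces."""
    return rng.gamma(1.0 / n_curators, scale) - rng.gamma(1.0 / n_curators, scale)

def ddp_sum(values, epsilon, sensitivity):
    """Each curator perturbs its value locally; the aggregate of the noisy
    values carries exactly Laplace(sensitivity/epsilon) noise overall,
    giving epsilon-DP for the global sum without any trusted aggregator."""
    n = len(values)
    scale = sensitivity / epsilon
    noisy = [v + curator_noise_share(n, scale, rng) for v in values]
    return sum(noisy)

# Illustrative use: 100 curators, per-curator counts bounded by 1, epsilon = 0.5.
values = rng.integers(0, 2, size=100)
print(ddp_sum(values.tolist(), epsilon=0.5, sensitivity=1.0))
```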
This paper proposes a novel privacy-preserving smart metering system for aggregating distributed smart meter data. It addresses two important challenges: (i) individual users wish to publish sensitive smart metering data for specific purposes, and (ii) an untrusted aggregator aims to make queries on the aggregate data. We handle these challenges with two main techniques. First, we propose the Fourier Perturbation Algorithm (FPA) and the Wavelet Perturbation Algorithm (WPA), which combine Fourier/Wavelet transformation with distributed differential privacy (DDP) to protect the released statistics with provable sensitivity and error bounds. Second, we leverage an exponential ElGamal encryption mechanism to enable secure communication between the users and the untrusted aggregator. Standard differential privacy techniques perform poorly on time-series data, incurring Θ(n) noise to answer n queries and rendering the answers practically useless when n is large. Our proposed distributed differential privacy mechanism instead relies on Gaussian principles to generate distributed noise, guaranteeing differential privacy for each user with only O(1) error while remaining computationally simple and scalable. Compared with the Gaussian Perturbation Algorithm (GPA), which adds distributed Gaussian noise directly to the original data, the experimental results demonstrate the superiority of the proposed FPA and WPA, which add noise to the transformed coefficients instead.
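A minimal sketch of the Fourier-perturbation idea follows: transform the series, keep only the first k DFT coefficients, perturb those, and invert. The noise calibration below (Laplace noise on the real and imaginary parts, with the budget split over the k retained coefficients) is an illustrative assumption, not the paper's exact sensitivity analysis, and all function and parameter names are hypothetical.

```python
import numpy as np

def fpa(series, k, epsilon, sensitivity, rng):
    """Fourier Perturbation Algorithm sketch: compress an n-point series
    to its first k DFT coefficients, add Laplace noise there, and
    reconstruct. Perturbing k coefficients instead of n points is what
    keeps the per-query error roughly O(1) rather than Θ(n)."""
    n = len(series)
    coeffs = np.fft.rfft(series)
    kept = coeffs[:k].copy()
    # Illustrative calibration: divide the budget across the k retained
    # coefficients; the paper derives tighter, provable bounds.
    scale = k * sensitivity / epsilon
    kept += rng.laplace(0.0, scale, k) + 1j * rng.laplace(0.0, scale, k)
    padded = np.zeros_like(coeffs)
    padded[:k] = kept
    return np.fft.irfft(padded, n=n)

rng = np.random.default_rng(1)
readings = rng.uniform(0.0, 5.0, 96)   # e.g. one day of 15-minute meter readings
private = fpa(readings, k=8, epsilon=1.0, sensitivity=5.0, rng=rng)
```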
Sharing and computing on sensitive data in distributed settings, from healthcare to finance, is a major challenge due to security and privacy concerns. Secure multiparty computation (SMC) is a viable remedy: it allows distributed parties to compute jointly while learning nothing about one another's data beyond the final result. Although SMC is instrumental in such distributed settings, it provides no guarantee that information about individuals will not leak to adversaries. Differential privacy (DP) can be utilized to address this; however, achieving SMC together with DP is not a trivial task either. In this paper, we propose a novel Secure Multiparty Distributed Differentially Private (SM-DDP) protocol for secure and private computation in a multiparty environment. Specifically, our protocol simultaneously achieves SMC and DP in distributed settings, focusing on linear regression over horizontally distributed data. That is, parties do not see each other's data and, further, cannot infer information about individuals from the final statistical model. Any statistical model function whose local statistics can be calculated independently can be computed through our protocol. The protocol implements homomorphic encryption for SMC and the functional mechanism for DP to achieve the desired security and privacy guarantees. In this work, we first introduce the theoretical foundation of the SM-DDP protocol and then evaluate its efficacy and performance on two different datasets. Our results show that individual-level privacy can be achieved through the proposed protocol with distributed DP, applied independently by each party in a distributed fashion. Moreover, our results also show that the SM-DDP protocol incurs minimal computational overhead, is scalable, and provides the promised security and privacy guarantees.
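To make the SMC-plus-DP combination concrete, here is a minimal sketch of the additive-aggregation idea. The SM-DDP protocol itself uses homomorphic encryption and the functional mechanism; this sketch plainly substitutes additive secret sharing for the encryption layer and distributed Gaussian noise for the functional mechanism, so every name, constant, and parameter below is illustrative rather than the paper's construction.

```python
import secrets
import numpy as np

P = 2**61 - 1      # Mersenne prime field for additive secret sharing
SCALE = 10**6      # fixed-point encoding factor for real-valued statistics

def encode(x):
    return round(x * SCALE) % P

def decode(x):
    # Map back from the field, treating large residues as negatives.
    if x > P // 2:
        x -= P
    return x / SCALE

def make_shares(value, n_parties):
    """Split an encoded value into n additive shares mod P; any n-1
    shares are uniformly random and reveal nothing about the value."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def sm_ddp_sum(local_stats, sigma, rng):
    """Each party adds Gaussian noise N(0, sigma^2 / m) locally, so the
    total noise on the sum is N(0, sigma^2); it then secret-shares its
    noisy value. The aggregator only ever sees sums of shares."""
    m = len(local_stats)
    noisy = [x + rng.normal(0.0, sigma / np.sqrt(m)) for x in local_stats]
    all_shares = [make_shares(encode(x), m) for x in noisy]
    # Each party sums one share from every other party and publishes it;
    # combining the published partial sums reveals only the noisy total.
    column_sums = [sum(col) % P for col in zip(*all_shares)]
    return decode(sum(column_sums) % P)

rng = np.random.default_rng(2)
print(sm_ddp_sum([3.2, 1.7, 4.4, 2.9], sigma=1.0, rng=rng))
```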