Differentially Private Distributed Convex Optimization via Functional Perturbation

Title: Differentially Private Distributed Convex Optimization via Functional Perturbation
Publication Type: Journal Article
Year of Publication: 2017
Authors: E. Nozari, P. Tallapragada, J. Cortes
Abstract

We study a class of distributed constrained convex optimization problems in which a group of agents aims to minimize the sum of individual objective functions while each agent seeks to keep its function differentially private. We prove the impossibility of achieving differential privacy with strategies that perturb the inter-agent messages with noise when the underlying noise-free dynamics are asymptotically stable. This justifies our algorithmic solution, which perturbs the individual objective functions with Laplace noise within the framework of functional differential privacy. We carefully design post-processing steps that ensure the perturbed functions regain the smoothness and convexity properties of the original functions while preserving the differential privacy guarantees of the functional perturbation step. This methodology allows any distributed coordination algorithm to be used to solve the optimization problem on the noisy functions. Finally, we explicitly bound the expected distance between the perturbed and true optimizers, and we characterize the privacy-accuracy trade-off. Simulations illustrate our results.
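The recipe described in the abstract, perturbing each objective function with Laplace noise and then post-processing it to restore convexity before running any distributed optimizer, can be illustrated with a minimal sketch. The snippet below is only an illustration under strong simplifying assumptions: it perturbs the coefficients of a quadratic objective with a placeholder sensitivity bound, rather than using the paper's basis-expansion construction for general convex functions; all function names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_quadratic(Q, b, epsilon, sensitivity=1.0):
    """Illustrative functional perturbation of a quadratic objective
    f(x) = 0.5 x'Qx + b'x by adding Laplace noise to its coefficients.
    'sensitivity' is a placeholder for a problem-specific bound; the
    paper instead perturbs coefficients of an orthonormal expansion."""
    scale = sensitivity / epsilon  # Laplace scale for the chosen privacy level
    Q_noisy = Q + rng.laplace(scale=scale, size=Q.shape)
    b_noisy = b + rng.laplace(scale=scale, size=b.shape)
    return Q_noisy, b_noisy

def project_to_convex(Q_noisy, mu=1e-3):
    """Post-processing step: symmetrize and clip eigenvalues so the
    perturbed quadratic is again (strongly) convex."""
    Q_sym = 0.5 * (Q_noisy + Q_noisy.T)
    eigvals, eigvecs = np.linalg.eigh(Q_sym)
    eigvals = np.clip(eigvals, mu, None)
    return eigvecs @ np.diag(eigvals) @ eigvecs.T

# Example: perturb and repair a 3-dimensional quadratic objective,
# then minimize the resulting noisy-but-convex surrogate.
Q = np.diag([2.0, 1.0, 3.0])
b = np.array([1.0, -1.0, 0.5])
Q_noisy, b_noisy = perturb_quadratic(Q, b, epsilon=0.5)
Q_cvx = project_to_convex(Q_noisy)
x_star_noisy = np.linalg.solve(Q_cvx, -b_noisy)  # minimizer of the perturbed objective
```

Because the convexity repair acts only on the already-noisy coefficients, it is post-processing in the differential privacy sense and therefore does not weaken the privacy guarantee of the perturbation step, which is the property the abstract highlights.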

Notes

To appear

Citation Key: EN-PT-JC:17-tcns