
Title: Differential Privacy in Linear Distributed Control Systems: Entropy Minimizing Mechanisms and Performance Tradeoffs
Publication Type: Journal Article
Year of Publication: 2017
Authors: Yu Wang, University of Illinois at Urbana-Champaign; Zhenqi Huang, University of Illinois at Urbana-Champaign; Sayan Mitra, University of Illinois at Urbana-Champaign; Geir Dullerud, University of Illinois at Urbana-Champaign
Journal: IEEE Transactions on Control of Network Systems
Volume: 4
Start Page: 118
Issue: 1
Date Published: 03/2017
Keywords: Communication networks, decision/estimation theory, Differential privacy, distributed algorithms/control, NSA SoS Lablets Materials, science of security, Static-Dynamic Analysis of Security Metrics for Cyber-Physical Systems, UIUC
Abstract

In distributed control systems with shared resources, participating agents can improve the overall performance of the system by sharing data about their personal preferences. In this paper, we formulate and study a natural tradeoff arising in these problems between the privacy of the agents' data and the performance of the control system. We formalize privacy in terms of differential privacy of agents' preference vectors. The overall control system consists of N agents with linear discrete-time coupled dynamics, each controlled to track its preference vector. Performance of the system is measured by the mean squared tracking error. We present a mechanism that achieves differential privacy by adding Laplace noise to the shared information in a way that depends on the sensitivity of the control system to the private data. We show that for stable systems the performance cost of using this type of privacy-preserving mechanism grows as O(T³/(Nε²)), where T is the time horizon and ε is the privacy parameter. For unstable systems, the cost grows exponentially with time. From an estimation point of view, we establish a lower bound on the entropy of any unbiased estimator of the private data from any noise-adding mechanism that gives ε-differential privacy. We show that the mechanism achieving this lower bound is a randomized mechanism that also uses Laplace noise.
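
As a rough illustration of the kind of noise-adding mechanism the abstract describes, the Python sketch below perturbs an agent's shared value with Laplace noise whose scale is the sensitivity divided by ε. It is a minimal sketch of the generic Laplace mechanism, not the paper's exact construction; the function name, the example sensitivity value, and the sample state are illustrative assumptions.

```python
# Minimal sketch of a Laplace noise-adding mechanism for epsilon-differential
# privacy. The noise scale is calibrated to the sensitivity of the shared
# quantity to one agent's private preference vector (sensitivity is assumed
# to be computed elsewhere for the specific control system).
import numpy as np

def laplace_mechanism(shared_value, sensitivity, epsilon, rng=None):
    """Return shared_value perturbed with Laplace noise of scale sensitivity/epsilon.

    shared_value : array-like, the data an agent would broadcast
    sensitivity  : float, worst-case change in shared_value when one agent's
                   private preference vector changes (problem-specific)
    epsilon      : float, privacy parameter (smaller means more private)
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    noise = rng.laplace(loc=0.0, scale=scale, size=np.shape(shared_value))
    return np.asarray(shared_value, dtype=float) + noise

if __name__ == "__main__":
    # Hypothetical example: privatize a 2-dimensional shared state at one step.
    state = np.array([0.4, -1.2])
    noisy = laplace_mechanism(state, sensitivity=0.5, epsilon=1.0)
    print(noisy)
```

Repeating such a perturbation at every time step over a horizon T is what drives the O(T³/(Nε²)) tracking-error cost mentioned in the abstract for stable systems: smaller ε requires larger noise scale, and the noise accumulates through the closed-loop dynamics.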

Citation Key: node-34798