Biblio

Filters: Author is E. Nozari
E. Nozari, Y. Zhao, J. Cortes.  2018.  Network identification with latent nodes via auto-regressive models. IEEE Transactions on Control of Network Systems.

We consider linear time-invariant networks with unknown interaction topology where only a subset of the nodes, termed manifest, can be directly controlled and observed. The remaining nodes are termed latent and their number is also unknown. Our goal is to identify the transfer function of the manifest subnetwork and determine whether interactions between manifest nodes are direct or mediated by latent nodes. We show that, if there are no inputs to the latent nodes, then the manifest transfer function can be approximated arbitrarily well in the $H_\infty$-norm sense by the transfer function of an auto-regressive model. Motivated by this result, we present a least-squares estimation method to construct the auto-regressive model from measured data. We establish that the least-squares matrix estimate converges in probability to the matrix sequence defining the desired auto-regressive model as the length of data and the model order grow. We also show that the least-squares auto-regressive method guarantees an arbitrarily small $H_\infty$-norm error in the approximation of the manifest transfer function, exponentially decaying once the model order exceeds a certain threshold. Finally, we show that when the latent subnetwork is acyclic, the proposed method achieves perfect identification of the manifest transfer function above a specific model order as the length of the data increases. Various examples illustrate our results.

To appear
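
As a concrete illustration of the least-squares step described in the abstract, here is a minimal NumPy sketch of fitting an order-p auto-regressive model to manifest-node time series. The function name, the data layout (rows are time samples), and the absence of exogenous inputs are simplifying assumptions for illustration; this is not the paper's estimator.

import numpy as np

def fit_var_least_squares(X, p):
    # Least-squares fit of x(k) ~ A_1 x(k-1) + ... + A_p x(k-p) from data X with
    # shape (T, n): T time samples of the n manifest nodes. Returns [A_1, ..., A_p].
    T, n = X.shape
    # Each regressor row stacks the p most recent past states [x(k-1), ..., x(k-p)].
    Phi = np.hstack([X[p - i:T - i, :] for i in range(1, p + 1)])   # shape (T-p, n*p)
    Y = X[p:, :]                                                    # targets x(p), ..., x(T-1)
    Theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)                 # shape (n*p, n)
    return [Theta[(i - 1) * n:i * n, :].T for i in range(1, p + 1)]

# Toy usage: data generated by a stable 3-node linear network, refit with model order 2.
rng = np.random.default_rng(0)
A_true = rng.standard_normal((3, 3))
A_true *= 0.5 / max(abs(np.linalg.eigvals(A_true)))    # rescale to spectral radius 0.5
X = np.zeros((500, 3))
for k in range(1, 500):
    X[k] = A_true @ X[k - 1] + 0.1 * rng.standard_normal(3)
A_hat = fit_var_least_squares(X, p=2)[0]               # first AR matrix approximates A_true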

E. Nozari, P. Tallapragada, J. Cortes.  2017.  Differentially private average consensus: obstructions, trade-offs, and optimal algorithm design. Automatica. 81:221-231.

This paper studies the multi-agent average consensus problem under the requirement of differential privacy of the agents' initial states against an adversary that has access to all messages. As a fundamental limitation, we first establish that a differentially private consensus algorithm cannot guarantee convergence of the agents' states to the exact average in distribution, which in turn implies the same impossibility for other stronger notions of convergence. This result motivates our design of a novel differentially private Laplacian consensus algorithm in which agents linearly perturb their state-transition and message-generating functions with exponentially decaying Laplace noise. We prove that our algorithm converges almost surely to an unbiased estimate of the average of the agents' initial states, compute the exponential mean-square rate of convergence, and formally characterize its differential privacy properties. Furthermore, we also find explicit optimal values of the design parameters that minimize the variance of the algorithm's convergence point around the exact average. Various simulations illustrate our results.
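
The following is a minimal sketch of the kind of noisy consensus iteration the abstract describes: a standard Laplacian update in which each transmitted message is perturbed by Laplace noise whose scale decays geometrically. The parameter names (c, q, step_size) and the single perturbation point are illustrative simplifications, not the paper's algorithm or its optimal design.

import numpy as np

def dp_laplacian_consensus(x0, L, steps=300, step_size=0.05, c=1.0, q=0.9, seed=None):
    # Each agent broadcasts its state corrupted by Laplace(c * q**k) noise and then
    # performs the usual Laplacian averaging update on the noisy messages it receives.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(steps):
        msgs = x + rng.laplace(scale=c * q**k, size=x.size)  # decaying Laplace perturbation
        x = x - step_size * (L @ msgs)                       # Laplacian consensus step
    return x   # ends near a common value: an unbiased estimate of mean(x0)

# Toy usage on a 4-node path graph.
L = np.array([[ 1, -1,  0,  0],
              [-1,  2, -1,  0],
              [ 0, -1,  2, -1],
              [ 0,  0, -1,  1]], dtype=float)
print(dp_laplacian_consensus([1.0, 3.0, 5.0, 7.0], L, seed=1))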

E. Nozari, P. Tallapragada, J. Cortes.  2017.  Differentially private distributed convex optimization via functional perturbation.

We study a class of distributed convex constrained optimization problems in which a group of agents aims to minimize the sum of individual objective functions while each agent desires to keep its function differentially private. We prove the impossibility of achieving differential privacy using strategies based on perturbing the inter-agent messages with noise when the underlying noise-free dynamics is asymptotically stable. This justifies our algorithmic solution based on the perturbation of the individual objective functions with Laplace noise within the framework of functional differential privacy. We carefully design post-processing steps that ensure the perturbed functions regain the smoothness and convexity properties of the original functions while preserving the differential privacy guarantees of the functional perturbation step. This methodology allows any distributed coordination algorithm to be used to solve the optimization problem on the noisy functions. Finally, we explicitly bound the magnitude of the expected distance between the perturbed and true optimizers, and characterize the privacy-accuracy trade-off. Simulations illustrate our results.

To appear
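
As a cartoon of the "perturb the objectives, then run any solver" pipeline described above, the sketch below adds a Laplace-distributed linear term to each agent's convex objective and minimizes the perturbed sum with an off-the-shelf solver. The paper instead perturbs the functions through a basis expansion with careful post-processing to restore smoothness and convexity; the names, the quadratic objectives, and the centralized solver here are stand-in assumptions.

import numpy as np
from scipy.optimize import minimize

def perturb_objective(f, dim, scale=0.1, seed=None):
    # Functional-perturbation cartoon: add a random linear term with Laplace-distributed
    # coefficients. A linear perturbation keeps the perturbed function convex and smooth.
    rng = np.random.default_rng(seed)
    b = rng.laplace(scale=scale, size=dim)
    return lambda x: f(x) + b @ x

# Four agents with private quadratic objectives f_i(x) = 0.5 * ||x - a_i||^2.
rng = np.random.default_rng(2)
anchors = rng.standard_normal((4, 3))
objectives = [lambda x, a=a: 0.5 * np.sum((x - a) ** 2) for a in anchors]
noisy = [perturb_objective(f, dim=3, seed=i) for i, f in enumerate(objectives)]

# Stand-in for "any distributed coordination algorithm": minimize the perturbed sum.
x_noisy = minimize(lambda x: sum(f(x) for f in noisy), np.zeros(3)).x
x_true = anchors.mean(axis=0)              # exact minimizer of the unperturbed sum
print(np.linalg.norm(x_noisy - x_true))    # accuracy cost of the privacy-preserving noise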

E. Nozari, P. Tallapragada, J. Cortes.  2015.  Differentially private average consensus with optimal noise selection. IFAC-PapersOnLine. 48:203-208.

This paper studies the problem of privacy-preserving average consensus in multi-agent systems. The network objective is to compute the average of the initial agent states while keeping these values differentially private against an adversary that has access to all inter-agent messages. We establish an impossibility result that shows that exact average consensus cannot be achieved by any algorithm that preserves differential privacy. This result motivates our design of a differentially private discrete-time distributed algorithm that corrupts messages with Laplacian noise and is guaranteed to achieve average consensus in expectation. We examine how to optimally select the noise parameters in order to minimize the variance of the network convergence point for a desired level of privacy.

IFAC Workshop on Distributed Estimation and Control in Networked Systems, Philadelphia, PA
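
To make the noise-selection trade-off tangible, here is a small Monte Carlo sketch, a generic noisy averaging iteration rather than the paper's algorithm, that estimates the spread of the convergence point around the exact average for two choices of the noise decay rate; all parameter names and values are illustrative assumptions.

import numpy as np

def noisy_consensus_endpoint(x0, W, c, q, steps=200, rng=None):
    # Weighted-averaging consensus in which every transmitted message is corrupted by
    # Laplace(c * q**k) noise; returns the common value the network settles on.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for k in range(steps):
        msgs = x + rng.laplace(scale=c * q**k, size=x.size)
        x = W @ msgs
    return x.mean()

# Empirical bias and variance of the convergence point around the exact average for two
# decay rates: slower decay (larger q) injects more total noise and spreads the endpoint more.
x0 = np.array([1.0, 3.0, 5.0, 7.0])
W = np.full((4, 4), 0.25)                  # doubly stochastic averaging matrix
rng = np.random.default_rng(3)
for q in (0.5, 0.9):
    ends = np.array([noisy_consensus_endpoint(x0, W, c=1.0, q=q, rng=rng) for _ in range(500)])
    print(q, np.mean(ends) - x0.mean(), np.var(ends))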