Biblio

Filters: Keyword is convergence analysis
2021-01-11
Li, Y., Chang, T.-H., Chi, C.-Y..  2020.  Secure Federated Averaging Algorithm with Differential Privacy. 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP). :1–6.
Federated learning (FL), as a recent advance of distributed machine learning, is capable of learning a model over the network without directly accessing the clients' raw data. Nevertheless, the clients' sensitive information can still be exposed to adversaries via differential attacks on messages exchanged between the parameter server and clients. In this paper, we consider the widely used federated averaging (FedAvg) algorithm and propose to enhance data privacy by the differential privacy (DP) technique, which obfuscates the exchanged messages by properly adding Gaussian noise. We analytically show that the proposed secure FedAvg algorithm maintains an O(1/T) convergence rate, where T is the total number of stochastic gradient descent (SGD) updates for local model parameters. Moreover, we demonstrate how various algorithm parameters can impact the algorithm's communication efficiency. Experimental results are presented to justify the obtained analytical results on the performance of the proposed algorithm in terms of testing accuracy.
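The core privacy step the abstract describes, adding calibrated Gaussian noise to exchanged messages, can be sketched as follows. This is a minimal illustration of the Gaussian mechanism applied to client updates before server-side averaging; the function names, clipping threshold, and noise scale are illustrative assumptions, not the paper's exact algorithm or parameter settings.

```python
import math
import random

def privatize_update(update, clip_norm=1.0, noise_std=0.1, rng=None):
    # Clip the local update to bound its L2 sensitivity, then add
    # Gaussian noise (the standard Gaussian mechanism). The defaults
    # here are illustrative, not the paper's settings.
    rng = rng or random.Random()
    norm = math.sqrt(sum(x * x for x in update))
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    return [x * scale + rng.gauss(0.0, noise_std * clip_norm) for x in update]

def fedavg(updates):
    # Server-side federated averaging of the (privatized) client updates.
    n = len(updates)
    return [sum(col) / n for col in zip(*updates)]
```

Clipping bounds each client's influence on the average, which is what lets the added noise translate into a formal DP guarantee.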
2018-09-28
Wei, P., Xia, B., Luo, X..  2017.  Parameter estimation and convergence analysis for a class of canonical dynamic systems by extended Kalman filter. 2017 3rd IEEE International Conference on Control Science and Systems Engineering (ICCSSE). :336–340.

There has been much recent research on parameter estimation for canonical dynamic systems. The extended Kalman filter (EKF) is a popular parameter estimation method by virtue of its ease of application. This paper focuses on parameter estimation for a class of canonical dynamic systems by EKF. By constructing an associated differential equation, the convergence of EKF parameter estimation for the canonical dynamic systems is analyzed, and simulations demonstrate its good performance.
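EKF-based parameter estimation typically works by appending the unknown parameter to the state vector and linearizing the augmented dynamics at each step. The sketch below shows one EKF step for a generic scalar system x_{k+1} = a·x_k + w with measurement y_k = x_k + v, where the unknown gain a is estimated jointly with x; this is a standard illustration of the technique, not the specific system class analyzed in the paper.

```python
import numpy as np

def ekf_param_step(z, P, y, Q, R):
    """One EKF step for x_{k+1} = a*x_k + w,  y_k = x_k + v,
    with the unknown parameter a appended to the state z = [x, a].
    A generic illustration, not the paper's specific system class."""
    x, a = z
    # Predict: propagate the augmented state and linearize around it.
    z_pred = np.array([a * x, a])
    F = np.array([[a, x],
                  [0.0, 1.0]])         # Jacobian of the augmented dynamics
    P_pred = F @ P @ F.T + Q
    # Update with the scalar measurement y = [1, 0] @ z + v.
    H = np.array([[1.0, 0.0]])
    S = H @ P_pred @ H.T + R           # innovation covariance (1x1)
    K = P_pred @ H.T / S               # Kalman gain (2x1)
    z_new = z_pred + (K * (y - z_pred[0])).ravel()
    P_new = (np.eye(2) - K @ H) @ P_pred
    return z_new, P_new
```

Iterating this step over a measurement sequence drives the parameter component of the state toward the true value, provided the system is sufficiently excited.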

2017-10-10
Yuan, Ganzhao, Yang, Yin, Zhang, Zhenjie, Hao, Zhifeng.  2016.  Convex Optimization for Linear Query Processing Under Approximate Differential Privacy. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. :2005–2014.

Differential privacy enables organizations to collect accurate aggregates over sensitive data with strong, rigorous guarantees on individuals' privacy. Previous work has found that under differential privacy, computing multiple correlated aggregates as a batch, using an appropriate strategy, may yield higher accuracy than computing each of them independently. However, finding the best strategy that maximizes result accuracy is non-trivial, as it involves solving a complex constrained optimization program that appears to be non-convex. Hence, in the past much effort has been devoted to solving this non-convex optimization program. Existing approaches include various sophisticated heuristics and expensive numerical solutions. None of them, however, is guaranteed to find the optimal solution of this optimization problem. This paper points out that under (ε, δ)-differential privacy, the optimal solution of the above constrained optimization problem in search of a suitable strategy can be found, rather surprisingly, by solving a simple and elegant convex optimization program. Then, we propose an efficient algorithm based on Newton's method, which we prove to always converge to the optimal solution with a linear global convergence rate and a quadratic local convergence rate. Empirical evaluations demonstrate the accuracy and efficiency of the proposed solution.
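The convergence pattern the abstract cites, linear progress globally and quadratic convergence near the optimum, is characteristic of damped Newton's method with a backtracking line search. The toy example below illustrates that scheme on a simple smooth convex objective; it is a generic sketch, not the paper's specific strategy-optimization program.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, tol=1e-10, max_iter=100):
    # Backtracking (Armijo) line search guarantees global progress;
    # near the optimum the full step t = 1 is accepted and the
    # iteration converges quadratically.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        step = np.linalg.solve(hess(x), g)   # Newton direction
        if g @ step < tol:                   # squared Newton decrement
            break
        t = 1.0
        while f(x - t * step) > f(x) - 0.25 * t * (g @ step):
            t *= 0.5                         # backtrack until sufficient decrease
        x = x - t * step
    return x
```

On a strongly convex objective such as f(x) = exp(x₁) + exp(−x₁) + x₂², the iterates reach machine-level accuracy in a handful of steps once they enter the quadratic-convergence region.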