Biblio

Bao, Xianglin, Su, Cheng, Xiong, Yan, Huang, Wenchao, Hu, Yifei. 2019. FLChain: A Blockchain for Auditable Federated Learning with Trust and Incentive. 2019 5th International Conference on Big Data Computing and Communications (BIGCOM). :151–159.

Federated learning (FL), recently proposed by Google, is a privacy-preserving method for integrating distributed data trainers. FL is extremely useful because it preserves privacy, lowers latency, reduces power consumption, and produces smarter models, but it can fail if multiple trainers abort training or send malformed messages to their partners. Such misbehavior is not auditable, and the parameter server may compute incorrectly due to a single point of failure. Furthermore, FL has no incentive mechanism to attract sufficient distributed training data and computation power. In this paper, we propose FLChain to build a decentralized, publicly auditable and healthy FL ecosystem with trust and incentive. FLChain replaces the traditional FL parameter server, whose computation result must instead reach consensus on-chain. Our work is non-trivial because providing sufficient incentive and deterrence to distributed trainers is both vital and hard. We achieve model commercialization by providing a healthy marketplace for collaboratively trained models. Honest trainers can gain a fairly partitioned profit from a well-trained model according to their contribution, while malicious trainers can be detected in a timely manner and heavily punished. To reduce the time cost of misbehavior detection and model queries, we design DDCBF to accelerate queries of blockchain-documented information. Finally, we implement a prototype of our work and measure the cost of various operations.
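
As a rough illustration of the contribution-based profit partitioning the abstract describes, the following minimal Python sketch splits a model's profit among trainers in proportion to their recorded contributions and withholds the shares of trainers flagged as malicious. The function name, the contribution metric (e.g. number of training samples), and the penalty rule are assumptions made for illustration; they are not taken from the paper.

```python
# Hypothetical sketch of contribution-proportional profit partitioning.
# The contribution metric and penalty rule are assumptions, not FLChain's
# actual on-chain logic.

def partition_profit(total_profit, contributions, penalized=None):
    """Split total_profit among trainers in proportion to their contribution.

    contributions: dict mapping trainer id -> contribution score
    penalized: set of trainer ids detected as malicious; they receive nothing
    """
    penalized = penalized or set()
    eligible = {t: c for t, c in contributions.items() if t not in penalized}
    total = sum(eligible.values())
    if total == 0:
        return {t: 0.0 for t in contributions}
    shares = {t: total_profit * c / total for t, c in eligible.items()}
    # Malicious trainers forfeit their share entirely.
    shares.update({t: 0.0 for t in penalized})
    return shares


if __name__ == "__main__":
    contributions = {"trainer_a": 5000, "trainer_b": 3000, "trainer_c": 2000}
    # trainer_c is flagged as malicious, so trainer_a and trainer_b
    # split the profit 5:3 and trainer_c gets nothing.
    print(partition_profit(100.0, contributions, penalized={"trainer_c"}))
```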