An Approach for Peer-to-Peer Federated Learning

Title: An Approach for Peer-to-Peer Federated Learning
Publication Type: Conference Paper
Year of Publication: 2021
Authors: Wink, Tobias; Nochta, Zoltan
Conference Name: 2021 51st Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN-W)
Keywords: Collaboration, data protection, Deep Learning, federated learning, human factors, machine learning, Metrics, Mission critical systems, Neural networks, peer-to-peer security, privacy, pubcrawl, resilience, Resiliency, Scalability, security, Stochastic processes, Training
Abstract: We present a novel approach for the collaborative training of neural network models in decentralized federated environments. In an iterative process, a group of autonomous peers runs multiple training rounds to train a common model. Participants perform all model training steps, such as stochastic gradient descent optimization, locally on their private, e.g. mission-critical, training datasets. Based on the locally updated models, participants can jointly determine a common model by averaging the associated model weights without sharing the actual weight values. For this purpose we introduce a simple n-out-of-n secret sharing scheme and an algorithm to calculate average values in a peer-to-peer manner. Our experimental results with deep neural networks on well-known sample datasets demonstrate the general applicability of the approach with regard to model quality. Since there is no need to involve a central service provider in model training, the approach can help establish trustworthy collaboration platforms for businesses with high security and data protection requirements.
DOI: 10.1109/DSN-W52860.2021.00034
Citation Key: wink_approach_2021
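
The n-out-of-n secret sharing and peer-to-peer averaging mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' reference implementation; the function names and the use of additive (sum-to-zero) random shares are assumptions chosen to show how peers could average model weights without revealing individual values.

```python
# Hypothetical sketch of additive n-out-of-n secret sharing for model averaging.
# Each peer splits its private weight vector into n random-looking shares that
# sum to the original vector, distributes one share per peer, and only the
# aggregated sums are ever exchanged in the clear.
import numpy as np


def make_shares(weights: np.ndarray, n: int, rng: np.random.Generator):
    """Split a weight vector into n random shares whose sum is the original vector."""
    shares = [rng.normal(size=weights.shape) for _ in range(n - 1)]
    shares.append(weights - sum(shares))  # final share makes the total exact
    return shares


def secure_average(local_weights, seed: int = 0) -> np.ndarray:
    """Compute the average of private weight vectors from shares only."""
    rng = np.random.default_rng(seed)
    n = len(local_weights)
    # shares[i][j] is the share that peer i sends to peer j
    shares = [make_shares(w, n, rng) for w in local_weights]
    # each peer j sums the shares it received and publishes only that partial sum
    partial_sums = [sum(shares[i][j] for i in range(n)) for j in range(n)]
    # the partial sums reveal only the aggregate; dividing by n gives the average
    return sum(partial_sums) / n


if __name__ == "__main__":
    peers = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
    print(secure_average(peers))   # -> [3. 4.]
    print(np.mean(peers, axis=0))  # same result, computed in the clear
```

Because all n shares are required to reconstruct any peer's weights, no single participant (or proper subset of participants) learns another peer's local model, which matches the n-out-of-n property described in the abstract.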