Biblio

Zhang, Maojun, Zhu, Guangxu, Wang, Shuai, Jiang, Jiamo, Zhong, Caijun, Cui, Shuguang.  2021.  Accelerating Federated Edge Learning via Optimized Probabilistic Device Scheduling. 2021 IEEE 22nd International Workshop on Signal Processing Advances in Wireless Communications (SPAWC). :606–610.
The popular federated edge learning (FEEL) framework allows privacy-preserving collaborative model training via frequent exchange of learning updates between edge devices and the server. Due to the limited bandwidth, only a subset of devices can upload their updates in each communication round. This has led to an active research area in FEEL studying the optimal device scheduling policy for minimizing communication time. However, owing to the difficulty of quantifying the exact communication time, prior work in this area can only tackle the problem partially by considering either the number of communication rounds or the per-round latency, while the total communication time is determined by both metrics. To close this gap, we make the first attempt in this paper to formulate and solve the communication time minimization problem. We first derive a tight bound to approximate the communication time through a cross-disciplinary effort involving both learning theory for convergence analysis and communication theory for per-round latency analysis. Building on this analytical result, an optimized probabilistic scheduling policy is derived in closed form by solving the approximate communication time minimization problem. It is found that the optimized policy gradually shifts its priority from suppressing the remaining communication rounds to reducing per-round latency as the training process evolves. The effectiveness of the proposed scheme is demonstrated via a use case on collaborative 3D object detection in autonomous driving.
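A minimal sketch of what probabilistic device scheduling in federated averaging can look like, assuming independent Bernoulli scheduling and a toy rate-based probability assignment; the probabilities, latency model, and local updates below are illustrative placeholders, not the paper's closed-form policy or convergence bound.

```python
import numpy as np

# Minimal sketch (not the paper's derivation): probabilistic device scheduling
# for federated averaging. Each round, device i is scheduled independently with
# probability p[i]; scheduled updates are re-weighted by 1/p[i] so the
# aggregated update stays an unbiased estimate of the full aggregation.

rng = np.random.default_rng(0)
num_devices, dim, rounds = 10, 5, 3

data_frac = np.full(num_devices, 1.0 / num_devices)   # aggregation weights w_i
rate = rng.uniform(1.0, 5.0, num_devices)              # assumed uplink rates (arbitrary units)
p = rate / rate.sum()                                  # toy scheduling probabilities (placeholder)

global_model = np.zeros(dim)
for t in range(rounds):
    # Hypothetical local updates (stand-ins for local SGD results).
    local_updates = rng.normal(size=(num_devices, dim))

    scheduled = rng.random(num_devices) < p            # Bernoulli scheduling
    agg = np.zeros(dim)
    for i in np.flatnonzero(scheduled):
        agg += (data_frac[i] / p[i]) * local_updates[i]  # unbiased re-weighting

    global_model += agg
    # Per-round latency set by the slowest scheduled device (toy model).
    latency = (1.0 / rate[scheduled]).max() if scheduled.any() else 0.0
    print(f"round {t}: scheduled={scheduled.sum()} devices, latency~{latency:.2f}")
```

In the paper's setting, the scheduling probabilities would instead come from the optimized closed-form policy that trades off remaining communication rounds against per-round latency as training progresses.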