Title | On Cascaded Federated Learning for Multi-Tier Predictive Models |
Publication Type | Conference Paper |
Year of Publication | 2021 |
Authors | Alabbasi, Abdulrahman, Ganjalizadeh, Milad, Vandikas, Konstantinos, Petrova, Marina |
Conference Name | 2021 IEEE International Conference on Communications Workshops (ICC Workshops) |
Date Published | June
Keywords | 6G, carrier prediction, Cellular networks, Conferences, federated learning, Metrics, Neural Network, Predictive models, predictive security metrics, privacy, Quantization (signal), split learning, Urban areas, Wireless communication |
Abstract | The performance prediction of user equipment (UE) metrics has many applications in the 5G era and beyond. For instance, throughput prediction can improve carrier selection, the quality of experience (QoE) of adaptive video streaming, and traffic latency. Many studies suggest distributed learning algorithms (e.g., federated learning (FL)) for this purpose. However, in a multi-tier design, features are measured in different tiers, e.g., the UE tier and the gNodeB (gNB) tier. On one hand, neglecting the measurements in one tier results in inaccurate predictions. On the other hand, transmitting the data from one tier to another improves prediction performance at the expense of increased network overhead and privacy risks. In this paper, we propose cascaded FL to enhance UE throughput prediction with minimal network footprint and privacy ramifications (if any). The idea is to introduce feedback to conventional FL in multi-tier architectures. Although we use cascaded FL for UE prediction tasks, the idea is rather general and can be applied to many prediction problems in multi-tier architectures, such as cellular networks. We evaluate the performance of cascaded FL through detailed, 3GPP-compliant simulations of London's city center. Our simulations show that the proposed cascaded FL achieves up to a 54% improvement in normalized gain over conventional FL, at the cost of 1.8 MB without quantization and at no cost with quantization. |
DOI | 10.1109/ICCWorkshops50388.2021.9473881 |
Citation Key | alabbasi_cascaded_2021 |