arXiv Analytics

arXiv:2309.13643 [cs.LG]

REWAFL: Residual Energy and Wireless Aware Participant Selection for Efficient Federated Learning over Mobile Devices

Y. Li, X. Qin, J. Geng, R. Chen, Y. Hou, Y. Gong, M. Pan, P. Zhang

Published 2023-09-24, Version 1

Participant selection (PS) helps to accelerate federated learning (FL) convergence, which is essential for the practical deployment of FL over mobile devices. However, most existing PS approaches focus on improving training accuracy and efficiency while neglecting the residual energy of mobile devices, which fundamentally determines whether the selected devices can participate at all. Meanwhile, the impact of mobile devices' heterogeneous wireless transmission rates on PS and FL training efficiency is largely ignored. Moreover, PS causes a staleness issue: prior research exploits isolated functions to force long-neglected devices to participate, which is decoupled from the original PS designs. In this paper, we propose REWAFL, a residual energy and wireless aware PS design for efficient FL training over mobile devices. REWAFL introduces a novel PS utility function that jointly considers global FL training utility and local energy utility, integrating the energy consumption and residual battery energy of candidate mobile devices. Under the proposed PS utility function framework, REWAFL further presents a residual energy and wireless aware local computing policy. In addition, REWAFL embeds its staleness solution directly in the utility function and local computing policy rather than bolting it on as a separate mechanism. Experimental results show that REWAFL is effective in improving training accuracy and efficiency while avoiding "flat battery" failures on mobile devices.
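
The abstract describes the joint PS utility in words but gives no formula. Purely as an illustration, the sketch below shows one plausible way such a selection score could combine a global training utility with wireless rate, energy cost, residual battery, and a staleness boost. The multiplicative form, the weights, and every field and function name here are assumptions for exposition, not the paper's actual design.

```python
# Hypothetical sketch of a REWAFL-style participant-selection utility.
# The weighting scheme, field names, and staleness term are illustrative
# assumptions; the paper's exact formulation is not given in this abstract.

from dataclasses import dataclass


@dataclass
class Candidate:
    loss_utility: float          # global training utility, e.g. local loss magnitude
    tx_rate_mbps: float          # heterogeneous wireless uplink rate of the device
    energy_cost_j: float         # estimated energy to complete one FL round (joules)
    residual_energy_j: float     # remaining battery energy (joules)
    rounds_since_selected: int   # age since last participation, to counter staleness


def ps_utility(c: Candidate, alpha: float = 1.0, beta: float = 1.0,
               gamma: float = 0.1) -> float:
    """Joint score: training value scaled by wireless rate, discounted by
    energy pressure, with a multiplicative staleness boost folded in."""
    if c.energy_cost_j >= c.residual_energy_j:
        return 0.0  # the round would drain the battery flat; never select
    # Local energy utility: fraction of battery left after the round.
    energy_utility = 1.0 - c.energy_cost_j / c.residual_energy_j
    # Long-neglected devices gain utility, so staleness is handled inside
    # the same score rather than by a separate forcing function.
    staleness_boost = 1.0 + gamma * c.rounds_since_selected
    return ((c.loss_utility ** alpha) * c.tx_rate_mbps
            * (energy_utility ** beta) * staleness_boost)


def select_participants(cands: list[Candidate], k: int) -> list[int]:
    """Pick the indices of the k candidates with the highest joint utility."""
    ranked = sorted(range(len(cands)),
                    key=lambda i: ps_utility(cands[i]), reverse=True)
    return ranked[:k]
```

The hard zero for devices whose round cost exceeds their remaining charge mirrors the paper's stated goal of avoiding "flat battery" outcomes, while the staleness boost shows how a staleness remedy can live inside the selection score itself instead of an isolated mechanism.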

Related articles:
arXiv:2101.04866 [cs.LG] (Published 2021-01-13)
Towards Energy Efficient Federated Learning over 5G+ Mobile Devices
arXiv:2201.01601 [cs.LG] (Published 2022-01-05, updated 2022-05-26)
FedBalancer: Data and Pace Control for Efficient Federated Learning on Heterogeneous Clients
arXiv:2306.11426 [cs.LG] (Published 2023-06-20)
Exploring the Performance and Efficiency of Transformer Models for NLP on Mobile Devices