arXiv:2103.11619 [cs.LG]

Server Averaging for Federated Learning

George Pu, Yanlin Zhou, Dapeng Wu, Xiaolin Li

Published 2021-03-22 (Version 1)

Federated learning allows distributed devices to collectively train a model without sharing their local datasets with a central server. The global model is optimized by training local models on all participants and averaging their parameters. However, the improved privacy of federated learning also introduces challenges, including higher computation and communication costs; in particular, federated learning converges more slowly than centralized training. We propose the server averaging algorithm to accelerate convergence. Server averaging constructs the shared global model by periodically averaging a set of previous global models. Our experiments indicate that server averaging not only reaches a target accuracy faster than federated averaging (FedAvg), but also reduces client-side computation costs through epoch decay.
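
The abstract describes the two mechanisms only at a high level. Below is a minimal sketch of how they could fit together on top of a FedAvg loop; the toy `Client` class, the averaging period and window size, and the epoch-decay schedule are illustrative assumptions, not the paper's exact formulation or hyperparameters.

```python
# Sketch: FedAvg with periodic server averaging and epoch decay.
# Hyperparameters (window, avg_period, decay) are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

class Client:
    """Toy client: linear regression on a private local dataset."""
    def __init__(self, n=50, dim=5):
        self.X = rng.normal(size=(n, dim))
        self.y = self.X @ np.ones(dim) + 0.1 * rng.normal(size=n)

    def local_update(self, params, epochs, lr=0.05):
        w = params.copy()
        for _ in range(epochs):  # local full-batch gradient steps
            grad = self.X.T @ (self.X @ w - self.y) / len(self.y)
            w -= lr * grad
        return w, len(self.y)

def fedavg(params_list, sizes):
    """Standard FedAvg step: size-weighted average of client parameters."""
    weights = np.asarray(sizes, float) / sum(sizes)
    return sum(w * p for w, p in zip(weights, params_list))

def train(clients, rounds=100, dim=5, avg_period=10, window=5,
          epochs0=5, decay=0.97):
    global_w = np.zeros(dim)
    history = [global_w]
    for t in range(rounds):
        # Epoch decay: clients run fewer local epochs as training progresses,
        # reducing client-side computation (assumed geometric schedule).
        epochs = max(1, round(epochs0 * decay ** t))
        updates = [c.local_update(global_w, epochs) for c in clients]
        global_w = fedavg(*zip(*updates))
        history.append(global_w)
        # Server averaging: periodically replace the global model with the
        # average of the most recent global models.
        if (t + 1) % avg_period == 0:
            global_w = sum(history[-window:]) / window
            history.append(global_w)
    return global_w

clients = [Client() for _ in range(8)]
print(train(clients))  # should approach the all-ones weight vector
```

The point of the sketch is that server averaging changes only the server's aggregation step, so it composes with FedAvg without altering the client protocol.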

Related articles:
arXiv:2208.01901 [cs.LG] (Published 2022-08-03)
Asynchronous Federated Learning for Edge-assisted Vehicular Networks
arXiv:2305.07845 [cs.LG] (Published 2023-05-13)
Understanding Model Averaging in Federated Learning on Heterogeneous Data
arXiv:2012.10936 [cs.LG] (Published 2020-12-20)
Toward Understanding the Influence of Individual Clients in Federated Learning