arXiv Analytics

arXiv:2008.04489 [cs.LG]

Federated Learning via Synthetic Data

Jack Goetz, Ambuj Tewari

Published 2020-08-11 (Version 1)

Federated learning allows a model to be trained on data held by multiple clients without the clients transmitting that raw data. However, the standard method is to transmit model parameters (or updates), which for modern neural networks can number in the millions, inflicting significant computational costs on the clients. We propose a method for federated learning where, instead of transmitting a gradient update back to the server, each client transmits a small amount of synthetic 'data'. We describe the procedure and present experimental results suggesting it has potential, providing more than an order of magnitude reduction in communication costs with minimal model degradation.
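As a rough illustration of the idea described in the abstract, the sketch below distills a client's local data into a few synthetic points by gradient matching: the synthetic points are optimized so that the model's gradient on them approximates the gradient on the client's real data, and only those few points are sent to the server. This is a hypothetical minimal example (linear regression, NumPy, a gradient-matching objective), not the authors' actual procedure; all function names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def real_gradient(w, X, y):
    """Mean-squared-error gradient of the shared model w on the client's real data."""
    return X.T @ (X @ w - y) / len(y)

def distill(w, X, y, n_syn=2, steps=2000, lr=0.1):
    """Learn n_syn synthetic points (Xs, ys) whose gradient matches the real one.

    Minimizes 0.5 * ||g_syn(Xs, ys) - g_real||^2 by gradient descent on the
    synthetic points themselves (a toy stand-in for dataset distillation).
    """
    g_real = real_gradient(w, X, y)
    Xs = rng.normal(size=(n_syn, X.shape[1]))   # synthetic inputs (randomly initialized)
    ys = rng.normal(size=n_syn)                 # synthetic targets
    for _ in range(steps):
        resid = Xs @ w - ys
        g_syn = Xs.T @ resid / n_syn
        diff = g_syn - g_real
        # Gradients of the matching loss w.r.t. the synthetic points:
        grad_Xs = (np.outer(resid, diff) + np.outer(Xs @ diff, w)) / n_syn
        grad_ys = -(Xs @ diff) / n_syn
        Xs -= lr * grad_Xs
        ys -= lr * grad_ys
    return Xs, ys
```

The client then uploads only `Xs, ys` (here 2 points instead of 50), and the server recovers an approximate update by running an ordinary gradient step on the synthetic data, which is the source of the communication savings the abstract claims.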

Related articles:
arXiv:1806.00582 [cs.LG] (Published 2018-06-02)
Federated Learning with Non-IID Data
arXiv:2006.11901 [cs.LG] (Published 2020-06-21)
Free-rider Attacks on Model Aggregation in Federated Learning
arXiv:2002.10619 [cs.LG] (Published 2020-02-25)
Three Approaches for Personalization with Applications to Federated Learning