arXiv:1811.03604 [cs.CL]

Federated Learning for Mobile Keyboard Prediction

Andrew Hard, Kanishka Rao, Rajiv Mathews, Françoise Beaufays, Sean Augenstein, Hubert Eichner, Chloé Kiddon, Daniel Ramage

Published 2018-11-08, Version 1

We train a recurrent neural network language model using a distributed, on-device learning framework called federated learning for the purpose of next-word prediction in a virtual keyboard for smartphones. Server-based training using stochastic gradient descent is compared with training on client devices using the Federated Averaging algorithm. The federated algorithm, which enables training on a higher-quality dataset for this use case, is shown to achieve better prediction recall. This work demonstrates the feasibility and benefit of training language models on client devices without exporting sensitive user data to servers. The federated learning environment gives users greater control over their data and simplifies the task of incorporating privacy by default with distributed training and aggregation across a population of client devices.
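The Federated Averaging algorithm the abstract compares against server-based SGD can be illustrated with a minimal sketch: each client runs a few epochs of local SGD on its own data, and the server averages the returned weights in proportion to client dataset size. This toy uses a linear model rather than the paper's recurrent network, and all function names and data here are illustrative, not from the paper:

```python
import numpy as np

def local_sgd(weights, X, y, lr=0.1, epochs=5):
    """Run a few epochs of full-batch SGD on one client's data
    (toy linear model with squared loss)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_averaging(global_w, client_datasets, rounds=10):
    """Each round: broadcast global weights, let every client train
    locally, then average the returned weights weighted by client
    dataset size (the FedAvg aggregation step)."""
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in client_datasets:
            updates.append(local_sgd(global_w, X, y))
            sizes.append(len(y))
        total = sum(sizes)
        global_w = sum(n / total * w for n, w in zip(sizes, updates))
    return global_w

# Hypothetical demo: two clients whose data share the true weights [2, -1].
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(2):
    X = rng.standard_normal((50, 2))
    clients.append((X, X @ true_w))

w = federated_averaging(np.zeros(2), clients)
```

Note that only model weights cross the network in this loop; the raw `(X, y)` pairs never leave the clients, which is the privacy property the paper relies on. The real system adds on-device caching, client sampling, and secure aggregation on top of this basic step.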

Related articles:
arXiv:1910.03432 [cs.CL] (Published 2019-10-08)
Federated Learning of N-gram Language Models
arXiv:1903.10635 [cs.CL] (Published 2019-03-26)
Federated Learning Of Out-Of-Vocabulary Words
arXiv:2007.11794 [cs.CL] (Published 2020-07-23)
Applying GPGPU to Recurrent Neural Network Language Model based Fast Network Search in the Real-Time LVCSR