arXiv Analytics

arXiv:1811.03604 [cs.CL]

Federated Learning for Mobile Keyboard Prediction

Andrew Hard, Kanishka Rao, Rajiv Mathews, Françoise Beaufays, Sean Augenstein, Hubert Eichner, Chloé Kiddon, Daniel Ramage

Published 2018-11-08 (Version 1)

We train a recurrent neural network language model using a distributed, on-device learning framework called federated learning for the purpose of next-word prediction in a virtual keyboard for smartphones. Server-based training using stochastic gradient descent is compared with training on client devices using the Federated Averaging algorithm. The federated algorithm, which enables training on a higher-quality dataset for this use case, is shown to achieve better prediction recall. This work demonstrates the feasibility and benefit of training language models on client devices without exporting sensitive user data to servers. The federated learning environment gives users greater control over their data and simplifies the task of incorporating privacy by default with distributed training and aggregation across a population of client devices.
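
For reference, the Federated Averaging algorithm mentioned in the abstract (McMahan et al., 2017) follows a simple server loop: broadcast the current model to a set of client devices, have each client run local SGD on its own data, and replace the global model with the average of the client models weighted by each client's example count. The sketch below illustrates this structure on a toy linear model; the `local_sgd` helper, the simulated client datasets, and all hyperparameters are illustrative assumptions, not the paper's production keyboard setup.

```python
import numpy as np

def local_sgd(weights, X, y, lr=0.1, epochs=1, batch_size=8):
    """Run local SGD on one client's data (toy linear model, squared loss)."""
    w = weights.copy()
    n = len(X)
    for _ in range(epochs):
        idx = np.random.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            grad = 2.0 * Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad
    return w

def federated_averaging(clients, rounds=20, dim=5):
    """Server loop: broadcast the global weights, collect locally trained
    models, and average them weighted by each client's example count."""
    global_w = np.zeros(dim)
    for _ in range(rounds):
        updates, counts = [], []
        for X, y in clients:  # in practice, a sampled subset of devices
            updates.append(local_sgd(global_w, X, y))
            counts.append(len(X))
        # Clients with more data contribute proportionally more to the average.
        global_w = np.average(np.stack(updates), axis=0,
                              weights=np.asarray(counts, dtype=float))
    return global_w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = rng.normal(size=5)
    # Simulate a handful of client devices, each with its own small dataset.
    clients = []
    for _ in range(4):
        X = rng.normal(size=(32, 5))
        clients.append((X, X @ true_w + 0.01 * rng.normal(size=32)))
    w = federated_averaging(clients)
    print(f"distance to true weights: {np.linalg.norm(w - true_w):.4f}")
```

The key property, and the one the abstract highlights, is that only model parameters leave the device: raw training examples stay local, and the server sees nothing but aggregated updates.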

Related articles:
arXiv:1506.01192 [cs.CL] (Published 2015-06-03)
Personalizing a Universal Recurrent Neural Network Language Model with User Characteristic Features by Crowdsourcing over Social Networks
arXiv:1611.00196 [cs.CL] (Published 2016-11-01)
Recurrent Neural Network Language Model Adaptation Derived Document Vector
arXiv:1801.09866 [cs.CL] (Published 2018-01-30)
Accelerating recurrent neural network language model based online speech recognition system