arXiv Analytics

arXiv:1911.04559 [cs.LG]

Privacy is What We Care About: Experimental Investigation of Federated Learning on Edge Devices

Anirban Das, Thomas Brunschwiler

Published 2019-11-11 (Version 1)

Federated Learning enables training of a general model through edge devices without sending raw data to the cloud. Hence, this approach is attractive for digital health applications, where data is sourced through edge devices and users care about privacy. Here, we report on the feasibility of training deep neural networks on Raspberry Pi 4 boards as edge devices. A CNN, an LSTM, and an MLP were successfully trained on the MNIST dataset. Further, federated learning is demonstrated experimentally on IID and non-IID samples in a parametric study to benchmark model convergence. The weight updates from the workers are shared with the cloud to train the general model through federated learning. With the CNN and non-IID samples, a test accuracy of up to 85% could be achieved within a training time of 2 minutes, while exchanging less than 10 MB of data per device. In addition, we discuss federated learning from a use-case standpoint, elaborating on privacy risks and labeling requirements for the application of emotion detection from sound. Based on the experimental findings, we discuss possible research directions to improve model and system performance. Finally, we provide best practices for practitioners considering the implementation of federated learning.
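The aggregation step the abstract describes — workers send weight updates to the cloud, which combines them into a general model — is commonly realized as federated averaging (FedAvg): a weighted mean of client parameters, weighted by local dataset size. A minimal sketch (the function name and the toy weight vectors are illustrative, not from the paper):

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Combine per-worker parameter vectors into a global model,
    weighting each worker by its local dataset size (FedAvg-style)."""
    total = sum(client_sizes)
    # Weighted sum of each client's parameters
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Example: three edge workers, each contributing a tiny weight vector
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 10, 20]   # local sample counts (non-IID: unequal sizes)
global_weights = federated_average(clients, sizes)
```

In a real round, each worker would first take a few local SGD steps on its private data before uploading its weights; only the parameters (here, a few floats per worker) cross the network, which is what keeps the per-device data exchange small.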

Comments: Accepted in ACM AIChallengeIoT 2019, New York, USA
Categories: cs.LG, cs.DC, stat.ML
Related articles:
arXiv:1908.07873 [cs.LG] (Published 2019-08-21)
Federated Learning: Challenges, Methods, and Future Directions
arXiv:2001.01523 [cs.LG] (Published 2020-01-06)
Think Locally, Act Globally: Federated Learning with Local and Global Representations
arXiv:1902.01046 [cs.LG] (Published 2019-02-04)
Towards Federated Learning at Scale: System Design