arXiv Analytics

arXiv:2001.04974 [cs.LG]

Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation

Chuteng Zhou, Prad Kadambi, Matthew Mattina, Paul N. Whatmough

Published 2020-01-14 (Version 1)

The success of deep learning has brought forth a wave of interest in computer hardware design aimed at the high computational demands of neural network inference. In particular, analog computing hardware, built from electronic, optical, or photonic devices, has been proposed for accelerating neural networks because it may achieve lower power consumption than conventional digital electronics. However, these proposed analog accelerators suffer from intrinsic noise generated by their physical components, which makes it challenging to achieve high accuracy with deep neural networks. Hence, for successful deployment on analog accelerators, it is essential to train deep neural networks that are robust to random continuous noise in the network weights, a relatively new challenge in machine learning. In this paper, we advance the understanding of noisy neural networks. We outline how a noisy neural network has reduced learning capacity as a result of the loss of mutual information between its input and output. To combat this, we propose combining knowledge distillation with noise injection during training to obtain more noise-robust networks, and we demonstrate the approach experimentally across different networks and datasets, including ImageNet. Our method yields models with up to two times greater noise tolerance than the previous best attempts, a significant step toward making analog hardware practical for deep learning.
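
The abstract describes combining knowledge distillation with noise injection during training. The following is a minimal sketch of that general recipe, assuming a PyTorch setup; the noise model (zero-mean Gaussian noise scaled by each layer's maximum weight magnitude), the helper names (inject_weight_noise, remove_weight_noise, train_step), and the hyperparameters (relative_std, T, alpha) are illustrative assumptions and not taken from the paper.

import torch
import torch.nn.functional as F

def inject_weight_noise(model, relative_std=0.05):
    # Assumed noise model: zero-mean Gaussian noise on each weight tensor,
    # scaled by the tensor's maximum absolute value (mimics analog weight noise).
    noises = []
    with torch.no_grad():
        for p in model.parameters():
            noise = torch.randn_like(p) * relative_std * p.abs().max()
            p.add_(noise)
            noises.append(noise)
    return noises

def remove_weight_noise(model, noises):
    # Undo the perturbation so the optimizer updates the clean weights.
    with torch.no_grad():
        for p, noise in zip(model.parameters(), noises):
            p.sub_(noise)

def train_step(student, teacher, x, y, optimizer, T=4.0, alpha=0.9):
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(x)           # clean teacher provides soft targets
    noises = inject_weight_noise(student)     # simulate analog weight noise
    student_logits = student(x)               # forward pass with noisy weights
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * T * T
    ce = F.cross_entropy(student_logits, y)
    loss = alpha * kd + (1 - alpha) * ce      # distillation loss plus hard-label loss
    optimizer.zero_grad()
    loss.backward()                           # gradients taken at the noisy weights
    remove_weight_noise(student, noises)      # restore clean weights before the update
    optimizer.step()
    return loss.item()

In this sketch the teacher runs without noise while the student sees freshly sampled weight noise on every step, so the distillation loss pushes the student toward outputs that remain close to the clean teacher under perturbation; the exact noise model and loss weighting used in the paper may differ.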

Related articles:
arXiv:1903.03694 [cs.LG] (Published 2019-03-08)
Everything old is new again: A multi-view learning approach to learning using privileged information and distillation
arXiv:1811.03233 [cs.LG] (Published 2018-11-08)
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons