arXiv:1505.05424 [stat.ML]

Weight Uncertainty in Neural Networks

Charles Blundell, Julien Cornebise, Koray Kavukcuoglu, Daan Wierstra

Published 2015-05-20 (Version 1)

We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop. It regularises the weights by minimising a compression cost, known as the variational free energy or the expected lower bound on the marginal likelihood. We show that this principled kind of regularisation yields comparable performance to dropout on MNIST classification. We then demonstrate how the learnt uncertainty in the weights can be used to improve generalisation in non-linear regression problems, and how this weight uncertainty can be used to drive the exploration-exploitation trade-off in reinforcement learning.
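The free-energy objective mentioned in the abstract is F(D, θ) = KL[q(w|θ) ‖ P(w)] − E_{q(w|θ)}[log P(D|w)], minimised over the variational parameters θ of the weight distribution. Below is a minimal PyTorch sketch of that idea, assuming a diagonal Gaussian posterior and a single Gaussian prior (the paper also evaluates a scale-mixture prior); the class name BayesLinear, the initialisation constants, and the uniform 1/M KL weighting per minibatch are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions import Normal, kl_divergence

class BayesLinear(nn.Module):
    """Linear layer with a factorised Gaussian posterior over its weights."""
    def __init__(self, n_in, n_out, prior_sigma=1.0):
        super().__init__()
        # Variational parameters theta = (mu, rho); sigma = softplus(rho) > 0.
        self.w_mu = nn.Parameter(0.1 * torch.randn(n_out, n_in))
        self.w_rho = nn.Parameter(torch.full((n_out, n_in), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(n_out))
        self.b_rho = nn.Parameter(torch.full((n_out,), -3.0))
        self.prior = Normal(0.0, prior_sigma)  # single Gaussian prior (assumption)

    def forward(self, x):
        # Reparameterisation: w = mu + sigma * eps with eps ~ N(0, 1), so the
        # weight sample stays differentiable and plain backprop trains (mu, rho).
        w_sigma = F.softplus(self.w_rho)
        b_sigma = F.softplus(self.b_rho)
        w = self.w_mu + w_sigma * torch.randn_like(w_sigma)
        b = self.b_mu + b_sigma * torch.randn_like(b_sigma)
        # Compression cost KL[q(w|theta) || P(w)], recomputed on each forward pass.
        self.kl = (kl_divergence(Normal(self.w_mu, w_sigma), self.prior).sum()
                   + kl_divergence(Normal(self.b_mu, b_sigma), self.prior).sum())
        return x @ w.t() + b

# One training step: the loss is the minibatch free energy, i.e. the scaled
# KL compression cost plus the negative log-likelihood of the batch.
layer = BayesLinear(20, 2)
opt = torch.optim.Adam(layer.parameters(), lr=1e-3)
x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))
opt.zero_grad()
loss = F.cross_entropy(layer(x), y) + layer.kl / 100  # 100 = assumed number of minibatches
loss.backward()
opt.step()
```

At test time, one would average the network's output over several sampled weight settings; that posterior spread over weights is the uncertainty the abstract exploits for generalisation in regression and for exploration in reinforcement learning.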

Comments: In Proceedings of the 32nd International Conference on Machine Learning (ICML 2015)
Categories: stat.ML, cs.LG