arXiv:1912.00650 [cs.LG]

Stochastic Variational Inference via Upper Bound

Chunlin Ji, Haige Shen

Published 2019-12-02 (Version 1)

Stochastic variational inference (SVI) plays a key role in Bayesian deep learning. Recently, various divergences have been proposed to design the surrogate loss for variational inference. We present a simple upper bound on the evidence as the surrogate loss. This evidence upper bound (EUBO) equals the log marginal likelihood plus the KL divergence between the posterior and the proposal. We show that the proposed EUBO is tighter than previous upper bounds derived from the $\chi$-divergence or the $\alpha$-divergence. To facilitate scalable inference, we present a numerical approximation of the gradient of the EUBO and apply SGD to optimize the variational parameters iteratively. A simulation study with Bayesian logistic regression shows that the upper and lower bounds sandwich the evidence well and that the proposed upper bound is favorably tight. For Bayesian neural networks, the proposed EUBO-VI algorithm outperforms state-of-the-art results on various examples.
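To make the stated bound concrete, here is a minimal sketch of the EUBO under standard notation, assuming data $x$, latent variables $z$, and a variational proposal $q_\lambda(z)$ (the paper's exact parameterization may differ). Since $\mathrm{KL}(p(z \mid x) \,\|\, q_\lambda(z)) = \mathbb{E}_{p(z \mid x)}[\log p(z \mid x) - \log q_\lambda(z)]$ and the KL divergence is nonnegative,

$$\mathrm{EUBO}(\lambda) = \log p(x) + \mathrm{KL}\big(p(z \mid x) \,\|\, q_\lambda(z)\big) = \mathbb{E}_{p(z \mid x)}\big[\log p(x, z) - \log q_\lambda(z)\big] \;\ge\; \log p(x).$$

The expectation is under the intractable posterior, so one plausible gradient approximation (an assumption for illustration, not necessarily the paper's estimator) is self-normalized importance sampling with $q_\lambda$ itself as the proposal, plugged into plain SGD. Below is a hedged Python/NumPy sketch with a hypothetical toy target log_joint and a diagonal Gaussian proposal; the names eubo_grad and log_joint are illustrative, not from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def log_joint(z):
        # Hypothetical unnormalized target log p(x, z) for a toy 2-D model;
        # stands in for the model's joint density.
        return -0.5 * np.sum((z - 1.0) ** 2, axis=-1)

    def eubo_grad(mu, log_sigma, n_samples=256):
        """Self-normalized importance-sampling estimate of the EUBO gradient
        for a diagonal Gaussian proposal q = N(mu, diag(exp(log_sigma))^2).
        Since log p(x, z) is free of lambda, grad EUBO reduces to
        -E_{p(z|x)}[grad_lambda log q_lambda(z)], estimated with samples
        from q reweighted by w proportional to p(x, z) / q(z)."""
        sigma = np.exp(log_sigma)
        eps = rng.standard_normal((n_samples, mu.size))
        z = mu + sigma * eps
        # Log density of the diagonal Gaussian proposal at its own samples.
        log_q = -0.5 * np.sum(eps ** 2 + 2 * log_sigma + np.log(2 * np.pi), axis=-1)
        log_w = log_joint(z) - log_q
        w = np.exp(log_w - log_w.max())
        w /= w.sum()                        # self-normalized weights
        # Score of q w.r.t. mu and log_sigma, with the samples held fixed.
        score_mu = (z - mu) / sigma ** 2
        score_ls = eps ** 2 - 1.0
        g_mu = -(w[:, None] * score_mu).sum(axis=0)
        g_ls = -(w[:, None] * score_ls).sum(axis=0)
        return g_mu, g_ls

    # Plain SGD on the variational parameters; the EUBO is an upper bound,
    # so it is minimized (in contrast to maximizing the ELBO).
    mu, log_sigma, lr = np.zeros(2), np.zeros(2), 0.05
    for step in range(2000):
        g_mu, g_ls = eubo_grad(mu, log_sigma)
        mu -= lr * g_mu
        log_sigma -= lr * g_ls
    print(mu, np.exp(log_sigma))            # should approach the target's moments

The sketch treats the importance weights as fixed per step when differentiating, a common simplification for importance-weighted objectives; the paper's numerical gradient approximation may handle this dependence differently.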

Related articles:
arXiv:1707.04025 [cs.LG] (Published 2017-07-13)
On Measuring and Quantifying Performance: Error Rates, Surrogate Loss, and an Example in SSL
arXiv:1804.05981 [cs.LG] (Published 2018-04-16)
A Univariate Bound of Area Under ROC
arXiv:2006.06359 [cs.LG] (Published 2020-06-11)
Improved Algorithms for Convex-Concave Minimax Optimization