arXiv Analytics

arXiv:1907.10477 [stat.ML]

On the relationship between variational inference and adaptive importance sampling

Axel Finke, Alexandre H. Thiery

Published 2019-07-24, Version 1

The importance weighted autoencoder (IWAE) (Burda et al., 2016) and reweighted wake-sleep (RWS) algorithm (Bornschein and Bengio, 2015) are popular approaches which employ multiple samples to achieve bias reductions compared to standard variational methods. However, their relationship has hitherto been unclear. We introduce a simple, unified framework for multi-sample variational inference termed adaptive importance sampling for learning (AISLE) and show that it admits IWAE and RWS as special cases. Through a principled application of a variance-reduction technique from Tucker et al. (2019), we also show that the sticking-the-landing (STL) gradient from Roeder et al. (2017), which previously lacked theoretical justification, can be recovered as a special case of RWS (and hence of AISLE). In particular, this indicates that the breakdown of RWS -- but not of STL -- observed in Tucker et al. (2019) may not be attributable to the lack of a joint objective for the generative-model and inference-network parameters as previously conjectured. Finally, we argue that our adaptive-importance-sampling interpretation of variational inference leads to more natural and principled extensions to sequential Monte Carlo methods than the IWAE-type multi-sample objective interpretation.
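To make the multi-sample idea concrete, here is a minimal sketch (not from the paper) of the K-sample importance weighted bound of Burda et al. (2016) on a toy one-dimensional Gaussian model. The model, the variational family, and all parameter values below are illustrative assumptions chosen so that the proposal q is deliberately mismatched with the true posterior; with K = 1 the estimator reduces to the standard ELBO, and the bound tightens as K grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative assumption, not the paper's setup):
#   prior      p(z)   = N(0, 1)
#   likelihood p(x|z) = N(z, 1)
# The exact posterior is p(z|x) = N(x/2, 1/2).
def log_p_joint(x, z):
    return -0.5 * (z**2 + (x - z)**2) - np.log(2.0 * np.pi)

# Variational proposal q(z|x) = N(0.4 * x, 1), deliberately
# mismatched with the posterior so the bound gap is visible.
def log_q(x, z):
    mu, sigma = 0.4 * x, 1.0
    return -0.5 * ((z - mu) / sigma) ** 2 - 0.5 * np.log(2.0 * np.pi * sigma**2)

def iwae_bound(x, K):
    """Single-run K-sample IWAE bound: log (1/K) sum_k p(x, z_k) / q(z_k | x)."""
    z = 0.4 * x + rng.standard_normal(K)          # z_k ~ q(z|x)
    log_w = log_p_joint(x, z) - log_q(x, z)       # log importance weights
    m = log_w.max()                               # log-mean-exp, computed stably
    return m + np.log(np.mean(np.exp(log_w - m)))
```

Averaging many runs, the K = 1 estimate recovers the ordinary ELBO, while larger K yields a provably tighter (larger) lower bound on log p(x); this is the bias reduction the abstract refers to, and the adaptive-importance-sampling view of AISLE reinterprets the same weights w_k as an importance-sampling approximation of the posterior rather than as terms of a joint objective.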

Related articles:
arXiv:2305.17225 [stat.ML] (Published 2023-05-26)
Causal Component Analysis
arXiv:1611.01353 [stat.ML] (Published 2016-11-04)
Information Dropout: learning optimal representations through noise
arXiv:1508.06091 [stat.ML] (Published 2015-08-25)
AUC Optimisation and Collaborative Filtering