arXiv Analytics

arXiv:1802.04826 [stat.ML]

Leveraging the Exact Likelihood of Deep Latent Variable Models

Pierre-Alexandre Mattei, Jes Frellsen

Published 2018-02-13 (Version 1)

Deep latent variable models combine the approximation abilities of deep neural networks and the statistical foundations of generative models. The induced data distribution is an infinite mixture model whose density is extremely difficult to compute. Variational methods are consequently used for inference, following the seminal work of Rezende et al. (2014) and Kingma and Welling (2014). We study the well-posedness of the exact problem (maximum likelihood estimation) that these techniques approximately solve. In particular, we show that most unconstrained models used for continuous data have an unbounded likelihood. This ill-posedness and the problems it causes are illustrated on real data. We also show how to ensure the existence of maximum likelihood estimates, and draw useful connections with nonparametric mixture models. Furthermore, we describe an algorithm that performs missing data imputation using the exact conditional likelihood of a deep latent variable model. On several real data sets, our algorithm consistently and significantly outperforms the usual imputation scheme used within deep latent variable models.
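
To make the unboundedness claim concrete, the following is a minimal sketch with assumed notation (the decoder networks \mu_\theta and \sigma_\theta are illustrative, not the abstract's own symbols). A deep latent variable model with a Gaussian observation model induces the density

\[
p_\theta(x) = \int p_\theta(x \mid z)\, p(z)\, dz,
\qquad
p_\theta(x \mid z) = \mathcal{N}\!\big(x \mid \mu_\theta(z),\, \sigma_\theta^2(z) I\big).
\]

If, for some observed point \(x_i\) and some latent value \(z^\ast\), an unconstrained decoder can set \(\mu_\theta(z^\ast) = x_i\) while driving \(\sigma_\theta(z^\ast) \to 0\), then \(p_\theta(x_i) \to \infty\) and the log-likelihood \(\sum_i \log p_\theta(x_i)\) is unbounded above, in direct analogy with the classical unboundedness of Gaussian mixture likelihoods.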

Related articles:
arXiv:2407.13310 [stat.ML] (Published 2024-07-18)
A deep latent variable model for semi-supervised multi-unit soft sensing in industrial processes
arXiv:2005.05210 [stat.ML] (Published 2020-05-11)
Deep Latent Variable Model for Longitudinal Group Factor Analysis
arXiv:2306.10943 [stat.ML] (Published 2023-06-19)
Probabilistic matching of real and generated data statistics in generative adversarial networks