arXiv Analytics

arXiv:2303.14430 [cs.LG]

Beta-VAE has 2 Behaviors: PCA or ICA?

Zhouzheng Li, Hao Liu

Published 2023-03-25 (Version 1)

Beta-VAE is a classical model for disentangled representation learning; its expanding information bottleneck, which gradually admits information into the decoder, is key to both representation disentanglement and high-quality reconstruction. In recent experiments on this structure, we found that the number of latent variables affects the representation the network learns: with very few latent variables, the network tends to capture the most important, or principal, factors of variation, acting like PCA; with very many latent variables, the learned variables tend to be more disentangled, acting like ICA. We hypothesize that competition among latent variables for information bandwidth leads to this phenomenon.
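The trade-off the abstract describes is driven by the beta-weighted KL term in the beta-VAE objective, which charges each latent dimension for the information it carries. A minimal NumPy sketch of that objective is below; the function name, the squared-error reconstruction term, and the choice beta = 4 are illustrative assumptions, not details from the paper.

```python
import numpy as np

def beta_vae_loss(x, x_recon, mu, log_var, beta=4.0):
    """Per-example beta-VAE objective: reconstruction error + beta * KL.

    The KL term is between the diagonal Gaussian posterior
    q(z|x) = N(mu, diag(exp(log_var))) and the standard normal prior
    N(0, I). Because each active latent dimension adds to this penalty,
    latents compete for "information bandwidth" -- the mechanism the
    abstract proposes for the PCA-like vs ICA-like regimes.
    """
    # Squared-error reconstruction (a Gaussian-decoder assumption).
    recon = np.sum((x - x_recon) ** 2, axis=-1)
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian.
    kl = 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var, axis=-1)
    return recon + beta * kl
```

When mu = 0 and log_var = 0 the posterior matches the prior, so the KL term vanishes and the loss reduces to the reconstruction error alone; any latent dimension that deviates from the prior pays a cost scaled by beta.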

Related articles:
arXiv:2405.16225 [cs.LG] (Published 2024-05-25, updated 2024-06-06)
Local Causal Structure Learning in the Presence of Latent Variables
arXiv:1902.01388 [cs.LG] (Published 2019-02-04)
Re-examination of the Role of Latent Variables in Sequence Modeling
arXiv:2102.03129 [cs.LG] (Published 2021-02-05)
Integer Programming for Causal Structure Learning in the Presence of Latent Variables