arXiv Analytics

arXiv:2106.05319 [cs.LG]

Stein Latent Optimization for GANs

Uiwon Hwang, Heeseung Kim, Dahuin Jung, Hyemi Jang, Hyungyu Lee, Sungroh Yoon

Published 2021-06-09 (Version 1)

Generative adversarial networks (GANs) with clustered latent spaces can perform conditional generation in a completely unsupervised manner. However, the salient attributes of unlabeled real-world data are mostly imbalanced. Existing unsupervised conditional GANs cannot properly cluster these attributes in their latent spaces because they assume uniform attribute distributions. To address this problem, we theoretically derive Stein latent optimization, which provides reparameterizable gradient estimates of the latent distribution parameters under a Gaussian mixture prior in a continuous latent space. Structurally, we introduce an encoder network and a novel contrastive loss that help the data generated from a single mixture component represent a single attribute. We confirm that the proposed method, named Stein Latent Optimization for GANs (SLOGAN), successfully learns balanced or imbalanced attributes and performs unsupervised tasks such as unsupervised conditional generation, unconditional generation, and cluster assignment, even in the absence of attribute information (e.g., the imbalance ratio). Moreover, we demonstrate that the attributes to be learned can be manipulated using a small amount of probe data.
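As summarized above, latent codes are drawn from a Gaussian mixture prior in a continuous latent space, and the discrete component choice is exactly what makes gradient estimation of the mixture parameters nontrivial. The minimal PyTorch sketch below only illustrates such a prior (it is not the paper's Stein gradient estimator), and the names K, latent_dim, pi_logits, mu, and log_sigma are illustrative assumptions:

    import torch

    # Illustrative sketch (not the paper's code): sample latent vectors
    # from a K-component Gaussian mixture prior in a continuous latent space.
    K, latent_dim, batch = 10, 128, 64

    pi_logits = torch.zeros(K, requires_grad=True)              # mixing proportions (logits)
    mu        = torch.randn(K, latent_dim, requires_grad=True)  # component means
    log_sigma = torch.zeros(K, latent_dim, requires_grad=True)  # component log-std devs

    # Draw a component index per sample; this categorical draw is not
    # reparameterizable on its own.
    comp = torch.distributions.Categorical(logits=pi_logits).sample((batch,))

    # Reparameterized draw from the chosen Gaussian component:
    # z = mu_k + sigma_k * eps, so gradients flow to mu and log_sigma.
    eps = torch.randn(batch, latent_dim)
    z = mu[comp] + log_sigma.exp()[comp] * eps   # (batch, latent_dim) latent codes
    # z would then be fed to a generator G(z).

In this naive formulation, gradients reach mu and log_sigma through the reparameterized draw but do not propagate to pi_logits through the discrete sample; providing reparameterizable gradient estimates of all latent distribution parameters is the gap the derived Stein latent optimization is stated to address.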

Related articles:
arXiv:1901.08479 [cs.LG] (Published 2019-01-24)
On the Transformation of Latent Space in Autoencoders
arXiv:2008.01487 [cs.LG] (Published 2020-08-04)
Faithful Autoencoder Interpolation by Shaping the Latent Space
arXiv:1912.03845 [cs.LG] (Published 2019-12-09)
No Representation without Transformation