arXiv Analytics


arXiv:1912.02160 [cs.LG]

Informative GANs via Structured Regularization of Optimal Transport

Pierre Bréchet, Tao Wu, Thomas Möllenhoff, Daniel Cremers

Published 2019-12-04, Version 1

We tackle the challenge of disentangled representation learning in generative adversarial networks (GANs) from the perspective of regularized optimal transport (OT). Specifically, a smoothed OT loss gives rise to an implicit transportation plan between the latent space and the data space. Based on this theoretical observation, we exploit a structured regularization on the transportation plan to encourage a prescribed latent subspace to be informative. This yields the formulation of a novel informative OT-based GAN. By convex duality, we obtain the equivalent view that this leads to perturbed ground costs favoring sparsity in the informative latent dimensions. Practically, we devise a stable training algorithm for the proposed informative GAN. Our experiments support the hypothesis that such regularizations effectively yield the discovery of disentangled and interpretable latent representations. Our work showcases the potential power of a regularized OT framework in the context of generative modeling through its access to the transport plan, and we discuss further challenges along this line.
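To make the "implicit transportation plan" concrete, below is a minimal sketch of entropy-regularized OT via Sinkhorn iterations, the standard way a smoothed OT loss exposes an explicit plan between latent samples and data samples. This is an illustration only, not the paper's algorithm: the ground cost (squared Euclidean), uniform marginals, and the `sinkhorn_plan` helper are assumptions for the sketch.

```python
import numpy as np

def sinkhorn_plan(z, x, eps=1.0, n_iters=200):
    """Entropy-regularized OT plan between latent samples z and data samples x.

    Assumes a squared Euclidean ground cost and uniform marginals; eps is the
    entropic smoothing strength. Returns the (n, m) coupling matrix P, whose
    rows/columns sum to the marginals -- this P is the "transportation plan"
    that structured regularization would act on.
    """
    n, m = len(z), len(x)
    # Ground cost C[i, j] = ||z_i - x_j||^2
    C = ((z[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / eps)                              # Gibbs kernel
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)   # uniform marginals
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):                          # alternating projections
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]                # P = diag(u) K diag(v)

rng = np.random.default_rng(0)
P = sinkhorn_plan(rng.normal(size=(8, 2)), rng.normal(size=(10, 2)))
```

The paper's structured regularizer would add a penalty on `P` (coupling plan entries along the prescribed latent subspace) to the smoothed OT objective, which by convex duality is equivalent to perturbing the ground cost `C`.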

Comments: Presented at the Optimal Transport and Machine Learning Workshop, NeurIPS 2019
Categories: cs.LG, stat.ML
Related articles:
arXiv:1911.02053 [cs.LG] (Published 2019-11-05)
Alleviating Label Switching with Optimal Transport
arXiv:1911.02536 [cs.LG] (Published 2019-11-06)
Unsupervised Hierarchy Matching with Optimal Transport over Hyperbolic Spaces
arXiv:1906.09218 [cs.LG] (Published 2019-06-21)
FlipTest: Fairness Auditing via Optimal Transport