arXiv:2107.02732 [cs.LG]
Provable Lipschitz Certification for Generative Models
Matt Jordan, Alexandros G. Dimakis
Published 2021-07-06 (Version 1)
We present a scalable technique for upper bounding the Lipschitz constant of generative models. We relate this quantity to the maximal norm over the set of attainable vector-Jacobian products of a given generative model, and we approximate this set with layerwise convex approximations using zonotopes. Our approach generalizes and improves upon prior work using zonotope transformers, and we extend it to Lipschitz estimation of neural networks with large output dimension. The method provides efficient and tight bounds on small networks and scales to generative models based on VAE and DCGAN architectures.
Comments: Accepted at ICML 2021
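To make the zonotope machinery the abstract alludes to concrete, here is a minimal sketch: a zonotope is a set {c + G eps : ||eps||_inf <= 1}, an affine layer transforms it exactly, and the norm over the set can be upper bounded by a triangle inequality over its generators. The Zonotope class and the single-layer example below are illustrative assumptions, not the paper's implementation; the full method additionally handles nonlinear layers and composes these steps across an entire network.

```python
import numpy as np

class Zonotope:
    """Zonotope Z = {c + G @ eps : ||eps||_inf <= 1}, with center c
    and one generator per column of G."""

    def __init__(self, center, generators):
        self.c = np.asarray(center, dtype=float)      # shape (d,)
        self.G = np.asarray(generators, dtype=float)  # shape (d, m)

    def affine(self, W, b):
        # An affine map x -> W x + b transforms a zonotope exactly:
        # the center passes through the map, each generator column
        # is multiplied by W.
        return Zonotope(W @ self.c + b, W @ self.G)

    def l2_norm_upper_bound(self):
        # max_{||eps||_inf <= 1} ||c + G eps||_2 is upper bounded by
        # ||c||_2 + sum_i ||g_i||_2 (triangle inequality over columns).
        return np.linalg.norm(self.c) + np.linalg.norm(self.G, axis=0).sum()


# Hypothetical usage: over-approximate the set {W^T v : ||v||_inf <= 1}
# of vector-Jacobian products of a single linear layer, then bound its
# l2 norm over that set.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))               # layer mapping R^8 -> R^4
v_ball = Zonotope(np.zeros(4), np.eye(4))     # unit l_inf ball of vectors v
vjp_set = v_ball.affine(W.T, np.zeros(8))     # {W^T v : ||v||_inf <= 1}
print(vjp_set.l2_norm_upper_bound())
```

Representing each layerwise set as a (center, generators) pair keeps affine propagation exact; only nonlinear layers require convex over-approximation, which is where the paper's improvements over prior zonotope transformers apply.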
Related articles:
arXiv:2003.11399 [cs.LG] (Published 2020-03-25)
Discriminative Viewer Identification using Generative Models of Eye Gaze
arXiv:1907.05600 [cs.LG] (Published 2019-07-12)
Generative Modeling by Estimating Gradients of the Data Distribution
arXiv:1905.09894 [cs.LG] (Published 2019-05-23)
PHom-GeM: Persistent Homology for Generative Models