arXiv Analytics

arXiv:1901.04866 [cs.LG]

Practical Lossless Compression with Latent Variables using Bits Back Coding

James Townsend, Tom Bird, David Barber

Published 2019-01-15 (Version 1)

Deep latent variable models have seen recent success in many data domains. Lossless compression is an application of these models which, despite its potential to be highly useful, has yet to be implemented in a practical manner. We present "Bits Back with ANS" (BB-ANS), a scheme for performing lossless compression with latent variable models at a near-optimal rate. We demonstrate the scheme by using it to compress the MNIST dataset with a variational auto-encoder (VAE), achieving compression rates superior to standard methods with only a simple VAE. Given that the scheme is highly amenable to parallelization, we conclude that, with a sufficiently high-quality generative model, it could be used to achieve substantial improvements in compression rate with acceptable running time. We make our implementation available open source at https://github.com/bits-back/bits-back.
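The entropy coder underlying BB-ANS is asymmetric numeral systems (ANS), which maintains a single integer state that symbols are pushed onto and popped off in last-in, first-out order. The following is a minimal illustrative sketch of a range-variant ANS (rANS) coder, not the authors' implementation (that lives at the repository linked above); the toy alphabet and frequencies are assumptions for demonstration, and Python's arbitrary-precision integers stand in for the streaming renormalization a practical coder would use.

```python
# Toy rANS coder: an illustrative sketch, not the BB-ANS codebase.
freqs = {"a": 3, "b": 1}      # assumed toy alphabet with integer frequencies
M = sum(freqs.values())       # total frequency mass (the coder's precision)

# Cumulative start of each symbol's sub-range within [0, M).
starts, c = {}, 0
for sym, f in freqs.items():
    starts[sym] = c
    c += f

def encode(state, sym):
    """Push one symbol onto the integer state (stack-like)."""
    f, start = freqs[sym], starts[sym]
    return (state // f) * M + start + (state % f)

def decode(state):
    """Pop the most recently encoded symbol off the state."""
    r = state % M
    sym = next(s for s in freqs if starts[s] <= r < starts[s] + freqs[s])
    f, start = freqs[sym], starts[sym]
    return (state // M) * f + (r - start), sym

message = list("abba")
state = M                     # any initial state >= M works for this toy
for sym in message:
    state = encode(state, sym)

decoded = []
for _ in message:             # ANS decodes in reverse: last in, first out
    state, sym = decode(state)
    decoded.append(sym)

assert decoded == message[::-1]
```

The bits-back trick described in the abstract exploits exactly this stack-like behaviour: at compression time, bits are first *decoded* from the ANS state to sample a latent from the VAE's posterior, then the data and latent are encoded; at decompression time those sampled bits are recovered, so the net rate approaches the model's negative ELBO.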

Related articles:
arXiv:2009.03034 [cs.LG] (Published 2020-09-07)
Ordinal-Content VAE: Isolating Ordinal-Valued Content Factors in Deep Latent Variable Models
arXiv:2102.06648 [cs.LG] (Published 2021-02-12)
A Critical Look At The Identifiability of Causal Effects with Deep Latent Variable Models
arXiv:2212.08765 [cs.LG] (Published 2022-12-17)
Latent Variable Representation for Reinforcement Learning