New Feature in #NeuralCompression repo: Bits-Back compression for diffusion models!
Compress image data using diffusion models at an effective rate close to the (negative) ELBO.
See: https://github.com/facebookresearch/NeuralCompression/tree/main/projects/bits_back_diffusion
Some context [1/3]
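(Aside, not from the repo docs, just a one-line reminder of why the (negative) ELBO is the relevant rate: bits-back coding pays for z under the prior and for x under the likelihood, but gets bits back by decoding z from the stream with the approximate posterior q(z|x), so the expected net cost per datapoint is

```latex
\mathbb{E}_{q(z\mid x)}\big[\log q(z\mid x) - \log p(z) - \log p(x\mid z)\big]
  \;=\; -\mathrm{ELBO}(x) \;\ge\; -\log p(x),
```

i.e. close to the ideal code length whenever q is close to the true posterior.)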
The plot shows the avg. effective compression rate and the corresponding terms of the (negative) ELBO over the time-steps of the diffusion model on CIFAR-10.
Our implementation supports ImageNet out of the box and can be extended to other datasets and diffusion models!
[2/3]
Bits-Back coding (with asymmetric numeral systems) can be used for lossless compression with latent variable models at a near-optimal rate: https://arxiv.org/abs/1901.04866
For extensions to hierarchical latent variable models, such as diffusion models, see: https://arxiv.org/abs/1912.09953 (this is the method we build upon)
and https://arxiv.org/abs/1905.06845 (Bit-Swap, the method used in “Variational Diffusion Models” https://arxiv.org/abs/2107.00630).
[3/3]
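To make the bits-back recipe above concrete, here is a minimal, self-contained toy sketch. This is NOT the NeuralCompression implementation or its API: the names `ans_encode`, `ans_decode`, `prior`, `likelihood`, `posterior` and the tiny categorical tables are hypothetical placeholders, and the rANS coder uses Python big integers with no renormalization purely for illustration.

```python
# Toy BB-ANS sketch: a textbook rANS coder on Python big integers, plus the
# bits-back encode/decode steps for a single-latent model. Placeholder
# distributions; not the NeuralCompression code.

PRECISION = 1 << 16  # every distribution below is quantized to this total mass


def ans_encode(state: int, symbol: int, freqs: list) -> int:
    """Push `symbol` (with quantized probabilities `freqs`) onto the ANS state."""
    start, f = sum(freqs[:symbol]), freqs[symbol]
    return (state // f) * PRECISION + (state % f) + start


def ans_decode(state: int, freqs: list):
    """Pop a symbol from the ANS state; returns (symbol, new_state)."""
    slot = state % PRECISION
    cum = symbol = 0
    while cum + freqs[symbol] <= slot:
        cum += freqs[symbol]
        symbol += 1
    return symbol, freqs[symbol] * (state // PRECISION) + slot - cum


def bits_back_encode(state, x, prior, likelihood, posterior):
    z, state = ans_decode(state, posterior[x])   # 1) "get bits back": decode z ~ q(z|x)
    state = ans_encode(state, x, likelihood[z])  # 2) pay for x under p(x|z)
    state = ans_encode(state, z, prior)          # 3) pay for z under p(z)
    return state


def bits_back_decode(state, prior, likelihood, posterior):
    z, state = ans_decode(state, prior)          # mirror of step 3
    x, state = ans_decode(state, likelihood[z])  # mirror of step 2
    state = ans_encode(state, z, posterior[x])   # mirror of step 1: return the borrowed bits
    return x, state


# Hypothetical toy model: 2 latent values, 4 observed symbols.
prior = [PRECISION // 2, PRECISION // 2]                        # p(z)
likelihood = [                                                  # p(x | z)
    [PRECISION // 2, PRECISION // 4, PRECISION // 8, PRECISION // 8],
    [PRECISION // 8, PRECISION // 8, PRECISION // 4, PRECISION // 2],
]
posterior = [                                                   # q(z | x)
    [3 * PRECISION // 4, PRECISION // 4],
    [3 * PRECISION // 4, PRECISION // 4],
    [PRECISION // 4, 3 * PRECISION // 4],
    [PRECISION // 4, 3 * PRECISION // 4],
]

initial_state = 1 << 64          # a few "borrowed" bits for the very first decode
data = [0, 3, 2, 1, 0]

state = initial_state
for x in data:
    state = bits_back_encode(state, x, prior, likelihood, posterior)
print(f"compressed state ~ {state.bit_length()} bits")

decoded = []
for _ in data:
    x, state = bits_back_decode(state, prior, likelihood, posterior)
    decoded.append(x)
decoded.reverse()                # ANS is a stack, so symbols come back in reverse
assert decoded == data and state == initial_state
```

Each step costs roughly log q(z|x) − log p(z) − log p(x|z) bits net, whose expectation is the negative ELBO; chaining this construction across a hierarchy of latents (here, the diffusion time-steps), as in the HiLLoC and Bit-Swap papers linked above, is roughly what the repo's implementation does at scale.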