
Text Compression with Generative Models

Coming Soon

This example is planned for a future release. Check back for updates on the text compression implementation.

Overview

This example will demonstrate:

  • Neural text compression
  • Learned entropy coding
  • VAE-based text compression
  • Rate-distortion optimization (see the sketch after this list)
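
As a preview of the VAE-based approach and rate-distortion optimization listed above, the sketch below shows the standard trade-off objective: the KL term plays the role of the rate (the cost, in nats, of coding the latent), and the reconstruction negative log-likelihood plays the role of the distortion. This is a minimal sketch in PyTorch; the function name `rd_loss`, the Gaussian encoder assumption, and the `beta` weight are illustrative and not the final API of this example.

```python
# Hypothetical sketch of the rate-distortion objective behind VAE-based
# text compression; names and shapes are illustrative only.
import torch
import torch.nn.functional as F


def rd_loss(logits, targets, mu, logvar, beta=1.0):
    """Rate-distortion loss for a Gaussian-latent VAE over token sequences.

    logits:  (batch, seq_len, vocab) decoder outputs
    targets: (batch, seq_len) integer token ids
    mu, logvar: (batch, latent_dim) parameters of q(z|x) = N(mu, exp(logvar))
    beta: trade-off weight between rate (KL) and distortion (reconstruction NLL)
    """
    # Distortion: negative log-likelihood (in nats) of the original tokens
    # under the decoder.
    distortion = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)), targets.reshape(-1), reduction="sum"
    )
    # Rate: KL(q(z|x) || N(0, I)), an upper bound (in nats) on the expected
    # code length for the latent z.
    rate = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return distortion + beta * rate
```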

Planned Features

  • Latent space text representation
  • Arithmetic coding integration (see the sketch after this list)
  • Compression ratio vs. quality trade-offs
  • Streaming compression
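
The arithmetic coding integration mentioned above can be previewed independently of any particular coder: an arithmetic coder driven by a language model approaches the model's information content, i.e. the sum of -log2 p(token | context) over the sequence. The sketch below computes that ideal code length in plain Python; `next_token_probs` is a hypothetical stand-in for any autoregressive model and is not an API from this project.

```python
# Hypothetical illustration of learned entropy coding: the best compressed
# size an arithmetic coder can approach is the model's information content.
import math
from typing import Callable, Sequence


def ideal_compressed_bits(
    tokens: Sequence[int],
    next_token_probs: Callable[[Sequence[int]], Sequence[float]],
) -> float:
    """Return the information content of `tokens` under the model, in bits."""
    total_bits = 0.0
    for i, tok in enumerate(tokens):
        probs = next_token_probs(tokens[:i])  # p(. | tokens[:i])
        total_bits += -math.log2(probs[tok])  # Shannon code length for this token
    return total_bits


def uniform(context):
    # Toy model: each of 256 symbols is equally likely, regardless of context.
    return [1.0 / 256] * 256


print(ideal_compressed_bits([65, 66, 67], uniform))  # 3 tokens * 8 bits = 24.0
```

A stronger model assigns higher probability to the actual next token, so the same sequence costs fewer bits; streaming compression amounts to feeding these per-token probabilities to the coder as the context grows.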
