# Autoregressive Text Generation
**Coming Soon.** This example is planned for a future release; check back for updates on autoregressive text generation.
## Overview
This example will demonstrate:
- Character-level language modeling
- Word-level autoregressive generation
- RNN and LSTM text generation
- Temperature-based sampling
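Temperature-based sampling, the last item above, can be sketched without any framework: divide the model's logits by a temperature before the softmax, then draw from the resulting distribution. This is a minimal illustration in plain Python; the function name `sample_with_temperature` and the use of raw float lists (rather than tensors from a specific library) are assumptions for the sketch.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Sample an index from unnormalized logits.

    temperature < 1.0 sharpens the distribution (closer to greedy argmax);
    temperature > 1.0 flattens it (more diverse, riskier output).
    """
    # Scale logits by temperature, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting probabilities.
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]
```

At very low temperature the scaled softmax concentrates almost all mass on the largest logit, so sampling behaves like argmax; at high temperature the distribution approaches uniform over the vocabulary.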
## Planned Features
- Simple character-level model
- Vocabulary and tokenization
- Training on text corpora
- Interactive text generation
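Of the features above, vocabulary and tokenization are simple enough to preview now. For a character-level model, the vocabulary is just the set of unique characters in the corpus, mapped to integer ids in both directions. The helper names below (`build_char_vocab`, `encode`, `decode`) are illustrative, not part of any released API.

```python
def build_char_vocab(corpus):
    """Map each unique character to an integer id (sorted for determinism)."""
    chars = sorted(set(corpus))
    stoi = {ch: i for i, ch in enumerate(chars)}  # string -> int
    itos = {i: ch for ch, i in stoi.items()}      # int -> string
    return stoi, itos

def encode(text, stoi):
    """Convert a string into a list of integer token ids."""
    return [stoi[ch] for ch in text]

def decode(ids, itos):
    """Convert a list of integer token ids back into a string."""
    return "".join(itos[i] for i in ids)
```

A round trip through `encode` and `decode` recovers the original text exactly, which is the basic sanity check any tokenizer should pass before training begins.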
## References
- Graves, "Generating Sequences With Recurrent Neural Networks" (2013)
- Bengio et al., "A Neural Probabilistic Language Model" (2003)