Perplexity¤
Module: generative_models.core.evaluation.metrics.text.perplexity
Source: generative_models/core/evaluation/metrics/text/perplexity.py
Overview¤
Perplexity metric implementation using JAX and NNX.
Perplexity is a common evaluation metric for language models that measures how well a probability model predicts a sample: it is the exponential of the average negative log-likelihood of the target tokens. Lower perplexity indicates better predictive performance.
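The definition above can be sketched in plain JAX. This is an illustrative sketch, not the module's actual API; the function name and array shapes here are assumptions:

```python
import jax
import jax.numpy as jnp


def perplexity_from_logits(logits: jnp.ndarray, targets: jnp.ndarray) -> jnp.ndarray:
    """Illustrative perplexity computation (not the module's real signature).

    logits:  [batch, seq_len, vocab] unnormalized model outputs.
    targets: [batch, seq_len] integer token ids.
    """
    # Normalize logits into log-probabilities over the vocabulary.
    log_probs = jax.nn.log_softmax(logits, axis=-1)
    # Pick out the log-probability assigned to each target token.
    token_log_probs = jnp.take_along_axis(
        log_probs, targets[..., None], axis=-1
    ).squeeze(-1)
    # Perplexity = exp(mean negative log-likelihood).
    return jnp.exp(-jnp.mean(token_log_probs))
```

As a sanity check, a model that assigns a uniform distribution over a vocabulary of size `V` has perplexity exactly `V`.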
Classes¤
Perplexity¤
Functions¤
__init__¤
calculate_perplexity¤
compute¤
compute_log_probs¤
Module Statistics¤
- Classes: 1
- Functions: 4
- Imports: 5