Perplexity¤

Module: generative_models.core.evaluation.metrics.text.perplexity

Source: generative_models/core/evaluation/metrics/text/perplexity.py

Overview¤

Perplexity metric implementation using JAX and NNX.

Perplexity is a common evaluation metric for language models that measures how well a probability model predicts a sample. Lower perplexity indicates better prediction performance.
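As a reminder of the underlying definition (this formula is not part of the module's docstring): for a sequence of N tokens with predicted probabilities p(x_i | x_{<i}), perplexity is the exponentiated average negative log-likelihood,

```latex
\mathrm{PPL} = \exp\!\left(-\frac{1}{N}\sum_{i=1}^{N}\log p\left(x_i \mid x_{<i}\right)\right)
```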

Classes¤

Perplexity¤

class Perplexity

Functions¤

__init__¤

def __init__()

calculate_perplexity¤

def calculate_perplexity()

compute¤

def compute()

compute_log_probs¤

def compute_log_probs()
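The generated reference above lists only bare signatures. As an illustration of how such a metric is typically computed with JAX, the sketch below derives per-token log probabilities from logits and exponentiates the masked mean negative log-likelihood. The function names mirror the listed methods, but the standalone-function form and the argument names (`logits`, `labels`, `mask`) are assumptions for illustration, not the module's actual API.

```python
# Illustrative sketch only; argument names and shapes are assumptions,
# not the actual signatures of
# generative_models.core.evaluation.metrics.text.perplexity.
import jax
import jax.numpy as jnp


def compute_log_probs(logits: jnp.ndarray, labels: jnp.ndarray) -> jnp.ndarray:
    """Log probability of each target token under the model's predictions.

    logits: [batch, seq_len, vocab]; labels: [batch, seq_len] integer token ids.
    """
    log_probs = jax.nn.log_softmax(logits, axis=-1)
    # Gather the log probability assigned to the true token at each position.
    return jnp.take_along_axis(log_probs, labels[..., None], axis=-1).squeeze(-1)


def calculate_perplexity(logits: jnp.ndarray,
                         labels: jnp.ndarray,
                         mask: jnp.ndarray | None = None) -> jnp.ndarray:
    """Perplexity = exp(mean negative log-likelihood over non-padding tokens)."""
    token_log_probs = compute_log_probs(logits, labels)
    if mask is None:
        mask = jnp.ones_like(token_log_probs)
    nll = -(token_log_probs * mask).sum() / jnp.maximum(mask.sum(), 1.0)
    return jnp.exp(nll)
```

A typical call would score a batch of model outputs against reference tokens, e.g. `ppl = calculate_perplexity(model_logits, target_ids, padding_mask)`. How the listed `compute` method aggregates such values across batches is not shown in this generated reference.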

Module Statistics¤

  • Classes: 1
  • Functions: 4
  • Imports: 5