Shouldn't perplexity range over [1, inf)?
ivanmkc opened this issue · 2 comments
ivanmkc commented
evaluate/metrics/perplexity/README.md, line 60 at commit 8dfe057:

```python
perplexity = e**(sum(losses) / num_tokenized_tokens)
```
If sum(losses) = 0, then perplexity = e**0 = 1. Since each per-token loss is the negative log-likelihood of a probability ≤ 1, the sum can't be negative, so shouldn't 1 be the minimum?
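For concreteness, here is a minimal runnable sketch of that line (the function and argument names are hypothetical stand-ins for the README's `losses` and `num_tokenized_tokens`):

```python
import math

def perplexity(per_token_nlls, num_tokens):
    # Mirrors the README formula: e ** (mean negative log-likelihood).
    return math.e ** (sum(per_token_nlls) / num_tokens)

# Each per-token NLL is -log(p) with 0 < p <= 1, so every term is >= 0.
# The floor is hit when the model is perfectly confident about every token:
print(perplexity([0.0, 0.0, 0.0], 3))    # 1.0
print(perplexity([math.log(2)] * 4, 4))  # ~2.0 (uniform over two choices)
```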
simpleParadox commented
@ivanmkc
Perplexity ranges between zero and inf because the exponent (the sum of negative log-likelihoods) can be negative.
Check out the following blog post for a better understanding: Perplexity of fixed-length models.
ivanmkc commented
Thanks, will take a look.