facebookresearch/fairseq

calculation of the perplexity score


❓ Questions and Help

Before asking:

  1. search the issues. → couldn't find an answer
  2. search the docs. → couldn't find an answer

What is your question?

Why is the perplexity score calculated as `2**avg_nll_loss` instead of the usual `exp(avg_nll_loss)`?

"perplexity": 2**avg_nll_loss,

Was this a deliberate choice by the fairseq team, or is there some other reason behind it?

cc @b-dickson @zorant

What's your environment?

  • fairseq Version (e.g., 1.0 or main): 0.12.2
  • PyTorch Version (e.g., 1.0):
  • OS (e.g., Linux):
  • How you installed fairseq (pip, source):
  • Build command you used (if compiling from source):
  • Python version:
  • CUDA/cuDNN version:
  • GPU models and configuration:
  • Any other relevant information: