Princeton-SysML/Jailbreak_LLM

For Llama 2, the `use_default` config seems to use top-k = 50 because of the Hugging Face default value, but did Llama 2 actually use top-k?

alongflow opened this issue · 1 comment

Llama 2 in Hugging Face seems to have a default top-k value of 50, but the Llama 2 paper and its repository suggest that top-k sampling was likely not used.
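For reference, one way to check what a specific checkpoint actually ships with (as opposed to the library-wide defaults) is to load its `generation_config.json` from the Hub. A minimal sketch, assuming the `meta-llama/Llama-2-7b-hf` repo id (any Llama 2 checkpoint works the same way) and that you have access to the gated weights:

```python
# Sketch: inspect the generation config a checkpoint ships with on the Hub.
# The repo id below is an example; access to meta-llama models is gated.
from transformers import GenerationConfig

cfg = GenerationConfig.from_pretrained("meta-llama/Llama-2-7b-hf")
print(cfg)        # fields present in the checkpoint's generation_config.json
print(cfg.top_k)  # falls back to the library default (50) if the file does not set it
```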

Hi,

Thank you for your question! When determining the default values for the different hyperparameters, we primarily refer to the default configuration of the model.generate() function, for consistency across models, since different models may ship with varying default generation configurations. For instance, the top_k parameter is set to 50 by default, according to the transformers documentation.
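As a concrete illustration of where the 50 comes from, the library-wide generation defaults in transformers can be inspected directly, a minimal sketch:

```python
# Sketch: the library-wide generation defaults in transformers.
# These apply whenever a model does not override them in its own generation config.
from transformers import GenerationConfig

defaults = GenerationConfig()
print(defaults.top_k)        # 50
print(defaults.top_p)        # 1.0
print(defaults.temperature)  # 1.0
print(defaults.do_sample)    # False (greedy decoding unless sampling is enabled)
```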