Issue on default temperature settings
wcphkust opened this issue · 1 comment
wcphkust commented
Hi, I am using LMFlow to invoke the model codellama/CodeLlama-7b-Instruct-hf. However, I found that the output was very repetitive. It seems that the default temperature is currently set to 0. It would be better to add an option that lets users specify a custom temperature.
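For context, temperature controls how sharply the model's next-token distribution is peaked: logits are divided by the temperature before the softmax, so a temperature near 0 collapses sampling to the single most likely token, which tends to produce repetitive output. A minimal, library-independent sketch of that scaling (not LMFlow code, just an illustration):

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide logits by the temperature before normalizing; lower
    # temperature concentrates probability mass on the argmax token.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
p_default = softmax_with_temperature(logits, 1.0)
p_cold = softmax_with_temperature(logits, 0.1)
# With a low temperature, almost all probability goes to the top
# token, so greedy-like, repetitive generations become likely.
```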
research4pan commented
Thanks for your interest in LMFlow! The command-line argument --temperature 1.0 can be used for this purpose when running examples/chatbot.py. Also, to reduce repetition, one can pass --repetition_penalty xxx for some value >= 1.0. Thanks 😄
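Combining the two flags mentioned above, an invocation might look like the following sketch (the model path is taken from the issue; other required arguments may vary by LMFlow version):

```shell
# Sketch: run the chatbot with sampling enabled and a mild
# repetition penalty, using the flags named in the reply above.
python examples/chatbot.py \
    --model_name_or_path codellama/CodeLlama-7b-Instruct-hf \
    --temperature 1.0 \
    --repetition_penalty 1.2
```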