LLM error ['stop']
gdnaesver opened this issue · 1 comment
gdnaesver commented
Across multiple models (for example WizardLM/WizardCoder-1B-V1.0), both with hosted inference using the model name and with my own deployment, I get the error:
[LLM] The following 'model_kwargs' are not used by the model: ['stop'] (note typos in the generate argument will also show up in this list)
I can't work out what to do to fix this....
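For context, this warning comes from the Hugging Face transformers generation path when it receives a keyword argument it does not recognise. A minimal sketch of one way to reproduce it (assuming `stop` ends up being forwarded to `generate()` as an unused kwarg; the actual calling code is not shown in this issue):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WizardLM/WizardCoder-1B-V1.0"
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tok("def hello_world():", return_tensors="pt")

# `stop` is not a recognised generate() argument, so it is collected into
# model_kwargs and validation raises:
#   ValueError: The following `model_kwargs` are not used by the model: ['stop']
out = model.generate(**inputs, stop=["\n\n"])
```

If the framework calling the model injects `stop` into `model_kwargs`, removing that key (or using a supported stopping mechanism such as `StoppingCriteria`) avoids the error.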
gdnaesver commented
wrong repo... sorry