Permit setting default prompt format and max tokens in models tab
Opened this issue · 0 comments
dagbdagb commented
When loading a model, permit setting the default 'Prompt Format' and 'Max Tokens' for the model in question, so we don't have to set these for every new session. Is it possible to auto-detect the format, or use some kind of heuristic to pick the best template?
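A simple name-based heuristic might be enough as a first pass — a sketch (the `guess_prompt_format` function, the pattern list, and the template names here are all made up for illustration, not the app's actual values):

```python
def guess_prompt_format(model_name: str) -> str:
    """Guess a prompt template from the model filename (illustrative only)."""
    name = model_name.lower()
    # Ordered substring patterns -> hypothetical template names.
    patterns = [
        ("llama-3", "Llama 3 Instruct"),
        ("llama3", "Llama 3 Instruct"),
        ("mistral", "Mistral Instruct"),
        ("chatml", "ChatML"),
    ]
    for pattern, template in patterns:
        if pattern in name:
            return template
    return "Chat RP"  # fall back to the current default

print(guess_prompt_format("Meta-Llama-3-8B-Instruct.Q4_K_M.gguf"))
```

For GGUF files, reading the template from the model's metadata would presumably be even more reliable than filename matching.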
I notice Llama 3 Instruct doesn't work well with the default 'Chat RP' prompt. Every time. :-)