After the new version all local models are unusable: whatever you ask, the model talks about a different topic and is otherwise completely unresponsive.
ggiidd opened this issue · 2 comments
danemadsen commented
I'll look into it. It must be an issue with `llama_apply_chat_template`.
danemadsen commented
@ggiidd I dug into the code, and it seems the gibberish occurs because when nPredict is greater than or equal to nCtx, all the data in the preprompt gets dropped. I can fix this in the next release, but in the meantime you should make sure nCtx is set to a value much higher than nPredict.
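For anyone curious, here's a rough sketch of the failure mode (illustrative only, not the actual library code — the function names `prompt_budget` and `fit_prompt` are made up for this example). When nPredict tokens are reserved out of an nCtx-token window, only `nCtx - nPredict` tokens of prompt can survive; if nPredict >= nCtx the budget is zero, the whole preprompt is dropped, and the model generates from an empty context:

```python
def prompt_budget(n_ctx: int, n_predict: int) -> int:
    """Prompt tokens that survive after reserving n_predict for generation."""
    return max(n_ctx - n_predict, 0)

def fit_prompt(prompt_tokens: list, n_ctx: int, n_predict: int) -> list:
    """Keep only the most recent prompt tokens that fit in the budget."""
    budget = prompt_budget(n_ctx, n_predict)
    # Guard the zero case: tokens[-0:] would return the whole list.
    return prompt_tokens[-budget:] if budget > 0 else []

# nPredict >= nCtx: the entire preprompt is discarded.
print(fit_prompt(list(range(100)), n_ctx=512, n_predict=512))  # []
# nCtx much larger than nPredict: the prompt fits untouched.
print(len(fit_prompt(list(range(100)), n_ctx=2048, n_predict=256)))  # 100
```

So until the fix lands, keeping nCtx well above nPredict leaves enough budget for the preprompt to stay in context.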