marella/chatdocs

Llama 2 and Code Llama support?

Opened this issue · 2 comments

I have been trying to get Llama 2 models to function correctly. They start off OK, but then all of them go into a loop of repetitions or gibberish.

I haven't tried setting `model_type: llama` to something else yet. Could it be that we need to put llama2 here instead?

```yaml
model_type: llama
```

Is it possible to get any of the code LLMs to work with this?

I tried llama-2 and llama2, then read the ctransformers documentation and realized it's just llama.
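
For reference, a minimal sketch of loading a Llama 2 GGML model directly with ctransformers (the repo name and prompt are illustrative, not from this thread):

```python
from ctransformers import AutoModelForCausalLM

# Llama 2 (and Code Llama) GGML models load with the plain "llama"
# model type; ctransformers has no separate "llama2" type.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7B-Chat-GGML",  # illustrative repo name
    model_type="llama",
)
print(llm("Tell me about the telecom industry.", max_new_tokens=64))
```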

The answer gets into a loop when using Llama 2 models:

The telecom industry is not not not not not not not not not not not not not not not

It goes on like that. I read somewhere that it could be related to something called RoPE, but I don't know how to set that!
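
As an aside, ctransformers exposes `repetition_penalty` and `last_n_tokens` generation parameters that can dampen this kind of looping, though they don't address the root cause. A minimal sketch with illustrative values:

```python
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7B-Chat-GGML",  # illustrative repo name
    model_type="llama",
)

# Penalize tokens that already appeared in the last 128 generated
# tokens, which discourages "not not not ..." style loops.
text = llm(
    "Tell me about the telecom industry.",
    max_new_tokens=128,
    repetition_penalty=1.2,
    last_n_tokens=128,
)
print(text)
```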

Fixed it by implementing a prompt template!
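
For anyone landing here later: Llama-2-chat models are trained on the `[INST] ... [/INST]` instruction format with an optional `<<SYS>>` block, and prompting without it is a common cause of these repetition loops. A minimal sketch of such a template (the system prompt, helper function, and repo name are illustrative):

```python
from ctransformers import AutoModelForCausalLM

SYSTEM = "You are a helpful assistant."  # illustrative system prompt

def build_prompt(user_message: str) -> str:
    # Llama-2-chat format: [INST] <<SYS>> system <</SYS>> user [/INST]
    return f"[INST] <<SYS>>\n{SYSTEM}\n<</SYS>>\n\n{user_message} [/INST]"

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7B-Chat-GGML",  # illustrative repo name
    model_type="llama",
)
print(llm(build_prompt("Tell me about the telecom industry."),
          max_new_tokens=128))
```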