ggerganov/llama.cpp

Text Generation task

rexionmars opened this issue · 2 comments

How do I use llama.cpp for text generation (chat-style responses) instead of it just completing the provided text?
(Screenshots of the terminal output attached.)

You need to use an instruct- or chat-tuned model. If you use a base model, the output will be plain text completion/continuation of the prompt.
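For illustration, a minimal sketch of the difference, assuming a recent build where the main example supports conversation mode via -cnv/--conversation (the binary is named llama-cli in newer releases) and the chat template is read from the GGUF metadata; the model paths below are placeholders:

```sh
# Base model: the prompt is simply continued (text completion).
./main -m ./models/base-model.gguf -p "The capital of France is" -n 32

# Instruct/chat-tuned model: conversation mode wraps your input in the
# model's chat template and drops into an interactive chat, so you get
# assistant-style replies instead of raw continuation.
# (Flag availability depends on your build; older builds used -ins/--instruct.)
./main -m ./models/instruct-model.gguf -cnv
```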

Thanks.