`run_generate_text` ignores `input_prefix` & `input_suffix`
Closed this issue · 3 comments
It seems like `run_generate_text` ignores `input_prefix` and `input_suffix`.
I've prepared the following sample project. (Llama 8B Instruct not included)
There are two buttons, each wired to a separate GDLlama instance. The first one has `input_prefix` and `input_suffix` set so that it should act like a mean assistant, but the instance answers normally. The second instance simply concatenates those two properties into the prompt itself, and that works.
Therefore I conclude that these properties are ignored.
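The second instance's workaround amounts to plain string concatenation. A minimal sketch in Python (the helper name is hypothetical; it only illustrates the wrapping, not the GDLlama API itself):

```python
def apply_prefix_suffix(input_prefix: str, prompt: str, input_suffix: str) -> str:
    """Mimic the working second instance: since run_generate_text ignores
    input_prefix and input_suffix, wrap the prompt with them manually
    before handing it to the model."""
    return input_prefix + prompt + input_suffix

# Instruct-style markers chosen purely for illustration:
full = apply_prefix_suffix("[INST] ", "Act like a mean assistant.", " [/INST]")
print(full)  # [INST] Act like a mean assistant. [/INST]
```

This is also what the fix described below does internally: the prompt passed to text generation is wrapped with the two properties.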
Yes, you are right. I followed the original design of llama.cpp, where the prefix and suffix are only active in interactive mode (i.e., with `Instruct` or `Interactive` enabled), because it makes little sense to add a prefix and suffix to simple text generation in llama.cpp - you can just add them to the prompt.
It is confusing here in this plugin, though, so I will append the prefix and suffix to the prompt in the next version.
Thanks, I think that would be good. Another option would be to name the parameters accordingly, so that users can see that they apply to interactive mode only.
Should be fixed in v0.4