nomic-ai/pygpt4all

Interactive communication

ParisNeo opened this issue · 0 comments

Hi there.
I am developing the official web UI for GPT4All and I have a problem with the interactive mode:
1- The generations seem to be independent, so they cannot be used for a real chatbot.
2- The tool generates both the prompt and the output, so I had to add a hack to make it work.

I need to be able to condition the chatbot before starting the discussion, and I need the chatbot to keep the context and remember what I said before between messages.

Here, for example, is my conditioning code:
conditionning_message = """
Instruction: Act as GPT4All. A kind and helpful AI bot built to help users solve problems.
Start by welcoming the user then stop sending text.
GPT4All:Welcome! I'm here to assist you with anything you need. What can I do for you today?"""

Here I don't want the model to answer; it only needs to add the conditioning to its context. I would set n_predict=0, but I have to set n_predict=len(conditionning_message) so that the model consumes the whole message.

self.chatbot_bindings.generate(
    conditionning_message,
    new_text_callback=self.new_text_callback,
    n_predict=len(conditionning_message),
    temp=self.args.temp,
    top_k=self.args.top_k,
    top_p=self.args.top_p,
    repeat_penalty=self.args.repeat_penalty,
    repeat_last_n=self.args.repeat_last_n,
    # seed=self.args.seed,
    n_threads=8
)
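As a stopgap for the prompt being echoed back through the callback, something like the following wrapper could swallow the echoed text instead of inflating n_predict. This is only a sketch of my own; make_skip_prompt_callback and on_new_text are hypothetical names, not part of the pygpt4all API, and it assumes the bindings stream the prompt back verbatim, character for character, before any new tokens:

```python
def make_skip_prompt_callback(prompt, on_new_text):
    """Wrap a new_text_callback so the echoed prompt is swallowed.

    Assumption: the bindings replay the prompt text verbatim through the
    same callback before emitting new tokens, so we can count off
    len(prompt) characters and forward only what comes after.
    """
    state = {"remaining": len(prompt)}

    def callback(text):
        if state["remaining"] > 0:
            # Consume as much of this chunk as still belongs to the prompt.
            skip = min(state["remaining"], len(text))
            state["remaining"] -= skip
            text = text[skip:]
        if text:
            on_new_text(text)

    return callback


# Usage: only the text after the prompt reaches the real callback.
collected = []
cb = make_skip_prompt_callback("Hello", collected.append)
for chunk in ("Hel", "lo w", "orld"):
    cb(chunk)
# collected == [" w", "orld"]
```

The same wrapper could then be passed as new_text_callback so the UI never sees its own prompt repeated back.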

Then, when a user writes a message, I do this:

self.current_message = "\nUser: " + message + "\nGPT4All: "
self.prepare_query(self.current_message)
self.chatbot_bindings.generate(
    self.current_message,
    new_text_callback=self.new_text_callback,  # _with_yield
    n_predict=len(self.current_message) + self.args.n_predict,  # HERE I need to do this as it starts by generating my prompt
    temp=self.args.temp,
    top_k=self.args.top_k,
    top_p=self.args.top_p,
    repeat_penalty=self.args.repeat_penalty,
    repeat_last_n=self.args.repeat_last_n,
    # seed=self.args.seed,
    n_threads=8
)
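Since each generate() call appears to be stateless, one workaround I can imagine is to keep the full transcript myself and resend it as the prompt on every turn, so the model "remembers" earlier messages. The ChatSession helper below is hypothetical (not part of pygpt4all), and just shows the bookkeeping side:

```python
class ChatSession:
    """Sketch of transcript bookkeeping for a stateless generate() call.

    Hypothetical helper, not a pygpt4all class: the conditioning text and
    every User/GPT4All exchange are appended to one growing string, which
    is then passed whole as the prompt for the next generation.
    """

    def __init__(self, conditioning):
        self.history = conditioning

    def build_prompt(self, user_message):
        # Append the new user turn and leave the cursor after "GPT4All: "
        # so the model continues as the assistant.
        self.history += "\nUser: " + user_message + "\nGPT4All: "
        return self.history

    def record_reply(self, reply):
        # Fold the model's answer back into the transcript for next time.
        self.history += reply


# Usage sketch: the full transcript would be fed to generate() each turn,
# e.g. self.chatbot_bindings.generate(session.build_prompt(message), ...).
session = ChatSession("Instruction: Act as GPT4All.")
prompt = session.build_prompt("Hi")
session.record_reply("Welcome!")
```

The obvious cost is that the prompt grows every turn (and with it the tokens that must be re-processed), so this only works until the context window is exhausted.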

Any help is appreciated.