technovangelist/ollama-node

How to send a Context (older Prompts and Answers) via generate()

Opened this issue · 3 comments

Hi,

More of a question and/or something for the backlog.

With OpenAI I am able to send the context (= older questions and answers) along with each request. It is done via something like this:

[
      {
         role: 'user',
         content: question1
      },
      {
         role: 'assistant',
         content: answer1
      },
      {
         role: 'user',
         content: question2
      },
      {
         role: 'assistant',
         content: answer2
      },
]
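As a workaround with a generate-style API that only accepts a single prompt, the history above can be flattened into one prompt string. This is a minimal sketch; `historyToPrompt` and the `ChatMessage` type are hypothetical helpers, not part of ollama-node.

```typescript
// Hypothetical helper: flatten an OpenAI-style message history plus the
// next question into a single prompt string for a generate()-style call.
type ChatMessage = { role: 'user' | 'assistant'; content: string };

function historyToPrompt(history: ChatMessage[], nextQuestion: string): string {
  const lines = history.map((m) =>
    m.role === 'user' ? `User: ${m.content}` : `Assistant: ${m.content}`
  );
  // Append the new question and cue the model to answer.
  lines.push(`User: ${nextQuestion}`, 'Assistant:');
  return lines.join('\n');
}
```

Note that this only approximates real chat support: the model's chat template is not applied, so results may differ from a proper `/api/chat` call.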

Can this be done with ollama-node as well? ollama.generate() simply takes ONE question, right?

The context is stored in a DB, so I can build an array like the one above myself.

A short look inside the small code base shows that the Ollama API itself supports sending chat messages (see https://github.com/ollama/ollama/blob/main/examples/typescript-simplechat/client.ts), but ollama-node does not use that endpoint. I also looked around a few minutes ago; it looks like I need to write a function for that myself.
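Such a function could call Ollama's `/api/chat` REST endpoint directly with `fetch` (built into Node 18+), bypassing ollama-node entirely. A minimal sketch, assuming a local Ollama server on the default port; the model name and helper names are placeholders:

```typescript
// Sketch: talk to Ollama's /api/chat endpoint directly, since ollama-node
// does not wrap it. Request/response shapes follow the Ollama REST API.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

const OLLAMA_URL = 'http://localhost:11434'; // assumed default Ollama address

// Pure helper: build the JSON body for a non-streaming /api/chat request.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  const data = await res.json();
  // With stream: false, the reply arrives under data.message.content.
  return data.message.content;
}
```

Usage would look like `await chat('llama2', historyArray)` with the message array from the DB.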

you should use the official node library and not this one