awaescher/OllamaSharp

How do you separate instruction from the content?


To make sure the instruction is not lost and is clearly separated from the content.

Would you elaborate?

Please feel free to reopen this if you want to elaborate.

Using https://ollama.com/library/llama3:instruct, how do you separate the instruction from the content?
With a normal model, the instruction prompt that I pass with

```csharp
await foreach (var stream in ollama.Generate(prompt + ": " + content, context, cancellationToken))
```

is often ignored.

I am still unsure what you want to achieve. My best guess is that you mean separating the system prompt from the user prompt.

If you want to use the Generate() method, you can use the overload that accepts a GenerateRequest, which has a System property for the system prompt and a Prompt property for the user prompt, as you can see here. The design of these classes follows the official Ollama API, see here.
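A minimal sketch of that overload is shown below. The endpoint URL, model name, instruction text, and content are placeholders, and the exact method name may differ between OllamaSharp versions (newer releases renamed Generate to GenerateAsync):

```csharp
using System;
using System.Threading;
using OllamaSharp;
using OllamaSharp.Models;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));

var content = "..."; // placeholder: the text to be processed

var request = new GenerateRequest
{
    Model = "llama3:instruct",
    // the instruction goes into System, kept separate from the content
    System = "Summarize the following text in one sentence.",
    // the content goes into Prompt, without the instruction prepended
    Prompt = content
};

await foreach (var stream in ollama.Generate(request, CancellationToken.None))
    Console.Write(stream?.Response);
```

Because the instruction travels in the System field, it is rendered into the model's system slot by the instruct template rather than being mixed into the user text, which is why it is less likely to be ignored than string concatenation.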

If you use the Chat class, you can do it as shown here.
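A sketch along those lines, assuming a Chat class with a SendAs method for role-tagged messages (model name and prompts are placeholders; method names vary between OllamaSharp versions):

```csharp
using System;
using OllamaSharp;
using OllamaSharp.Models.Chat;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));
ollama.SelectedModel = "llama3:instruct";

var chat = new Chat(ollama);

// send the instruction once as a system message ...
await foreach (var token in chat.SendAs(ChatRole.System, "Summarize the following text in one sentence."))
    Console.Write(token);

// ... then send the content as a regular user message
await foreach (var token in chat.SendAs(ChatRole.User, "...")) // placeholder content
    Console.Write(token);
```

The Chat instance keeps the message history, so the system instruction stays in effect for every subsequent user message without having to repeat it.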