awaescher/OllamaSharp

How to know when completion has ended with Chat


Hello! I hope you are well. Really cool and useful wrapper for Ollama; we use it now that Ollama is available on Windows. I wanted to know: is there a way to tell when the completion has ended (no more tokens are generated) with the Chat class?

using System;
using System.Threading.Tasks;
using OllamaSharp;
using OllamaSharp.Models.Chat;

public class OllamaClient
{
    private const string OllamaApiUri = "http://localhost:11434";
    private const string OllamaModel = "DaVinci-v1:latest";

    private readonly OllamaApiClient _ollama = new(new Uri(OllamaApiUri))
    {
        SelectedModel = OllamaModel
    };

    private Chat? _chat;

    public async Task Setup(Action<ChatResponseStream> streamer)
    {
        var models = await _ollama.ListLocalModels();
        Console.WriteLine("Found the following available models:");
        foreach (var model in models) Console.WriteLine(model.Name);

        _chat = _ollama.Chat(streamer);
    }

    /// <summary>
    /// Asks the model to generate a completion based on the input.
    /// </summary>
    public async Task PerformInference(ChatRole chatRole, string input)
    {
        await _chat!.SendAs(chatRole, input); // _chat is assigned in Setup
    }

    // How do we know when the inference is over?
}

I'd like to be able to run some code when the completion is over. Is there a way to know when it has finished?

The streamer lambda receives these ChatResponseStream objects as the server streams its response. There is a Done property that should be set to true on the final chunk.

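For example, the streamer could check that flag itself and trigger whatever should happen afterwards. A minimal sketch, assuming ChatResponseStream also exposes the partial answer via Message.Content; the OnCompletionEnded hook is my own addition for illustration:

using System;
using OllamaSharp.Models.Chat;

// Inside the OllamaClient class from the question:
public void SetupWithCompletionCallback()
{
    _chat = _ollama.Chat(stream =>
    {
        // Each streamed chunk carries a piece of the answer...
        Console.Write(stream?.Message?.Content);

        // ...and Done flips to true on the final chunk.
        if (stream?.Done == true)
            OnCompletionEnded();
    });
}

// Hypothetical hook: put the code to run after the completion here.
private void OnCompletionEnded() => Console.WriteLine(Environment.NewLine + "[completion finished]");

If I read the API right, awaiting _chat.SendAs(...) should also only return once the stream has ended, so code placed directly after that await would run when the completion is over as well.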

Really cool stuff, thanks @awaescher!