Breaking changes for version 3.0
Discussed in #63
Originally posted by @awaescher on August 23, 2024
Thanks everyone for using and improving OllamaSharp. I learned to love this little project, especially because we have some really engaged contributors around.
I want OllamaSharp to stay the best Ollama API bindings, that's why I suggest the following changes for an upcoming release. As I am proposing breaking changes, I will bump the major version to 3 in the near future.
Drop streamer callback support
Right now, OllamaSharp supports multiple ways to react to incoming streams from Ollama.
First, I started with an IResponseStreamer<T> callback that could be passed as an argument. Later, @JerrettDavis added IAsyncEnumerable syntax support:
```csharp
// streamer callback
await ollama.PullModel("mistral", status => Console.WriteLine($"({status.Percent}%) {status.Status}"));

// IAsyncEnumerable
await foreach (var status in ollama.PullModel("mistral"))
    Console.WriteLine($"({status.Percent}%) {status.Status}");
```
I prefer the IAsyncEnumerable syntax as it is more flexible and easier to read.
Providing both options leaves new developers with too many redundant choices and bloats the code base unnecessarily. That's why I decided to drop support for the streamer callback syntax starting with version 3.0; IAsyncEnumerable will be the way to go.
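Callers who still prefer a callback at the call site can adapt the stream themselves with a small helper. The following is a consumer-side sketch in plain .NET, not an OllamaSharp API; the ForEachAsync name is purely illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical consumer-side helper (not part of OllamaSharp): adapts an
// IAsyncEnumerable<T> stream back into a callback for code that was
// written against the old streamer syntax.
public static class AsyncStreamExtensions
{
    public static async Task ForEachAsync<T>(this IAsyncEnumerable<T> stream, Action<T> onItem)
    {
        await foreach (var item in stream)
            onItem(item);
    }
}

// usage (ollama is an IOllamaApiClient):
//   await ollama.PullModel("mistral").ForEachAsync(status => Console.WriteLine($"({status.Percent}%) {status.Status}"));
```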
Improving chats
Also, it's not so clear how to implement a chat. There is a very helpful Chat class that acts as a wrapper around the Ollama API and automatically collects and transmits the chat history. It can be instantiated directly, but there's also an extension method Chat() on the IOllamaApiClient that returns an instance. So far so good, but the IOllamaApiClient also provides three chat-related methods of its own that are more or less internal. That makes four options in total:
- Chat() (the extension method) starts a new chat and returns a Chat class instance (preferred way; see the sketch below this list)
- Chat() sends a chat request message to the Ollama API (sync, no streaming)
- SendChat() sends a chat request message to the Ollama API (async, streamer callback)
- StreamChat() sends a chat request message to the Ollama API (async, IAsyncEnumerable)
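For orientation, the preferred way today looks roughly like this; the signatures are simplified and should be treated as approximations of the current 2.x surface rather than exact API:

```csharp
// Rough sketch of the current 2.x usage, for comparison only (signatures approximate)
var ollama = new OllamaApiClient("http://localhost:11434");
ollama.SelectedModel = "mistral";

// the Chat() extension method creates a Chat instance wired to a streamer callback
var chat = ollama.Chat(stream => Console.Write(stream.Message?.Content ?? ""));
await chat.Send("Why is the sky blue?");
```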
With the streamer callback syntax gone, SendChat() should go too. I should also drop the pretty useless sync Chat(), and I might drop the extension method in favor of instantiating the Chat class directly.
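To make that direction concrete, here is a minimal sketch of what a chat could look like after 3.0. It assumes the Chat class takes the client in its constructor and that Send() streams the answer as an IAsyncEnumerable of tokens; both are part of the proposal, not a finished API:

```csharp
// Sketch of the proposed 3.0 usage (assumed API, subject to change)
var ollama = new OllamaApiClient("http://localhost:11434");
ollama.SelectedModel = "mistral";

// instantiate the Chat class directly: no extension method, no streamer callback
var chat = new Chat(ollama);

// consume the answer as a stream of tokens via IAsyncEnumerable
await foreach (var answerToken in chat.Send("Why is the sky blue?"))
    Console.Write(answerToken);
```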
As always, feedback is welcome. Contributions even more.