OllamaSharp is a .NET binding for the Ollama API, making it easy to interact with Ollama using your favorite .NET languages.
- Intuitive API client: Set up and interact with Ollama in just a few lines of code.
- API endpoint coverage: Support for all Ollama API endpoints including chats, embeddings, listing models, pulling and creating new models, and more.
- Real-time streaming: Stream responses directly to your application.
- Progress reporting: Get real-time progress feedback on tasks like model pulling.
- API console: A ready-to-use API console to chat with and manage your Ollama host remotely.
OllamaSharp wraps every Ollama API endpoint in awaitable methods that fully support response streaming.
The following examples give a glimpse of how easy OllamaSharp is to use. The list is not complete.
```csharp
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama2";
```
Listing all models that are available locally:

```csharp
var models = await ollama.ListLocalModels();
```
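Each returned model carries metadata such as its name. A minimal sketch for enumerating them, assuming the returned items expose a `Name` property mirroring the Ollama `/api/tags` response:

```csharp
// print the name of every locally available model
// (assumes the returned model type exposes a Name property)
foreach (var model in models)
    Console.WriteLine(model.Name);
```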
Pulling a model and reporting progress:

```csharp
await ollama.PullModel("mistral", status => Console.WriteLine($"({status.Percent}%) {status.Status}"));
```
Streaming a completion directly into the console:

```csharp
// keep reusing the context to keep the chat topic going
ConversationContext context = null;
context = await ollama.StreamCompletion("How are you today?", context, stream => Console.Write(stream.Response));
```
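If you need the complete answer as a single string instead, the same callback can collect the streamed tokens. This is plain C# on top of the `StreamCompletion` call shown above, with no additional API assumptions:

```csharp
// collect the streamed tokens into one string
var builder = new System.Text.StringBuilder();
context = await ollama.StreamCompletion("How are you today?", context, stream => builder.Append(stream.Response));
Console.WriteLine(builder.ToString());
```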
Building interactive chats:

```csharp
// uses the /chat api from Ollama 0.1.14
// messages including their roles will automatically be tracked within the chat object
var chat = ollama.Chat(stream => Console.WriteLine(stream.Message?.Content ?? ""));
while (true)
{
    var message = Console.ReadLine();
    await chat.Send(message);
}
```
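Multimodal models can also receive images within a chat. The sketch below assumes a `Send` overload that accepts base64-encoded images alongside the message; treat that overload as an assumption and verify it against the version of OllamaSharp you are using:

```csharp
// NOTE: assumes a Send overload that accepts base64-encoded images;
// verify against your OllamaSharp version
var imageBytes = await File.ReadAllBytesAsync("photo.jpg");
var imageAsBase64 = Convert.ToBase64String(imageBytes);
await chat.Send("What is shown in this image?", new[] { imageAsBase64 });
```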
This project ships a full-featured demo console covering every endpoint the Ollama API exposes.
It is not only a great resource for learning about OllamaSharp; it can also be used to manage and chat with an Ollama host remotely. Image chat is supported for multimodal models.
Icon and name were reused from the amazing Ollama project.