comet-ml/opik

[FR]: Support for Local Models

Closed this issue · 5 comments

Willingness to contribute

No. I can't contribute this feature at this time.

Proposal summary

Provide support for local models via Ollama or LM Studio.

Motivation

Evaluating the performance of locally deployed models could add a lot of value. Being able to evaluate a fine-tune, or a model running inside a custom workflow, would make this tool extremely useful for local prototyping and testing.

Hey @Mr-Moonsilver

This is something we are actively looking into. How are you using Ollama today? Are you using it from the command line or through the Python SDK, for example?

@jverre that is really exciting indeed! I'm using it almost exclusively through the Python SDK. I am hosting models locally, some of which are fine-tuned, and using the Ollama API to interact with them.
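For context, a minimal sketch of my setup looks something like this (the model name is just a placeholder for one of my fine-tunes):

    import ollama

    # Chat with a locally hosted model through the Ollama Python SDK
    response = ollama.chat(
        model="my-finetune",  # placeholder: any model pulled into Ollama
        messages=[{"role": "user", "content": "Summarize this document."}],
    )
    print(response["message"]["content"])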

Thanks @Mr-Moonsilver, let me take a look and get back to you

Hey @Mr-Moonsilver
I took a look and found three different ways to integrate Ollama with Opik! I've created a new documentation page with more information about these here: https://www.comet.com/docs/opik/tracing/integrations/ollama
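For example, one of the approaches covered there goes through Ollama's OpenAI-compatible endpoint combined with Opik's OpenAI integration. A rough sketch, assuming the default local Ollama port and a model you've already pulled:

    from openai import OpenAI
    from opik.integrations.openai import track_openai

    # Ollama exposes an OpenAI-compatible API on localhost:11434
    client = OpenAI(
        base_url="http://localhost:11434/v1",
        api_key="ollama",  # required by the client, ignored by Ollama
    )
    client = track_openai(client)  # log every call as an Opik trace

    response = client.chat.completions.create(
        model="llama3.1",  # any model available in your local Ollama
        messages=[{"role": "user", "content": "Hello from Opik!"}],
    )
    print(response.choices[0].message.content)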

Let me know what you think

I think this is amazing. Thank you very much!