How does Skyvern integrate with Ollama + LiteLLM?
alexsiu398 opened this issue · 20 comments
Is there a tutorial on how to configure Ollama + LiteLLM to work with Skyvern? How can Skyvern work with a local LLM?
Here's an example where @ykeremy built out Bedrock support within Skyvern:
https://github.com/Skyvern-AI/skyvern/pull/251/files
Are you open to opening a PR for Ollama + LiteLLM? We'd love a contribution here!
Ignore the files in the experimentation module. The other configs are all you need!
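For anyone picking this up: LiteLLM already knows how to route `ollama/<model>` requests to a local Ollama server, so most of the work is wiring up config the way the Bedrock PR does. Here's a rough sketch of what the registration might look like; the import paths and `LLMConfig` signature are taken from that PR and may have drifted, and `ENABLE_OLLAMA` / `OLLAMA_SERVER_URL` are hypothetical settings, not existing Skyvern code:

```python
# Sketch only -- modeled on the Bedrock PR linked above; the ENABLE_OLLAMA
# flag and OLLAMA_SERVER_URL setting are assumptions, not existing Skyvern code.
from skyvern.forge.sdk.api.llm.config_registry import LLMConfigRegistry
from skyvern.forge.sdk.api.llm.models import LLMConfig
from skyvern.forge.sdk.settings_manager import SettingsManager

settings = SettingsManager.get_settings()

# Hypothetical flag, mirroring how ENABLE_BEDROCK gates the Bedrock configs.
if settings.ENABLE_OLLAMA:
    LLMConfigRegistry.register_config(
        "OLLAMA_LLAMA3",  # hypothetical registry key
        LLMConfig(
            "ollama/llama3",        # LiteLLM routes this prefix to a local Ollama server
            ["OLLAMA_SERVER_URL"],  # hypothetical required setting
            supports_vision=False,  # llama3 is text-only; llava would differ here
            add_assistant_prefix=False,
        ),
    )
```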
Nice. GPT-4 cost me 60 USD in 3 days. :( Ollama is awesome! I don't know how to help!
@santiagoblanco22 we would love a contribution here!! Or maybe we can ask for people's help in our Discord?
GPT-4 is super expensive. Try it with Claude 3 Sonnet instead.
hi, I'm currently trying to add it. :)
Do you think we should allow all Ollama models? In setup.sh, should we ask the user for a specific model name (as a string), or offer a numbered choice like we do for Anthropic, with just llama3/mistral and maybe llava? (See the LiteLLM sketch below.)
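On the string-vs-numbered-choice question: LiteLLM treats everything after the `ollama/` prefix as an opaque model name, so a free-form string would cover any model the user has already pulled. A minimal sanity check, assuming a default Ollama server on localhost:11434 and that `llama3` has been pulled:

```python
# Quick check that a locally pulled Ollama model answers through LiteLLM.
# "ollama/llama3" is just an example; any `ollama pull`-ed model name works,
# which is an argument for accepting a free-form model string in setup.sh.
import litellm

response = litellm.completion(
    model="ollama/llama3",              # "ollama/" prefix tells LiteLLM to use Ollama
    api_base="http://localhost:11434",  # default Ollama server address
    messages=[{"role": "user", "content": "Reply with the word: pong"}],
)
print(response.choices[0].message.content)
```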
FYI, for now it seems that most models available on Ollama are not good enough for Skyvern, at least on my machine, so it seems pointless to add models that would not work well.
Maybe it could work with a 34B/70B model with no quantization, but you would need a very beefy setup; at that point you'd probably be better off using Bedrock/Anthropic, IMO.
This issue is stale because it has been open for 30 days with no activity.
This issue was closed because it has been inactive for 14 days since being marked as stale.