ehsan2003 opened this issue 9 months ago · 1 comment
Is it possible to use ChatGPT (with an API key) instead of a local model?
With the new OpenAI-compatible Ollama endpoints, this is definitely possible. I'll have to look into any potential downsides of changing the endpoints we call.
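Since both the OpenAI API and Ollama's OpenAI-compatible endpoints accept the same `/v1/chat/completions` request shape, switching is mostly a matter of swapping the base URL and API key. A minimal stdlib sketch of the idea (the helper name and model strings are illustrative, not from this project):

```python
import json
import urllib.request

def chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completion request; only base_url, api_key, and model
    differ between the hosted OpenAI API and a local Ollama server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Ollama accepts any bearer token; OpenAI requires a real key.
            "Authorization": f"Bearer {api_key}",
        },
    )

# Same code path, different targets:
openai_req = chat_request("https://api.openai.com", "sk-your-key", "gpt-4o-mini", "hi")
ollama_req = chat_request("http://localhost:11434", "ollama", "llama3", "hi")
```

So the main downside to investigate is behavioral differences behind the shared interface (model names, streaming details, rate limits), not the request format itself.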