JudiniLabs/code-gpt-docs

Can't specify Ollama endpoint URL in CodeGPT extension for VSCode

Closed this issue · 1 comment

I'm running Ollama in a container on a bare-metal Linux server with an NVIDIA GPU (GeForce RTX 3060 12GB). This server is separate from my Windows 11 developer workstation.

I installed the CodeGPT extension on my dev workstation and tried selecting Ollama as the provider. However, there's no option to specify the URL of my Linux server.

Question: How do I point CodeGPT at my remote Ollama service, e.g. http://linuxserver.local:11434?

This feedback was already submitted in #227, so I'll close this issue. It looks like it's still not possible.
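Since the extension offers no URL field, a common workaround (not from this thread, just a sketch under the assumptions below) is to forward the server's port to the workstation so tools that assume the default `localhost:11434` endpoint reach the remote instance. The host name `linuxserver.local` and the SSH user are hypothetical placeholders; adjust them for your network:

```shell
# Hypothetical remote host running Ollama (adjust to your setup).
OLLAMA_HOST="linuxserver.local"
OLLAMA_PORT=11434

# 1) Sanity check: Ollama's /api/tags endpoint lists installed models,
#    so a successful response confirms the service is reachable.
curl -s "http://${OLLAMA_HOST}:${OLLAMA_PORT}/api/tags" \
  || echo "Ollama not reachable at ${OLLAMA_HOST}:${OLLAMA_PORT}"

# 2) Workaround: forward local port 11434 to the server over SSH, so the
#    extension's default localhost:11434 hits the remote Ollama instance.
#    Run from the Windows workstation (OpenSSH ships with Windows 11);
#    "user" is a placeholder account on the Linux server:
# ssh -N -L 11434:localhost:11434 user@linuxserver.local
```

Note the tunnel must stay open while the extension is in use; the server's Ollama container also needs to listen on an interface SSH can reach (the forward above targets `localhost` on the server side).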