Can't specify Ollama endpoint URL in CodeGPT extension for VSCode
Closed · 1 comment
pcgeek86 commented
I'm running Ollama in a container on a bare-metal Linux server with an NVIDIA GPU (GeForce RTX 3060, 12 GB). This server is separate from my Windows 11 developer workstation.
I installed the CodeGPT extension on my dev workstation and tried selecting Ollama as the provider. However, there's no option to specify the URL of my Linux server.
Question: How do I specify the URL of my Ollama service? e.g., http://linuxserver.local:11434
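Independent of the extension setting, a common gotcha with this setup is that Ollama binds only to localhost by default, so a remote workstation can't reach it at all. Below is a sketch of how to expose the containerized server and verify connectivity from the workstation; the hostname `linuxserver.local` is taken from the question above, and the `docker run` flags follow Ollama's published container instructions:

```shell
# On the Linux server: run the Ollama container bound to all interfaces.
# OLLAMA_HOST is Ollama's documented bind-address setting.
docker run -d --gpus=all \
  -e OLLAMA_HOST=0.0.0.0:11434 \
  -p 11434:11434 \
  -v ollama:/root/.ollama \
  --name ollama ollama/ollama

# From the Windows workstation: confirm the endpoint is reachable
# before troubleshooting any editor extension. /api/tags lists the
# locally available models as JSON.
curl http://linuxserver.local:11434/api/tags
```

If the `curl` call returns a model list, the server side is fine and the remaining question is purely how to point the extension at that URL.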