A proxy worker for using Ollama in Cursor

This is a proxy worker for using Ollama in Cursor. It is a simple server that forwards requests to the Ollama server and returns the responses.

When we use LLM prediction in the Cursor editor, the editor sends the data to the official Cursor server, and that server sends the data to the Ollama server. Therefore, even if the endpoint is set to localhost in the Cursor editor configuration, the Cursor server cannot send requests to the local server. So we need a proxy worker that can forward the data to the Ollama server.
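
To make the idea concrete, here is a minimal sketch of such a forwarder in Deno. It is not the actual curxy implementation; the port is arbitrary, and it assumes Ollama is listening on its default address, http://localhost:11434.

```typescript
// Minimal forwarding sketch (hypothetical, not the actual curxy code).
// Assumes Ollama listens on its default address, http://localhost:11434.
const OLLAMA_URL = "http://localhost:11434";

Deno.serve({ port: 62192 }, async (req) => {
  const incoming = new URL(req.url);
  // Re-point the same path and query at the local Ollama server.
  const target = new URL(incoming.pathname + incoming.search, OLLAMA_URL);
  return await fetch(target, {
    method: req.method,
    headers: req.headers,
    // Buffer the request body for simplicity; a real proxy would stream it.
    body: req.body ? await req.arrayBuffer() : undefined,
  });
});
```
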
Requirements:

- Deno
- Ollama server

Usage:

- Launch the Ollama server.
- Launch curxy:

  `deno run -A jsr:@ryoppippi/curxy`

If you want to limit access to the Ollama server, you can set the OPENAI_API_KEY environment variable:

    OPENAI_API_KEY=your_openai_api_key deno run -A jsr:@ryoppippi/curxy
    Listening on http://127.0.0.1:62192/
    ◐ Starting cloudflared tunnel to http://127.0.0.1:62192    5:39:59 PM
    Server running at: https://remaining-chen-composition-dressed.trycloudflare.com

You can get the public URL hosted by Cloudflare.
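
For illustration, a key-based restriction like the one mentioned above could be sketched as follows. This is an assumption about how such a check might look, not curxy's actual logic; it assumes the client sends the key in a standard `Authorization: Bearer <key>` header (the OpenAI convention).

```typescript
// Hypothetical access check (not curxy's real implementation).
const API_KEY = Deno.env.get("OPENAI_API_KEY");

function isAuthorized(req: Request): boolean {
  // With no key configured, the proxy stays open.
  if (!API_KEY) return true;
  const header = req.headers.get("authorization") ?? "";
  return header === `Bearer ${API_KEY}`;
}

Deno.serve({ port: 62192 }, (req) => {
  if (!isAuthorized(req)) {
    return new Response("Unauthorized", { status: 401 });
  }
  // ...otherwise forward the request to Ollama as in the earlier sketch.
  return new Response("OK");
});
```
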
- Enter the URL provided by curxy, with /v1 appended to it, into the "Override OpenAI Base URL" section of the Cursor editor configuration (a verification sketch follows these steps).
- Add the model names you want to use to the "Model Names" section of the Cursor editor configuration.
- (Optional) If you want to restrict access to this proxy server for security reasons, you can set the OPENAI_API_KEY environment variable, which enables access restrictions based on the key.
- Enjoy!
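
Once everything is set up, you can sanity-check the endpoint from outside Cursor. The snippet below is only an illustration: the tunnel URL, API key, and model name are placeholders for your own values, and it calls the OpenAI-compatible /v1/chat/completions route that Ollama exposes.

```typescript
// End-to-end check against the proxy's public URL (all values are placeholders).
const BASE_URL = "https://your-tunnel.trycloudflare.com/v1"; // curxy URL + "/v1"
const API_KEY = "your_openai_api_key"; // only needed if OPENAI_API_KEY is set

const res = await fetch(`${BASE_URL}/chat/completions`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${API_KEY}`,
  },
  body: JSON.stringify({
    model: "llama3.1", // a model you have pulled in Ollama and added in Cursor
    messages: [{ role: "user", content: "Hello from the proxy!" }],
  }),
});
console.log(await res.json());
```

If the call returns a chat completion, the proxy, the tunnel, and the model name are all wired up correctly.
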
Also, you can see the help message by running `deno run -A jsr:@ryoppippi/curxy --help`.

License: MIT