Code completion LSP for Helix utilising the OpenAI chat API.
- No dependencies
- Straight completion. No code improvements/explanations.
- No key bindings, just automatic completion suggestions.
You can configure helix-gpt either by exporting the environment variables below, or by passing the command-line options directly to helix-gpt in the Helix configuration step.
Environment variables:

```sh
OPENAI_MODEL=gpt-3.5-turbo-16k                              # optional
OPENAI_API_KEY=123                                          # required
OPENAI_MAX_TOKENS=7000                                      # optional
OPENAI_CONTEXT="A terrible code completion assistant"       # optional
OPENAI_ENDPOINT=https://api.openai.com/v1/chat/completions  # optional
LOG_FILE=/app/debug-helix-gpt.log                           # optional
```
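For example, a minimal shell setup (assuming you launch Helix from the same shell; the key below is a placeholder) might look like:

```sh
# Only OPENAI_API_KEY is required; everything else falls back to defaults
export OPENAI_API_KEY=123
export OPENAI_MODEL=gpt-3.5-turbo-16k
hx .
```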
Command-line arguments (add these to the `command = "helix-gpt"` entry in the Helix configuration below):

```sh
--openaiModel gpt-3.5-turbo --openaiKey 123 --logFile /app/debug-helix-gpt.log --openaiContext "A terrible code completion assistant"
```
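For instance, a sketch of passing options as arguments in `languages.toml` instead of using environment variables (the values here are placeholders, not defaults):

```toml
[language-server.gpt]
command = "helix-gpt"
args = ["--openaiKey", "123", "--openaiModel", "gpt-3.5-turbo"]
```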
TypeScript example for `.helix/languages.toml` (tested with Helix 23.10; older versions may not support multiple language servers per language):
```toml
[language-server.gpt]
command = "helix-gpt"

[language-server.ts]
command = "typescript-language-server"
args = ["--stdio"]
language-id = "javascript"

[[language]]
name = "typescript"
language-servers = ["ts", "gpt"]
```
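The same pattern should extend to other languages. As a hypothetical example, a Python setup (assuming `python-lsp-server` is installed and reusing the `gpt` server defined above) might look like:

```toml
[language-server.pylsp]
command = "pylsp"

[[language]]
name = "python"
language-servers = ["pylsp", "gpt"]
```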
If you choose not to use the precompiled binary, modify the `[language-server.gpt]` entry to run the script with Bun:

```toml
[language-server.gpt]
command = "bun"
args = ["run", "/app/helix-gpt.js"]
```
helix-gpt was made to run with Bun, but you can find a binary below with the runtime included.
Without Bun (precompiled binary with the runtime included):

```sh
wget https://github.com/leona/helix-gpt/releases/download/0.1/helix-gpt-0.1-x86_64-linux.tar.gz -O /tmp/helix-gpt.tar.gz \
  && tar -zxvf /tmp/helix-gpt.tar.gz \
  && mv helix-gpt-0.1-x86_64-linux /usr/bin/helix-gpt \
  && chmod +x /usr/bin/helix-gpt
```
With Bun (requires the `bun` / `args` configuration from the previous step; adjust the path in `args` to wherever you save the file):

```sh
wget https://github.com/leona/helix-gpt/releases/download/0.1/helix-gpt-0.1.js -O helix-gpt.js
```
If you are having issues, check both the helix-gpt and Helix log files (the helix-gpt log path depends on your `LOG_FILE` setting; the Helix log path below assumes the root user):

```sh
tail -f /root/.cache/helix/helix.log
tail -f /app/helix-gpt.log
```
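If you need more detail from Helix itself, its verbosity flag (a standard Helix option, not part of helix-gpt) can help:

```sh
hx -v    # repeat -v to increase Helix's log verbosity further
```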
Special thanks to rsc1975 for their Bun Dockerfile.

Todo:
- Copilot support
- Self hosted model support
- inlineCompletionProvider (if/when Helix gets support)
- Error fixing assistant