Issues
cargo install error when updating Rust to 1.80.0
#101 opened by haunt98 · 0 comments
LSP compliance: textDocument/completion
#108 opened by s1n7ax · 1 comment
ollama backend does not support API keys
#106 opened by Krakonos · 1 comment
Deepseek Coder not working
#92 opened by rhusiev · 2 comments
Sometimes causes a bug on Windows
#45 opened by mikeshi80 · 1 comment
Use as backend for chat-style UI
#98 opened by raine · 2 comments
Can't process response from llamacpp server
#97 opened by gergap · 1 comment
Respect XDG environment variables
#90 opened by life00 · 2 comments
Can't accept completions
#87 opened by Freaksed · 2 comments
Proposal: Launching LLM server as a daemon
#89 opened by blmarket · 0 comments
[LLM] missing field `request_params`
#93 opened by Terr2048 · 1 comment
Completions not displaying in some cases
#63 opened by Wats0ns · 2 comments
Cannot build testbed on Windows
#54 opened by noahbald · 3 comments
codellama unusable with llm-ls 0.5.1
#76 opened by williamspatrick · 8 comments
refactor: adaptor list should be an enum
#69 opened by McPatate · 3 comments
[Suggestion] Metrics support
#56 opened by DanielAdari · 2 comments
won't work on GLIBC==2.31
#41 opened by bonuschild · 3 comments
Emacs support?
#55 opened by NightMachinery · 1 comment
Support for the Helix editor
#49 opened by hemedani · 3 comments
feat: add support for ollama
#17 opened by McPatate · 0 comments
feat: add support for llama.cpp
#28 opened by McPatate · 3 comments
Help: Starting
#53 opened by aemonge · 2 comments
[BUG] Server doesn't start after reboot
#37 opened by mdcrivello · 4 comments
Too much logging
#35 opened by cmosguy · 1 comment
feat: add support for self-signed certificates
#36 opened by SteiMi · 4 comments
What's the purpose of this project?
#32 opened by ramsey-coding · 4 comments
[BUG] Server doesn't start on NixOS
#24 opened by erkkimon · 0 comments
Use tokenizer to fit context window
#4 opened by McPatate