Issues
- Where should I put the config file? (#109, opened by jm33-m0, 0 comments)
- How to use proxy env var (#108, opened by SethARhodes, 1 comment)
- Inconsistent Virtual Text Placement with Tabs (#107, opened by Nimrod0901, 5 comments)
- Check for llm-ls in PATH? (#64, opened by teto, 0 comments)
- How to use the OpenAI API? (#106, opened by 4t8dd, 7 comments)
- `Tab` key not usable in insert mode (#61, opened by bogdan-the-great, 0 comments)
- Chatbot with TUI (#104, opened by metal3d, 0 comments)
- Add system prompt for FIM and other parameters (#103, opened by meicale, 3 comments)
- Feature request: nvim-cmp support (#56, opened by bvolkmer, 1 comment)
- Expose callbacks (#66, opened by teto, 3 comments)
- [Feat]: Improve DX (#96, opened by AlejandroSuero, 5 comments)
- llm.nvim does not attach to the buffer (#100, opened by rhusiev, 5 comments)
- Ollama not working (#93, opened by nfwyst, 3 comments)
- Error starting llm-ls (#86, opened by yduanBioinfo, 12 comments)
- Neovim 0.10.0 support (#88, opened by roman3pm, 2 comments)
- Extremely slow on completion (#83, opened by gzfrozen, 7 comments)
- Can't get to work with Ollama (#79, opened by Bios-Marcel, 2 comments)
- Rate limit error on locally deployed model (#76, opened by V4G4X, 0 comments)
- No LSP with LLM (#85, opened by Freaksed, 2 comments)
- Can't use completions (#77, opened by Freaksed, 1 comment)
- Unreachable LLM server blocks UI (#71, opened by RemcoSchrijver, 0 comments)
- README suggests Ollama should work but it does not (#73, opened by Amzd, 3 comments)
- Could this plugin work with HuggingChat? (#17, opened by Haze-sh, 4 comments)
- LSP server not started properly (#65, opened by teto, 2 comments)
- Add ability to specify file types to attach to (#54, opened by dsully, 2 comments)
- Start with auto-suggestion off (#51, opened by Dranaxel, 4 comments)
- LLM Error: ['Stop'] not used (#35, opened by gdnaesver, 4 comments)
- Change authentication mechanism (#10, opened by McPatate, 0 comments)
- Any way to add the plugin with vim-plug? (#7, opened by cxwx)