Issues
extract pattern is brittle for custom prompts that require Actions other than "display"
#42 opened by johnsaigle
Feed the project directory as context to the model
#41 opened by dzdidi
How to continue a prompt
#39 opened by awptechnologies
Problem with plugins
#38 opened by laplantePierre
[Enhancement] Edit before send
#37 opened by keesj
Setup with codestral and packer.nvim
#35 opened by nanvenomous
Error on $sel prompts when the text contains %
#25 opened by amfl (see the note after this list)
[Enhancement] Markdown Preview Support
#33 opened by snoweuph
Error text appears every time I do anything
#30 opened by mclarkson
Stop Server not working
#22 opened by mvaldes14
Connection error on Windows 10
#19 opened by chutao
Streaming response
#27 opened by traverseda (see the note after this list)
Remote connection fails with an error message: Error: model 'mistral' not found, try pulling it first
#17 opened by goshng (see the note after this list)
Does it make sense to store the response in a file?
#24 opened by kbwhodat
Is it possible to use ollama in autocomplete mode?
#23 opened by dpecos
Variable for the word under the cursor
#18 opened by goshng (see the note after this list)
ChatGPT integration
#21 opened by ehsan2003
Add chatting support?
#7 opened by sandangel
macOS: vim.schedule error
#14 opened by alankritjoshi
RAG support
#9 opened by sandangel
Error: Column index is too high
#6 opened by madsamjp
Map a keymap to run a specific prompt
#3 opened by gerazov (see the note after this list)
Telescope extension
#2 opened by gerazov