Issues
block context should consider child blocks
#30 opened by Ingegneus - 0
Default formatting options for LLM generated text
#29 opened by ej159 - 1
API Endpoint `/api/generate` Not Supported by `llama.cpp` – Request for Compatibility
#27 opened by vanabel - 0
Feature Request: Add support for Open WebUI
#26 opened by working-name - 1
Summarize images? OCR?
#25 opened by Oobert - 3
Multilanguage support
#24 opened by fabiojust - 2
Request: a spinner or progress bar for answers
#19 opened by BradKML - 7
What if we could query a PDF, e.g. ask it questions or ask it to generate revision questions
#12 opened by minfuel - 1
Ollama on Windows without WSL
#18 opened by BradKML - 2
Q: Can Ollama read the URLs from the blocks?
#20 opened by BradKML - 2
can ollama-context-menu do "summarize from block"
#17 opened by happy15 - 5
All Ollama commands return "undefined"
#14 opened by mrcn - 3
Ollama doesn't work in default journal
#4 opened by 0nxku - 2
add support for STT and TTS
#11 opened by minfuel - 1
Couldn't fulfill request make sure you don't have a typo in the name of the model or the host url
#3 opened by technicolourdream - 1
Request: Retain Prompt
#6 opened by mtremoulet - 2