ollama-webui/ollama-webui-lite
This repo is no longer maintained; please use our main Open WebUI repo instead.
Svelte · MIT License
Issues
When will uploading image files be supported, to enable llava model support?
#30 opened by itcareer - 1
Force Ollama to run on the macOS GPU core - flag requirements: ollama run llama2-uncensored - GPU<>
#24 opened by akramIOT - 0
[Feature Request] Why not `npx owui`?
#27 opened by jikkuatwork - 6
Error: ollama version is 0
#21 opened by Matrixsun - 1
Doesn't connect to Ollama when accessing localhost via a Cloudflare tunnel, or via an IP address with an open port over the internet
#25 opened by mountainmonkey2 - 0
Is it possible to use vLLM without Ollama?
#23 opened by juud79 - 1
Feature Request: Save Chats in Web Interface
#22 opened by Arthur-Stuhl - 1
feat: release with prebuilt frontend
#6 opened by tjbck - 2
doc: exposing ollama-webui-lite on the internet
#14 opened by c9482 - 3
Not receiving any response to a chat request
#13 opened by c9482 - 0
doc: list good use cases
#11 opened by tjbck - 2
Route for chat appears to be missing
#9 opened by Flowm - 1
When?
#2 opened by donuts-are-good - 0
feat: modelfiles & prompts integration
#7 opened by tjbck - 1
Stable Diffusion integration or something?
#1 opened by andzejsp