varunvasudeva1/llm-server-docs
Documentation on setting up an LLM server on Debian from scratch, using Ollama/vLLM, Open WebUI, OpenedAI Speech, and ComfyUI.