EricLBuehler/candle-vllm
Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
Rust · MIT License
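Since the server exposes an OpenAI-compatible API, any OpenAI-style client can talk to it. Below is a minimal sketch of a chat-completion request body; the base URL, port, and model name are assumptions for illustration and may differ from candle-vllm's actual defaults.

```python
import json

# Hypothetical local endpoint; the actual host, port, and path
# depend on how the candle-vllm server is launched.
BASE_URL = "http://localhost:2000/v1/chat/completions"

# An OpenAI-style chat-completion request body.
payload = {
    "model": "llama-7b",  # placeholder model name, not a candle-vllm default
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    "max_tokens": 64,
}

body = json.dumps(payload)
print(body)
```

The resulting JSON can be sent with `curl -d "$body" -H "Content-Type: application/json" $BASE_URL`, or via any OpenAI SDK pointed at the local base URL.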