mostlygeek/llama-swap
On-demand model switching with llama.cpp (or other OpenAI compatible backends)
Language: Go · License: MIT
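As a sketch of how on-demand switching is typically wired up: llama-swap sits in front of one or more `llama-server` (or other OpenAI-compatible) processes and starts or stops them based on the `model` field of incoming requests. The config below follows the YAML shape shown in the project's documentation, but the model names, file paths, and ports are illustrative assumptions, not values from this page:

```yaml
# config.yaml — hypothetical example; adjust paths and ports for your setup
models:
  "qwen-7b":
    # command llama-swap runs when a request names this model
    cmd: llama-server --port 9001 -m /models/qwen-7b.gguf
    # where llama-swap forwards the request once the server is up
    proxy: http://127.0.0.1:9001
  "llama-8b":
    cmd: llama-server --port 9002 -m /models/llama-8b.gguf
    proxy: http://127.0.0.1:9002
```

With a config like this, a client simply sends a standard OpenAI-style request (e.g. to `/v1/chat/completions`) with `"model": "qwen-7b"`; llama-swap launches the matching backend if it is not already running, swapping out any other model first.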