jlonge4/local_llama
This repo showcases how to run a model locally and offline, free of OpenAI dependencies.
Python · Apache-2.0 license