jlonge4/local_llama
This repo showcases how to run a model locally and offline, free of OpenAI dependencies.
Python · Apache-2.0
Issues (7)
- ERROR: Failed building wheel for faiss-cpu (#15 opened by plsnotracking, 0 comments)
- problem with ollama (#14 opened by odevroed, 3 comments)
- m2 error upon first running (#12 opened by SuperCowboyDinosaur, 2 comments)
- Awesome project! (#13 opened by mountainrocky, 0 comments)
- FileNotFoundError: [Errno 2] No such file or directory: 'C:/Users/../GPT_INDEXES/None/docstore.json' (#11 opened by abhinandan1602, 0 comments)
- How can I run local-llama with Multi-GPU (#6 opened by Calmepro777, 1 comment)
- FileNotFoundError and model LLM location (#4 opened by brinrbc, 0 comments)