afaqueumer/DocQA

Could not load Llama model

Opened this issue · 2 comments

I tried running this application on both Windows and Ubuntu but get the same error.

Can someone suggest what's wrong here?

File "/home/soft/.local/share/virtualenvs/DocQA-YcyGHhWa/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
exec(code, module.__dict__)
File "/home/soft/Mine/fakellm/DocQA/app.py", line 42, in <module>
llm = LlamaCpp(model_path="./models/llama-7b.ggmlv3.q4_0.bin")
File "/home/soft/.local/share/virtualenvs/DocQA-YcyGHhWa/lib/python3.9/site-packages/langchain/load/serializable.py", line 75, in __init__
super().__init__(**kwargs)
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCpp
__root__
Could not load Llama model from path: ./models/llama-7b.ggmlv3.q4_0.bin. Received error (type=value_error)
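Since the pydantic ValidationError hides the underlying llama.cpp failure, it can help to sanity-check the model file outside Streamlit first. A minimal sketch (the helper name and the magic-byte hints are my assumptions, not part of DocQA):

```python
import os

# File magics (an assumption worth verifying against your own files):
# GGUF models begin with the ASCII bytes "GGUF"; ggml v3 ("ggjt") models
# begin with "tjgg", the little-endian encoding of the ggjt magic.
MAGICS = {b"GGUF": "gguf", b"tjgg": "ggml (ggjt)"}

def inspect_model(path):
    """Return a short diagnosis string for a model path (hypothetical helper)."""
    if not os.path.isfile(path):
        # Relative paths resolve against the directory Streamlit was launched from.
        return "missing"
    if os.path.getsize(path) == 0:
        return "empty"
    with open(path, "rb") as f:
        magic = f.read(4)
    return MAGICS.get(magic, "unknown format")

print(inspect_model("./models/llama-7b.ggmlv3.q4_0.bin"))
```

If this prints `ggml (ggjt)` but your installed llama-cpp-python is a recent release, the format rather than the path may be the problem: recent releases load only GGUF, so a ggmlv3 `.bin` fails with exactly this "Could not load Llama model from path" error even when the file is present and readable.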

I tried lots of different paths; maybe it's a permissions issue? So far I've gotten to the upload-documents page using the workaround below:

mkdir llama
git clone https://github.com/afaqueumer/DocQA.git
cd Doc*
python -m venv env
env/Scripts/activate
pip install -r requirements.txt
python -m streamlit run app.py
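Note that `env/Scripts/activate` is the Windows venv layout. On Ubuntu the activate script lives under `env/bin` and must be sourced, so the equivalent sequence (my adaptation, not from the repo) is:

```shell
# Same workaround on Ubuntu: venvs place activate under env/bin,
# and the script is sourced rather than executed.
python -m venv env
source env/bin/activate
pip install -r requirements.txt
python -m streamlit run app.py
```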

Thanks for the comments. It's not a permissions issue; I have verified that. I have tried multiple things on both Windows and Ubuntu but keep hitting the same error.