bug: can't use a model that was pulled manually
Closed this issue · 2 comments
script-money commented
Problem
I followed the guide and pulled the model manually, but the logs show the error below.
How to Reproduce
Follow the Notion guide: https://firstbatch.notion.site/How-to-Run-a-Node-ed2bef2c8eec4dd280286f2e081e51d2
- `git clone https://github.com/firstbatchxyz/dkn-compute-node`
- `cd dkn-compute-node`
- `cp .env.example .env`
- `ollama pull llama3.1:latest`
- modify `.env` (add private key)
- `./start.sh -m=llama3.1:latest`
Expected Behaviour
The container runs without error and can use Llama 3.1.
Version
v0.1.2
Additional context
- Set `OLLAMA_AUTO_PULL=true` and `DKN_MODELS=llama3.1:latest` in `.env`
- `ollama pull hellord/mxbai-embed-large-v1:f16`
- Run `./start.sh`

With these settings, the node uses `Ollama:llama3.1:latest` without error.
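For reference, the relevant `.env` entries could look like the sketch below. `DKN_MODELS` and `OLLAMA_AUTO_PULL` come from this thread; the private-key variable name is a hypothetical placeholder, so check `.env.example` in the repository for the actual name.

```
# .env — sketch of the relevant entries (not authoritative)
DKN_PRIVATE_KEY=<your-private-key>   # hypothetical name; see .env.example for the real variable
DKN_MODELS=llama3.1:latest           # model(s) the node should serve, per this thread
OLLAMA_AUTO_PULL=true                # let the node pull missing Ollama models automatically
```

With `OLLAMA_AUTO_PULL=true`, a manual `ollama pull` beforehand is no longer strictly required.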
erhant commented
Hey! We just enabled auto-pull, so this won't be a problem. P.S. There were issues with using local Ollama vs. Docker Ollama that may be related; those were also fixed just now.
> Follow the Notion guide: https://firstbatch.notion.site/How-to-Run-a-Node-ed2bef2c8eec4dd280286f2e081e51d2
> - `git clone https://github.com/firstbatchxyz/dkn-compute-node`
> - `cd dkn-compute-node`
> - `cp .env.example .env`
> - `ollama pull llama3.1:latest`
> - modify `.env` (add private key)
> - `./start.sh -m=llama3.1:latest`
Here, right between steps 4 and 5, we needed to run `ollama pull hellord/mxbai-embed-large-v1:f16` as well.
Would you like to try again with the latest updates? Then I can close this one.
erhant commented
Closing the issue; auto-pull solves this nevertheless.