firstbatchxyz/dkn-compute-node

bug: can't use a model that was pulled manually


Problem

I followed the guide and pulled the model manually, but the logs show the error below.

How to Reproduce

Follow the Notion guide: https://firstbatch.notion.site/How-to-Run-a-Node-ed2bef2c8eec4dd280286f2e081e51d2

  1. git clone https://github.com/firstbatchxyz/dkn-compute-node
  2. cd dkn-compute-node
  3. cp .env.example .env
  4. ollama pull llama3.1:latest
  5. modify .env (add private key; see the sketch after these steps)
  6. ./start.sh -m=llama3.1:latest
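
For step 5, the .env change is roughly the sketch below. The DKN_WALLET_SECRET_KEY name is an assumption here; check .env.example for the exact variable your version expects.

  # .env (excerpt) - variable name assumed, verify against .env.example
  DKN_WALLET_SECRET_KEY=<your_private_key>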

Expected Behaviour

The container runs without error and can use Llama 3.1.

Version

v0.1.2

Additional context

  1. Set OLLAMA_AUTO_PULL=true and DKN_MODELS=llama3.1:latest in .env
  2. ollama pull hellord/mxbai-embed-large-v1:f16
  3. Running ./start.sh then uses Ollama:llama3.1:latest without error (working configuration sketched below)
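
A minimal sketch of that working configuration, assuming the two variables go straight into .env as laid out in .env.example:

  # .env (excerpt) - workaround configuration from the steps above
  OLLAMA_AUTO_PULL=true
  DKN_MODELS=llama3.1:latest

  # pull the embedding model manually, then start the node
  ollama pull hellord/mxbai-embed-large-v1:f16
  ./start.sh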

hey! we just enabled auto-pull so this won't be a problem. P.S. there were issues when using a local Ollama vs. the Docker Ollama which may be related; those were also fixed just now.

Follow the Notion guide: https://firstbatch.notion.site/How-to-Run-a-Node-ed2bef2c8eec4dd280286f2e081e51d2

  1. git clone https://github.com/firstbatchxyz/dkn-compute-node
  2. cd dkn-compute-node
  3. cp .env.example .env
  4. ollama pull llama3.1:latest
  5. modify .env (add private key)
  6. ./start.sh -m=llama3.1:latest

Here, right between steps 4 and 5, we also needed to run ollama pull hellord/mxbai-embed-large-v1:f16.
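
A sketch of the corrected sequence with that extra pull inserted; everything else is unchanged from the steps above:

  git clone https://github.com/firstbatchxyz/dkn-compute-node
  cd dkn-compute-node
  cp .env.example .env
  ollama pull llama3.1:latest
  ollama pull hellord/mxbai-embed-large-v1:f16   # the step that was missing
  # add your private key to .env, then:
  ./start.sh -m=llama3.1:latest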

would you like to try again with the latest updates? then I can close this one

Closing the issue; auto-pull solves this in any case.