nuance1979/llama-server

RuntimeError: unexpectedly reached end of file


models.yml

model_home: /data/faqbotllama/models/
models:
  llama-7b:
    name: LLAMA-7B
    path: 7B/ggml-model-q4_0.bin  # relative to `model_home` or an absolute path

/usr/local/lib/python3.10/dist-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_home" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
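The pydantic warning is unrelated to the crash and can be silenced as the message suggests. A minimal sketch, assuming pydantic v2 and a hypothetical `Settings` model standing in for llama-server's own config class:

```python
from pydantic import BaseModel, ConfigDict

class Settings(BaseModel):
    # An empty tuple disables pydantic's default "model_" protected
    # namespace, so a field named `model_home` no longer triggers the
    # UserWarning seen above.
    model_config = ConfigDict(protected_namespaces=())

    model_home: str

settings = Settings(model_home="/data/faqbotllama/models/")
print(settings.model_home)
```

This only suppresses the warning; the RuntimeError below comes from loading the model file itself.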
llama.cpp: loading model from /data/faqbotllama/models/7B/ggml-model-q4_0.bin
Traceback (most recent call last):
  File "/usr/local/bin/llama-server", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/llama_server/server.py", line 169, in main
    globals()["model"] = Model(
  File "/usr/local/lib/python3.10/dist-packages/pyllamacpp/model.py", line 87, in __init__
    self._ctx = pp.llama_init_from_file(model_path, self.llama_params)
RuntimeError: unexpectedly reached end of file