innightwolfsleep/llm_telegram_bot

Problem in finding model paths

BurykinNikolay opened this issue · 1 comments

I keep running into the model directory error.

I put ggml-vic13b-q5_0.bin into llama-cpp-telegram_bot/models
Then I changed the path in telegram_llm_model_path.txt from "models<model_file_name.bin>" to "models/ggml-vic13b-q5_0.bin"
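To verify that the path in the config actually resolves, a minimal sketch like the following can help (run from the bot's root directory; the printed check is my own addition, not part of the bot):

```python
from pathlib import Path

# Hypothetical sanity check: confirm the path written in
# telegram_llm_model_path.txt points at an existing file.
cfg = Path("telegram_llm_model_path.txt")
if cfg.is_file():
    model_path = Path(cfg.read_text().strip())
    print("model file exists:", model_path.is_file(), "->", model_path)
else:
    print("config file not found; run this from the bot's root directory")
```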

After running python3 main.py I see this log:

[screenshot of the error log]

Hi!
Readme updated. It was incorrectly cloned from a neighbouring project.

First, move to the directory where main.py is placed (cd /usr/develop/text-generator-webui/extensions/llama-cpp-telegram_bot)
Then run python main.py

Perhaps you need to swap \ and / in the path.
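A quick sketch of what that swap looks like: on Linux the backslash is not a path separator, so a Windows-style path from the config has to be normalized before use (the `raw` value below is an example, not taken from the actual config):

```python
from pathlib import Path

# Windows-style path as it might appear in telegram_llm_model_path.txt
raw = r"models\ggml-vic13b-q5_0.bin"

# On Linux, "\" is just an ordinary character, so replace it explicitly.
normalized = raw.replace("\\", "/")
print(Path(normalized))  # models/ggml-vic13b-q5_0.bin
```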

P.S. Please give feedback on whether this helps. I haven't tested under Linux, so it's useful to know whether it works there.