foldl/chatllm.cpp

Model id for Mistral Nemo?

Closed · 1 comment

I may be missing something, but how am I supposed to run Mistral Nemo if I don't see its model_id anywhere? It says: Supported Models - 2024-07-17: Mistral Nemo. OK, but how do I write it? python chatllm.py -i -m :????????

I just uploaded them.

python chatllm.py -m :mistral-nemo
    ________          __  __    __    __  ___ 
   / ____/ /_  ____ _/ /_/ /   / /   /  |/  /_________  ____
  / /   / __ \/ __ `/ __/ /   / /   / /|_/ // ___/ __ \/ __ \
 / /___/ / / / /_/ / /_/ /___/ /___/ /  / // /__/ /_/ / /_/ /
 \____/_/ /_/\__,_/\__/_____/_____/_/  /_(_)___/ .___/ .___/
You are served by Mistral,                    /_/   /_/
with 12247782400 (12.2B) parameters.

You  > hi
A.I. > Hello! How can I assist you today? Let me know if you have any questions or just want to chat. 😊

For models that are not listed by model_downloader.py, you may convert them manually.
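For example, a minimal sketch of the manual route, assuming convert.py's -i/-t/-o options and that -m also accepts a path to a local quantized file (the exact flags and supported architectures are documented in the repo, so treat these as placeholders):

python convert.py -i path/to/your-model -t q8_0 -o quantized.bin
python chatllm.py -i -m quantized.bin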