This bot is powered by Ollama, so you can use any model you like, censored or uncensored. You can ping the bot, reply to its messages, interact with it through simple commands, get information from it, and more.
Currently, there is only one command:
/ask_ollama ask_string_text: str
- Ask Ollama something. Mentioning @bot or replying to the bot's messages works as well.
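Under the hood, the command essentially forwards your text to the locally running Ollama server and returns the model's reply. The repository has its own implementation; the sketch below only illustrates that flow, assuming the default Ollama HTTP endpoint and a placeholder model name:

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint
MODEL = "llama3"  # placeholder -- use whatever model you have pulled and configured

def ask_ollama(ask_string_text: str) -> str:
    """Send the user's prompt to the local Ollama server and return its reply."""
    payload = {"model": MODEL, "prompt": ask_string_text, "stream": False}
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_ollama("Why is the sky blue?"))
```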
You can change the model and other settings in configuration.py, and the bot token in .env.
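The exact option names are defined in the repository itself; as an illustration only (these variable names are hypothetical, not the project's actual settings), the split between the two files typically looks like this:

```python
# configuration.py -- model and generation settings (illustrative names only)
import os

from dotenv import load_dotenv  # python-dotenv; an assumption about how .env is loaded

load_dotenv()  # read variables from the local .env file into the environment

MODEL_NAME = "llama3"                    # any model pulled into your local Ollama
OLLAMA_HOST = "http://localhost:11434"   # default Ollama address
TEMPERATURE = 0.7                        # example generation parameter
BOT_TOKEN = os.getenv("BOT_TOKEN", "")   # secret token kept in .env, not in code
```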
To install, you will need Python 3.11 or higher, and you will need to create a .env file based on .env.example. Here are the instructions for installing locally without Docker:
Note: You must have Ollama installed locally, along with the model you want to use.
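For example, a model can be pulled ahead of time with Ollama's own CLI (llama3 here is just a placeholder):
$ ollama pull llama3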
$ git clone https://github.com/risknu/radist.git
$ cd radist && pip install -r requirements.txt
Licensed under the Apache License 2.0