🦙 Ollama Telegram bot, with advanced configuration

Primary language: Python · License: MIT


🦙 Ollama Telegram Bot

Chat with your LLM through a Telegram bot!
Feel free to contribute!




Features

Here are the features you get out of the box:

  • Fully dockerized bot
  • Response streaming without hitting rate limits, using a sentence-by-sentence method
  • Mention the bot with @ in a group chat to receive an answer
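The sentence-by-sentence streaming idea can be sketched roughly as follows. This is a minimal illustration, not the bot's actual implementation; `fake_token_stream` and the `send` callback are hypothetical stand-ins for the Ollama token stream and the Telegram send call:

```python
import re

def fake_token_stream():
    # Hypothetical stand-in for tokens streamed back from Ollama.
    for tok in ["Hello", " there", ".", " How", " can", " I", " help", "?"]:
        yield tok

def stream_by_sentence(tokens, send):
    # Buffer tokens and flush one message per completed sentence,
    # so each Telegram send stays well under the API rate limit.
    buf = ""
    for tok in tokens:
        buf += tok
        if re.search(r"[.!?]\s*$", buf):
            send(buf.strip())
            buf = ""
    if buf.strip():  # flush any trailing partial sentence
        send(buf.strip())

sent = []
stream_by_sentence(fake_token_stream(), sent.append)
print(sent)  # → ['Hello there.', 'How can I help?']
```

Flushing on sentence boundaries keeps the reply feeling live while sending far fewer messages than per-token updates would.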

Roadmap

  • Proper Docker Config
  • Add more API-related functions [System Prompt Editor, Ollama Version fetcher, etc.]
  • Redis DB integration
  • Implement history [Bot can't remember more than 1 prompt]

Prerequisites

  • Ollama (https://ollama.com)

Installation (Non-Docker)

  • Install the latest Python
  • Clone the repository
git clone https://github.com/ruecat/ollama-telegram
  • Install the requirements from requirements.txt
pip install -r requirements.txt
  • Enter all values in .env.example

  • Rename .env.example -> .env

  • Launch bot

python3 run.py

Installation (Docker-Compose)

  • Clone the repository
git clone https://github.com/ruecat/ollama-telegram
  • Enter all values in .env.example

  • Rename .env.example -> .env

  • Run ONE of the following docker compose commands to start:

    1. To run Ollama in a Docker container (optionally, uncomment the GPU part of the docker-compose.yml file to enable Nvidia GPU support):
    docker compose up --build -d
    
    2. To run Ollama from a locally installed instance (mainly for macOS, since the Docker image doesn't support Apple GPU acceleration yet):
    docker compose up --build -d ollama-telegram
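For reference, the GPU part mentioned in option 1 typically resembles the Compose device reservation below. This is a generic sketch; the repo's actual docker-compose.yml may differ, and it requires the NVIDIA Container Toolkit on the host:

```yaml
services:
  ollama:
    # Uncommenting a block like this exposes Nvidia GPUs to the container.
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```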
    

Environment Configuration

Parameter  | Description                                                                 | Required? | Default   | Example
TOKEN      | Your Telegram bot token. [How to get token?]                                | Yes       | yourtoken | MTA0M****.GY5L5F.****g*****5k
ADMIN_IDS  | Telegram user IDs of admins; they can change the model and control the bot. | Yes       | (none)    | 1234567890 or 1234567890,0987654321, etc.
USER_IDS   | Telegram user IDs of regular users; they can only chat with the bot.        | Yes       | (none)    | 1234567890 or 1234567890,0987654321, etc.
INITMODEL  | Default LLM                                                                 | No        | llama2    | mistral:latest, mistral:7b-instruct
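Putting the table together, a filled-in .env might look like the following. All values here are placeholders; substitute your own token and Telegram user IDs:

```
TOKEN=1234567890:your-telegram-bot-token
ADMIN_IDS=1234567890
USER_IDS=1234567890,0987654321
INITMODEL=mistral:latest
```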

Credits

Libraries used