This repository contains the implementation of the Sector AI Telegram Bot, an advanced chatbot designed to interact with users on Telegram. The secret sauce is LangChain. The bot supports various functionalities, including autoreply decision-making, structured output parsing for polls, multimodal image processing, emoji responses, code generation, and more. Below is a detailed description of all components of this repository.
To download and manage the recommended models, you will need to install Ollama. Follow these steps to install Ollama and download the models:
- Install Ollama: https://ollama.com/download
- Download the recommended models:

  ```shell
  ollama pull llama3.1:8b
  ollama pull llava-llama3:8b
  ```
Make sure you have enough storage space and a stable internet connection to download these models.
- Llama 3.1
- Llava Llama 3 (for vision processing)
- Gemma 2
- Mistral Nemo
- Almost anything will work for the basics. However, some models aren't powerful enough for the structured output parsing involved in the autoreply, decide, poll, and topic commands.
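To illustrate why structured output parsing demands a capable model: the bot needs the LLM's reply to come back as machine-parseable data, and a weaker model will often emit malformed JSON or drop required fields. The sketch below is illustrative only; the field names and validation are assumptions, not the repository's actual schema.

```python
import json

def parse_poll(model_reply: str) -> dict:
    """Parse and validate a poll that the model was asked to emit as JSON.

    Hypothetical schema for illustration: {"question": str, "options": [str, ...]}.
    """
    poll = json.loads(model_reply)  # raises ValueError on malformed output
    if not isinstance(poll.get("question"), str):
        raise ValueError("missing 'question' field")
    options = poll.get("options")
    if not isinstance(options, list) or len(options) < 2:
        raise ValueError("need at least two options")
    return poll

reply = '{"question": "Best language?", "options": ["Python", "Rust"]}'
print(parse_poll(reply)["options"])  # → ['Python', 'Rust']
```

If the model rambles instead of answering in JSON, `json.loads` raises, which is why the README recommends models strong enough to follow the structured format reliably.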
This project uses the `python-telegram-bot` library, with various command handlers and message handlers to manage different interactions. Below is a summary of each command:
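The routing pattern behind these commands can be pictured with a plain-Python analogy: each `/command` string maps to a handler function. This is a sketch of the pattern only, not the real `python-telegram-bot` API, and the handler names and replies are invented for illustration.

```python
# Minimal command-dispatch sketch: a registry mapping "/command" -> handler.
handlers = {}

def command(name):
    """Decorator that registers a handler under a command name."""
    def register(func):
        handlers[name] = func
        return func
    return register

@command("/start")
def start(args):
    return "Welcome to Sector AI!"

@command("/chat")
def chat(args):
    return f"You said: {args}"

def dispatch(message: str) -> str:
    # Split "/chat hello" into the command and its arguments.
    cmd, _, args = message.partition(" ")
    handler = handlers.get(cmd)
    return handler(args) if handler else "Unknown command"

print(dispatch("/chat hello"))  # → "You said: hello"
```

In the actual bot, `python-telegram-bot` plays the role of `dispatch`, invoking the registered `CommandHandler` for each incoming command.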
- Description: Sends a welcome message introducing the bot and its basic commands.
- Usage: `/start`
- Description: Initiates or continues a chat interaction with the user, using AI to generate responses based on the context of the conversation.
- Usage: `/chat [text]` (where `[text]` is the user's input)
- Description: Provides a summary of the current chat messages in context, potentially generated by the AI to quickly grasp the conversation’s essence.
- Usage: `/summarize`
- Description: Creates emojis based on user input or specific prompts related to emotions or themes relevant to the chat.
- Usage: `/emoji [prompt]` (where `[prompt]` is a description of what kind of emoji you want)
- Description: Creates a poll with multiple options based on user input, useful for gathering feedback.
- Usage: `/poll [question]` (where `[question]` is the description of the poll)
- Description: Creates a poll where users can vote on topics discussed recently in the chat context.
- Usage: `/topic`
- Description: Uses AI to decide between two possible outcomes based on user input.
- Usage: `/decide [prompt]` (where `[prompt]` is the question you want the bot to decide)
- Description: Generates a code snippet according to specific requirements or inputs provided by the user.
- Usage: `/code [specification]` (where `[specification]` describes what kind of code to generate, e.g., a Python function)
- Description: Renders a mock HTML website based on given specifications or details from the user.
- Usage: `/html [specification]` (where `[specification]` includes layout and content requirements)
- Description: Creates an SVG graphic according to detailed descriptions provided by users, often with humorous or nonsensical results.
- Usage: `/svg [description]` (where `[description]` can be anything from a simple shape to a complex scene)
- Description: Reports the current length of the chat context in tokens, which is useful for understanding memory and context-window constraints.
- Usage: `/tokens`
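The exact counting method used by the bot comes from the model's own tokenizer and is not shown here; as a rough mental model, English text averages around four characters per token. The heuristic below is an assumption for illustration only.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # The bot's real count comes from the model's tokenizer, not this.
    return max(1, len(text) // 4)

# Estimate the total context size across several messages.
context = ["Hello there!", "How can I help you today?"]
total = sum(estimate_tokens(m) for m in context)
print(total)  # → 9
```

A heuristic like this is only for intuition; when the reported token count approaches the model's context window, older messages stop fitting and `/clear` becomes useful.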
- Description: Enables or disables automatic replies triggered by the LLM decision. When enabled, the bot will automatically respond to user messages and images.
- Usage: `/autoreply` to toggle the feature
- Description: Resets the context of the conversation, effectively starting a new chat from scratch.
- Usage: `/clear`
- Description: Displays information about the current AI model being used by the bot.
- Usage: `/model`
- Description: Allows administrators to set or modify the system prompt that guides the behavior of the AI within the chat context.
- Usage: `/system [prompt]` (where `[prompt]` is the new system message you want the bot to follow)
- Description: Similar to setting a system prompt, but generates more detailed or specialized instructions for the AI based on the user's description.
- Usage: `/characterize [details]` (where `[details]` describes a character or persona for the LLM to roleplay as)
- Description: Allows administrators to switch or configure different AI models used by the bot, which can affect response quality and style.
To get started with the Sector AI Telegram Bot, follow these steps:
- Clone the repository:

  ```shell
  git clone https://github.com/roryeckel/sector-ai.git
  cd sector-ai
  ```
- Install dependencies:

  ```shell
  pip install -r requirements.txt
  ```
- Configure the bot:
  - Copy `default_config.json` to a new file outside the sector-ai directory (e.g., `config.json`).
  - Add your Telegram bot token and other configuration settings to the `config.json` file.
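As a starting point, a minimal `config.json` might look like the fragment below. The key names here are illustrative assumptions; check `default_config.json` in the repository for the actual schema.

```json
{
  "telegram_bot_token": "YOUR_TOKEN_HERE",
  "model": "llama3.1:8b"
}
```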
- Run the bot:
  - Go back to the parent directory of sector-ai (`cd ..`) and run the following command to start the bot:

    ```shell
    python -m sector-ai --config config.json
    ```