A simple Telegram bot that summarizes long-read articles.
To run this project, you will need to add the following environment variables to your .env file:

- `BOT_TOKEN` - Telegram bot token. Get one by creating a bot with @BotFather
- `MODEL_NAME` - Summarization model name (options: "sshleifer/distilbart-cnn-12-6", "facebook/bart-large-cnn", "facebook/bart-large-xsum"). Defaults to "sshleifer/distilbart-cnn-12-6"
- `LOG_LEVEL` - Logging level (options: "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"). Defaults to "INFO"
- `FROM_DOCKER` - Whether the app is running in Docker (options: 1, 0). Defaults to 0
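For example, a minimal .env might look like this (the token value is a placeholder; replace it with the one issued by @BotFather):

```
BOT_TOKEN=123456:your-token-here
MODEL_NAME=sshleifer/distilbart-cnn-12-6
LOG_LEVEL=INFO
FROM_DOCKER=0
```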
- Clone this repo
- Install/check the prerequisites
- Run the tests with `pytest` (the first run downloads the model, which takes a few minutes)
- Run the bot: `python summary_bot/bot.py`
- Send your Telegram bot a link to an article and get the summary!
See Deployment
This Telegram bot extracts the content of an article from a given URL and summarizes it for you with the help of AI.
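The extraction step can be done in many ways (the actual implementation in `summary_bot` may use a dedicated library); here is a minimal stdlib-only sketch of the idea: collect the text of `<p>` tags while skipping script/style content, then feed the joined text to the summarization model.

```python
from html.parser import HTMLParser


class ArticleTextExtractor(HTMLParser):
    """Collects text inside <p> tags and ignores <script>/<style> content."""

    def __init__(self):
        super().__init__()
        self._in_p = False
        self._skip_depth = 0   # nesting depth of script/style tags
        self._buf = []
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1
        elif tag == "p":
            self._in_p = True
            self._buf = []

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1
        elif tag == "p" and self._in_p:
            text = "".join(self._buf).strip()
            if text:
                self.paragraphs.append(text)
            self._in_p = False

    def handle_data(self, data):
        if self._in_p and not self._skip_depth:
            self._buf.append(data)


def extract_article_text(html: str) -> str:
    """Return the article body as paragraphs separated by blank lines."""
    parser = ArticleTextExtractor()
    parser.feed(html)
    return "\n\n".join(parser.paragraphs)
```

A real article page has boilerplate (navigation, ads) that a production extractor would also need to filter out; this sketch only shows the core HTML-to-text step.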
Useful links:

- transformers/examples/tensorflow/summarization at main · huggingface/transformers
- sn4kebyt3/ru-bart-large at main - a smaller version of facebook/mbart-large-50 with only Russian and English embeddings left
- Running containerized applications in Yandex Serverless Containers | Yandex Cloud - Events and webinars
- Streaming data analysis using serverless technologies | Yandex Cloud - Events and webinars
- Workshop: Building an interactive serverless application using WebSocket | Yandex Cloud - Events and webinars
- Workshop: Building a Telegram bot using serverless | Yandex Cloud - Events and webinars
- Long-polling - aiogram 3.2.0 documentation: long-polling is a technique that lets the Telegram server deliver updates when you don't have a dedicated IP address or port to receive webhooks, for example on a developer machine.
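Under the hood, long-polling is just a repeated `getUpdates` call that blocks until new updates arrive (aiogram wraps this for you). A hedged stdlib-only sketch, assuming `BOT_TOKEN` comes from the environment as in this project's .env; `poll_once` and `next_offset` are illustrative names, not part of aiogram:

```python
import json
import urllib.request

API_URL = "https://api.telegram.org/bot{token}/{method}"


def next_offset(updates):
    """Offset that confirms processed updates: highest update_id + 1."""
    if not updates:
        return None
    return max(u["update_id"] for u in updates) + 1


def poll_once(token, offset=None, timeout=30):
    """One long-poll request: blocks up to `timeout` seconds for new updates."""
    params = {"timeout": timeout}
    if offset is not None:
        params["offset"] = offset  # tells Telegram to drop already-seen updates
    req = urllib.request.Request(
        API_URL.format(token=token, method="getUpdates"),
        data=json.dumps(params).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout + 5) as resp:
        return json.load(resp)["result"]
```

A polling loop would call `poll_once` repeatedly, passing `next_offset(updates)` each time so updates are acknowledged and not redelivered.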
- Fast and reliable end-to-end testing for modern web apps | Playwright Python
- transformers/examples/pytorch/summarization/run_summarization_no_trainer.py at main · huggingface/transformers - fine-tuning a 🤗 Transformers model on summarization