BibiGPT

BibiGPT · One-click summary for video & audio content: Bilibili | YouTube | Websites | Podcasts | Meetings | Local files, etc. (formerly BiliGPT, the "attention-saving tool & class representative")

Primary language: TypeScript · License: GNU General Public License v3.0 (GPL-3.0)

🤖 BibiGPT · One-click AI summary for video and audio content: b.jimmylv.cn

🎉 Formerly BiliGPT, it now supports one-click summaries of Bilibili and YouTube video content, your "attention-saving tool & class representative".

🚧 Under development: support for websites, podcasts, meetings, local audio/video files, and more as input. Both the prompts and the output formats are being iterated on continuously. Stay tuned!

Alternative address: https://chat-bilibili-video.vercel.app


🎬 This project summarizes Bilibili/YouTube/Podcast/Meeting/... video and audio content for you using AI.

🤯 Inspired by Nutlope/news-summarizer & zhengbangbo/chat-simplifier & lxfater/BilibiliSummary

BibiGPT: the audio & video summary tool

🚀 First launch: 【BibiGPT】AI automatically summarizes Bilibili video content, with GPT-3 intelligently extracting and summarizing subtitles

How it works

This project uses the OpenAI ChatGPT API (specifically, gpt-3.5-turbo), Vercel Edge Functions with streaming, and Upstash Redis for caching and rate limiting. It fetches the content of a Bilibili video, sends it in a prompt to the ChatGPT API to summarize it via a Vercel Edge Function, then streams the response back to the application.
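The flow can be sketched roughly as below. This is a minimal illustration, not the project's actual handler: it assumes the subtitle text has already been extracted on the client, that the key is available as OPENAI_API_KEY, and it omits the caching and rate limiting described later.

```ts
// api/summarize.ts — minimal sketch of the streaming Edge Function flow.
export const config = { runtime: "edge" };

export default async function handler(req: Request): Promise<Response> {
  // Assume the caller has already extracted the subtitle text.
  const { subtitles } = (await req.json()) as { subtitles: string };

  const prompt = `Summarize the following video subtitles in a few bullet points:\n\n${subtitles}`;

  // Call the ChatGPT API with streaming enabled.
  const upstream = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });

  // Forward the server-sent-event stream straight back to the browser.
  return new Response(upstream.body, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```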

Saving costs

Projects like this can get expensive, so if you want to make your own version and share it publicly, I recommend three things to keep costs down:

  • 1. Implement rate limiting so people can't abuse your site (see the sketch after this list)
  • 2. Implement caching to avoid expensive AI re-generations
  • 3. Use text-curie-001 instead of text-davinci-003 in the summarize edge function
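For the first two points, the Upstash libraries mentioned above cover both. The following is only a sketch under assumed names (summarizeWithGuards and generateSummary are illustrative, not functions from this repo), assuming UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN are set:

```ts
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

const redis = Redis.fromEnv();

// Allow, say, 10 summary requests per caller per hour.
const ratelimit = new Ratelimit({
  redis,
  limiter: Ratelimit.slidingWindow(10, "1 h"),
});

// Placeholder for the actual OpenAI call (see the streaming sketch above).
async function generateSummary(videoId: string): Promise<string> {
  return `summary of ${videoId}`;
}

export async function summarizeWithGuards(videoId: string, ip: string): Promise<string> {
  // 1. Rate limiting: reject callers that exceed the window.
  const { success } = await ratelimit.limit(ip);
  if (!success) throw new Error("Rate limit exceeded, try again later");

  // 2. Caching: return a stored summary instead of paying for a re-generation.
  const cached = await redis.get<string>(`summary:${videoId}`);
  if (cached) return cached;

  const summary = await generateSummary(videoId);
  await redis.set(`summary:${videoId}`, summary, { ex: 60 * 60 * 24 * 7 }); // keep for a week
  return summary;
}
```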

Running Locally

After cloning the repo, go to OpenAI to create an account and put your API key in a file called .env.
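For example, assuming the key is read from a variable named OPENAI_API_KEY (if the repo ships a .env.example, follow it for the exact variable names), the .env file would contain:

```
OPENAI_API_KEY=sk-your-key-here
```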

Then, run the application from the command line and it will be available at http://localhost:3000.

npm run dev

One-Click Deploy

Deploy the example using Vercel:

Deploy with Vercel

Support -> Contact Me