A self-hosted web app for building workflows and tools around LLMs.
- Simple LLM chat with threads
- Configurable Agents (system prompt, model)
- OpenAI model selection
- Web Push notifications via a service worker and the `web-push` library
- (incomplete) Basic background worker & queue for scheduling actions
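The background worker & queue is marked incomplete; a minimal in-memory sketch of the idea (schedule an action, then drain whatever is due) might look like the following. All names here (`JobQueue`, `enqueue`, `drain`) are illustrative assumptions, not the app's actual API — the real implementation sits on Redis & BullMQ.

```typescript
// Illustrative in-memory job queue — NOT the app's actual worker code.
type Job = {
  id: number;
  runAt: number;      // epoch ms at which the job becomes due
  action: () => void; // work to perform
};

class JobQueue {
  private jobs: Job[] = [];
  private nextId = 1;

  // Schedule an action to run after `delayMs` milliseconds.
  enqueue(action: () => void, delayMs = 0): number {
    const id = this.nextId++;
    this.jobs.push({ id, runAt: Date.now() + delayMs, action });
    return id;
  }

  // Run every job that is due, earliest first; return how many ran.
  drain(now = Date.now()): number {
    const due = this.jobs
      .filter((j) => j.runAt <= now)
      .sort((a, b) => a.runAt - b.runAt);
    this.jobs = this.jobs.filter((j) => j.runAt > now);
    for (const job of due) job.action();
    return due.length;
  }
}

// Usage: queue an immediately-due action, then drain the queue.
const queue = new JobQueue();
let ran = 0;
queue.enqueue(() => { ran += 1; });
const executed = queue.drain();
```

A Redis-backed queue replaces the in-memory array so jobs survive restarts and can be consumed by a separate worker process.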
- Next.js app router
- Prisma ORM
- SQLite
- Clerk for authentication
- Redis & BullMQ for background tasks
- Cloudflare Tunnel & Docker for self-hosting
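With Prisma over SQLite, the chat threads could be modeled along these lines. This is a hypothetical schema sketch — the model and field names are guesses for illustration, not the repo's actual `schema.prisma`.

```prisma
// Hypothetical sketch — model/field names are illustrative only.
datasource db {
  provider = "sqlite"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}

model Thread {
  id       Int       @id @default(autoincrement())
  title    String
  messages Message[]
}

model Message {
  id       Int    @id @default(autoincrement())
  role     String // e.g. "user" | "assistant" | "system"
  content  String
  thread   Thread @relation(fields: [threadId], references: [id])
  threadId Int
}
```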
- Clone the repo
- Set up external dependencies
- Copy `.env.example` to `.env` and fill in the values
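The authoritative variable names live in `.env.example` in the repo; as a rough guide, the stack above (Prisma/SQLite, Clerk, Redis, `web-push`) suggests entries along these lines. Every name below is a plausible guess, not a confirmed key.

```env
# Hypothetical entries — check .env.example for the real variable names.
DATABASE_URL="file:./dev.db"        # Prisma SQLite connection string
CLERK_SECRET_KEY=""                 # Clerk server-side key
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=""
REDIS_URL="redis://localhost:6379"  # BullMQ backing store
OPENAI_API_KEY=""
VAPID_PUBLIC_KEY=""                 # web-push VAPID key pair
VAPID_PRIVATE_KEY=""
```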
- Run `npm install` to install dependencies
- Run `npm run db:push` to create the database
- Run `npm run dev` to start the app services in development mode
- Run `docker-compose up` to start the app services and the Cloudflare Tunnel
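For a sense of what `docker-compose up` wires together, a sketch of a compose file pairing the app with Redis and a Cloudflare Tunnel might look like this. Service names, the port, and the `TUNNEL_TOKEN` variable are assumptions for illustration — consult the repo's actual `docker-compose.yml`.

```yaml
# Hypothetical compose sketch — not the repo's actual file.
services:
  app:
    build: .
    env_file: .env
    ports:
      - "3000:3000"
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
  cloudflared:
    image: cloudflare/cloudflared:latest
    command: tunnel run
    environment:
      - TUNNEL_TOKEN=${TUNNEL_TOKEN}
    depends_on:
      - app
```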