Open source LLM-based chatbots for customer support
Buff is a conversation engine that indexes support articles in a vector database and answers customer questions with an LLM. Buff does not replace live chat software like Zendesk or Freshworks. Instead, it integrates with the chat interfaces companies already use and gives them language capabilities on par with ChatGPT. Buff excels at responding to customer inquiries about a product and can achieve high (60%+) deflection rates without compromising CSAT.
Buff is backed by Y Combinator.
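The description above amounts to a retrieval-augmented generation loop: embed the support articles, retrieve the ones most relevant to each question, and have the LLM answer from them. Below is a minimal sketch of that pattern using the OpenAI API and an in-memory index; the article snippets, model names, and helper functions are illustrative assumptions, not Buff's actual code.

```python
# Minimal retrieval-augmented answering sketch (illustrative only; Buff's
# actual pipeline, models, and vector database may differ).
from openai import OpenAI  # assumes OPENAI_API_KEY is set in the environment

client = OpenAI()

ARTICLES = [
    "Refunds: Orders can be refunded within 30 days of delivery.",
    "Shipping: Standard shipping takes 3-5 business days.",
    "Returns: Start a return from the Orders page in your account.",
]

def embed(texts):
    """Embed a list of texts with an OpenAI embedding model."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [d.embedding for d in resp.data]

def most_relevant(question, article_vectors, top_k=2):
    """Rank articles by cosine similarity against the question embedding."""
    q = embed([question])[0]

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm

    scored = sorted(zip(ARTICLES, article_vectors),
                    key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [article for article, _ in scored[:top_k]]

def answer(question):
    """Answer a customer question grounded in the retrieved articles."""
    context = "\n".join(most_relevant(question, embed(ARTICLES)))
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": f"Answer using only these support articles:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(answer("How long do I have to request a refund?"))
```

In Buff itself the article embeddings would live in the vector database rather than in memory, computed once when the articles are indexed instead of on every request.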
Compared to traditional NLU-driven chatbots, Buff:
- Can be set up in a few hours and performs well out of the box, since it does not need to be trained on prior examples of customer conversations
- Does not require code migrations or change management, since it integrates with existing live chat software instead of replacing it
- Can deflect a higher percentage of tickets, since it handles unexpected inputs without having to define intents ahead of time
Since general-purpose LLMs tend to have high variability in outputs, Buff is better suited for use cases with a large number of low-stakes interactions such as ecommerce and troubleshooting.
Integrations:
- Discord: 🦾 Done
- Freshchat: 🦾 Done
- Intercom: 🚧 Planned
- Zendesk: 🚧 Planned
To use the hosted version, contact us on Discord or sign up here.
To get started with the self-hosted version, create a Discord bot and give it the necessary permissions to read message content and post messages. Swap out the placeholders in `server/discord/bot.py` with your OpenAI API key and your Discord bot token. Then run `server/discord/bot.py` from your server with `nohup` or `systemd` to keep the script running.
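As a rough illustration of what those placeholders correspond to, here is a minimal Discord bot in the same shape: it reads message content and posts replies, with `OPENAI_API_KEY` and `DISCORD_BOT_TOKEN` standing in for the values you supply. The actual `server/discord/bot.py` may be structured differently.

```python
# Illustrative sketch only; the real server/discord/bot.py may differ.
import discord
from openai import OpenAI

OPENAI_API_KEY = "sk-..."     # placeholder: your OpenAI API key
DISCORD_BOT_TOKEN = "..."     # placeholder: your Discord bot token

llm = OpenAI(api_key=OPENAI_API_KEY)

# The bot needs the message content intent (also enable it in the Discord
# developer portal) so it can read what customers write.
intents = discord.Intents.default()
intents.message_content = True
bot = discord.Client(intents=intents)

@bot.event
async def on_message(message):
    if message.author == bot.user:  # ignore the bot's own messages
        return
    # Synchronous call kept for brevity; a production bot would avoid
    # blocking the event loop.
    reply = llm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": message.content}],
    )
    await message.channel.send(reply.choices[0].message.content)

bot.run(DISCORD_BOT_TOKEN)
```

To keep the bot running after you log out, something like `nohup python server/discord/bot.py &`, or a systemd service that restarts the script on failure, works well.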