
Mattermost AI Plugin


Background

🚀 Join the "AI Exchange" community server channel where Mattermost's open source community is sharing the latest AI resources and innovations!

The Mattermost AI plugin lets you use a wide variety of LLMs within Mattermost, from open-source self-hosted models like GPT4All to vendor-hosted services like OpenAI.

This plugin is currently experimental. Contributions and suggestions are welcome; see below!

The Mattermost AI Plugin is used as part of the Mattermost OpenOps framework for the responsible development of AI-enhanced workflows, with full data control and data portability across different AI backends.

Install Mattermost + mattermost-plugin-ai

On existing Mattermost server

  1. Download the latest release from https://github.com/mattermost/mattermost-plugin-ai/releases
  2. Upload it to your server via System Console > Plugin Management.
  3. Enable the plugin and configure the settings as desired.

Local Development

  1. Clone and enter this repository:
    git clone https://github.com/mattermost/mattermost-plugin-ai && cd mattermost-plugin-ai
  2. Install mattermost-plugin-ai on Mattermost:
    `MM_SERVICESETTINGS_SITEURL=http://localhost:8065 MM_ADMIN_USERNAME=<YOUR_USERNAME> MM_ADMIN_PASSWORD=<YOUR_PASSWORD> make deploy`
  3. Access Mattermost and configure the plugin:
  • Open Mattermost at http://localhost:8065
  • Select View in Browser
  • In the top left Mattermost menu, click System Console ➡️ Mattermost AI Plugin
  • Enable the plugin and configure plugin settings as desired. See Supported Backends.

Gitpod Demo

See our OpenOps demo setup for an easy way to try the plugin.

Usage

Streaming Conversation

Chat with an LLM right inside the Mattermost interface. Answers are streamed so you don't have to wait:


Thread Summarization

Use the post menu or the /summarize command to get a summary of the thread in a Direct Message from the AI Bot:

Summarizing Thread

Answer questions about Threads

Respond to the bot post to ask follow-up questions:

Thread Interrogation

Chat anywhere

Just mention @ai anywhere in Mattermost to ask it to respond. It will be given the context of the thread you are participating in:

Chat anywhere

Create meeting summary

Create meeting summaries! Designed to work with the Calls plugin's recording feature.

Meeting Summary

Personalisation

Context such as the current channel and user is supplied to the LLM when you make requests, allowing responses to be personalized.

User lookup (OpenAI exclusive)

The LLM can look up other users on the system if you ask about them.

OpenAI-exclusive for now, since it requires the function-calling API.

Channel posts lookup (OpenAI exclusive)

You can ask about other channels, and the LLM can ingest posts from that channel. For example, you can ask it to summarize the last few posts in a channel. Note that, depending on whether you have Collapsed Reply Threads (CRT) enabled, this may not behave as you expect.

OpenAI-exclusive for now, since it requires the function-calling API.

GitHub integration (OpenAI exclusive, requires GitHub plugin)

The LLM can attempt to look up specific GitHub issues. For example, you can paste a GitHub link into the chat and ask questions about it. Only the issue title and description are fetched for now.

OpenAI-exclusive for now, since it requires the function-calling API.

React for me

Just for fun! Use the post menu to ask the bot to react to the post. It will try to pick an appropriate reaction.

reactforme.mp4

RLHF Feedback Collection

Bot posts have 👍 👎 icons that collect user feedback. The idea is to use this as input for RLHF fine-tuning or prompt development.

Supported Backends

All backends are configured in the system console settings page for the plugin. Make sure to select your preferred backend under AI Large Language Model service on the system console page after configuring.

OpenAI (recommended)

To set this up, get an OpenAI API key. You will need to sign up for an account if you don't already have one. You can create a new key at https://platform.openai.com/account/api-keys.

Configure the key in the system console and add a model like gpt-4 (better) or gpt-3.5-turbo (faster and cheaper).

Anthropic

You will need to have an invite to the Anthropic API.

If you do, you can create an API key here: https://console.anthropic.com/account/keys

Configure the API key in the system console and configure a default model like claude-v1.

Azure OpenAI

You will need to ask Azure to enable OpenAI in your Azure account before you can use this API.

This API requires function support, which for now is available only on model version 0613 with API version 2023-07-01-preview, and only in a limited set of datacenters. At the time of writing, the available regions for gpt-35-turbo v0613 are Canada East, East US, France Central, Japan East, North Central US, and UK South. More info in the Azure docs.

Once you have been approved, you can create a new OpenAI resource. With the resource created, you can find the API key and endpoint URL under the Keys and Endpoints menu option.

Finally, deploy the model you are going to use (normally gpt-35-turbo) by clicking "Model deployments" and managing the models from there. (Tip: don't select auto-update on your deployed model; it will auto-downgrade to 0301 within about 5-10 minutes.)

Configure the API key and the endpoint URL under OpenAI Compatible in the system console, and set a default model like gpt-35-turbo.

OpenAI Compatible

Any OpenAI-compatible backend is supported, such as LocalAI, which we use in the OpenOps demo.

Ask Sage

If you can use the OpenAI API directly, it is recommended that you do so. Ask Sage does not support response streaming, leading to a worse user experience. Ask Sage has also not implemented API tokens, so the integration requires a username and password stored in plaintext in the server configuration. Hopefully these limitations will be resolved.

To configure it, enter your username and password on the system console page and set a default model such as gpt-4 or gpt-3.5-turbo.

Community Resources

AI

Mattermost

Contributing

Thank you for your interest in contributing to our open source project! ❤️ To get started, please read the contributor guidelines for this repository.

License

This repository is licensed under Apache-2.0.