Confusity AI is a fork of Clarity AI, a simple perplexity.ai clone. Use the code for whatever you like! :)
Given a query, Clarity fetches relevant, up-to-date information from an on-premises Atlassian Confluence installation and uses OpenAI's API (or any OpenAI-compatible API on top of any LLM) to generate an answer.
The app works as follows:
- Get query from user
- Scrape Confluence for relevant webpages
- Parse webpages for text
- Build prompt using query + webpage text
- Call OpenAI API to generate answer
- Stream answer back to user
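The steps above can be sketched in a few lines of Node. This is a minimal, illustrative sketch assuming Node 18+ (built-in `fetch`); the Confluence search path, CQL query, and response field names are assumptions, not verified against a specific Confluence version, and the real app streams the answer rather than returning it whole.

```javascript
// Build the prompt from the user query and scraped page text.
function buildPrompt(query, pageTexts) {
  const context = pageTexts.join("\n---\n");
  return `Answer the question using only the context below.\n\nContext:\n${context}\n\nQuestion: ${query}\nAnswer:`;
}

async function answer(query) {
  // 1. Search Confluence for relevant pages (CQL text search; path is an assumption).
  const search = await fetch(
    `${process.env.CONFLUENCE_URL}rest/api/content/search` +
      `?cql=${encodeURIComponent(`text ~ "${query}"`)}&expand=body.storage&limit=3`,
    { headers: { Authorization: `Bearer ${process.env.CONFLUENCE_TOKEN}` } }
  ).then((r) => r.json());

  // 2. Parse each page body down to plain text (naive tag strip).
  const texts = search.results.map((p) =>
    p.body.storage.value.replace(/<[^>]+>/g, " ")
  );

  // 3-4. Build the prompt and call the OpenAI-compatible chat completions endpoint.
  const completion = await fetch(process.env.OPENAI_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_APIKEY}`,
    },
    body: JSON.stringify({
      model: process.env.OPENAI_MODEL,
      temperature: Number(process.env.OPENAI_TEMP ?? 0),
      max_tokens: Number(process.env.OPENAI_TOKENS ?? 250),
      messages: [{ role: "user", content: buildPrompt(query, texts) }],
    }),
  }).then((r) => r.json());

  return completion.choices[0].message.content;
}

module.exports = { buildPrompt, answer };
```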
Get an OpenAI API key (optional; you can point the app at any OpenAI-compatible endpoint instead).
- Clone repo

```shell
git clone https://github.com/rzrbld/confusity-ai
```
- Install dependencies

```shell
npm i
```
- Configure environment variables
| Variable | Default Value | Description |
|---|---|---|
| CONFLUENCE_URL | `https://my_onprem_confluence.company.com/` | URL of the Confluence server. |
| CONFLUENCE_TOKEN | `put_token_in_CONFLUENCE_TOKEN_env_variable` | Token used for authentication with the Confluence API. |
| OPENAI_URL | `http://my_local_llm_URI_or_openai_endpoint.company.com/v1/chat/completions` | Endpoint for the OpenAI API or any compatible API, such as oobabooga or FastChat. |
| OPENAI_APIKEY | `put_openai_or_local_llm_token_here` | API key for accessing the OpenAI-compatible API. |
| OPENAI_MODEL | `fancy_pancy_llm_3.5turbo` | Name of the OpenAI or compatible language model. |
| OPENAI_TEMP | `0.0` | Temperature parameter for the OpenAI API (a double). |
| OPENAI_TOKENS | `250` | Maximum number of tokens to generate (an integer). |
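A loader for these variables might look like the sketch below. The function name `loadConfig` is illustrative (not from the repo); the defaults are taken from the table above.

```javascript
// Illustrative config loader: reads the variables from the table above,
// falling back to the documented defaults when they are unset.
function loadConfig(env = process.env) {
  return {
    confluenceUrl: env.CONFLUENCE_URL ?? "https://my_onprem_confluence.company.com/",
    confluenceToken: env.CONFLUENCE_TOKEN ?? "put_token_in_CONFLUENCE_TOKEN_env_variable",
    openaiUrl:
      env.OPENAI_URL ??
      "http://my_local_llm_URI_or_openai_endpoint.company.com/v1/chat/completions",
    openaiApiKey: env.OPENAI_APIKEY ?? "put_openai_or_local_llm_token_here",
    openaiModel: env.OPENAI_MODEL ?? "fancy_pancy_llm_3.5turbo",
    openaiTemp: Number(env.OPENAI_TEMP ?? 0.0),           // a double
    openaiTokens: parseInt(env.OPENAI_TOKENS ?? "250", 10), // an integer
  };
}

module.exports = { loadConfig };
```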
- Run app

```shell
npm run dev
```
- Populate ENV variables in docker-compose
- Run app

```shell
docker compose -f docker-compose.yml build
docker compose -f docker-compose.yml up
```
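For reference, the environment section of the compose file might look like the fragment below. This is a sketch only: the service name `confusity`, the build context, and the exposed port are assumptions about the repo's layout, and the default values mirror the table above.

```yaml
# Illustrative docker-compose.yml fragment (service name, build context,
# and port are assumptions, not taken from the repo).
services:
  confusity:
    build: .
    ports:
      - "3000:3000"
    environment:
      CONFLUENCE_URL: "https://my_onprem_confluence.company.com/"
      CONFLUENCE_TOKEN: "${CONFLUENCE_TOKEN}"
      OPENAI_URL: "http://my_local_llm_URI_or_openai_endpoint.company.com/v1/chat/completions"
      OPENAI_APIKEY: "${OPENAI_APIKEY}"
      OPENAI_MODEL: "fancy_pancy_llm_3.5turbo"
      OPENAI_TEMP: "0.0"
      OPENAI_TOKENS: "250"
```

Referencing `${CONFLUENCE_TOKEN}` and `${OPENAI_APIKEY}` lets Compose pull the secrets from the host environment instead of committing them to the file.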
Shoutout to Perplexity AI and Clarity AI for the inspiration. I highly recommend checking their products out.
This repo is meant to show people that you can build powerful apps like Perplexity even if you don't have a large, experienced team.
LLMs are amazing, and I hope Clarity inspires you to build something cool!