Repository for the large language model (LLM) firebreak project: an LLM application (built with LangChain) for querying documentation. The initial scope was to link directly to our Confluence pages; however, in the interests of time and security, this proof of concept uses local copies of Word documents (exported Confluence pages).
To use this project, you will need an OpenAI API key defined in a `.env` file.

- Clone the repository.
- Ensure the environment prerequisites are met (see the requirements file).
- Create a `.env` file specifying your OpenAI API key. Note that the key should be defined as a string in a variable named `OPEN_API_KEY` (see the example at the end of this README).
- Add any documentation you want processed as part of the model into the `data/` folder. Note: this currently only supports Word documents (`.docx`).
- Run the Streamlit app on your local machine.

Note: you can alter the temperature of the LLM using the slider in the app.
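For reference, a `.docx` file is a zip archive whose body text lives in `word/document.xml`. The stdlib-only sketch below (`docx_text` is a hypothetical helper, not part of this repo) shows roughly what the Word-document processing step has to extract; the app itself presumably uses a proper document loader:

```python
import re
import zipfile

def docx_text(path: str) -> str:
    """Extract plain paragraph text from a .docx file (stdlib only).

    Hypothetical helper for illustration; a real loader (e.g. one of
    LangChain's Word loaders) handles many more cases.
    """
    with zipfile.ZipFile(path) as z:
        xml = z.read("word/document.xml").decode("utf-8")
    # Split on paragraph closing tags, then keep the text inside <w:t> runs.
    paragraphs = re.split(r"</w:p>", xml)
    lines = []
    for p in paragraphs:
        runs = re.findall(r"<w:t[^>]*>(.*?)</w:t>", p, flags=re.DOTALL)
        if runs:
            lines.append("".join(runs))
    return "\n".join(lines)
```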
To run the Documentation Bot, you will need to use Streamlit in local hosting mode. Navigate to the cloned repository folder and run (in a terminal):

```
streamlit run confluence_bot.py
```

You should then be presented with a URL to your locally hosted Streamlit app.
Example `.env` file:

```
OPEN_API_KEY='XXXXXXXXXX'
```
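For illustration, reading the `.env` file above into the environment amounts to something like the stdlib-only sketch below (the `load_env` function is hypothetical; in practice the `python-dotenv` package's `load_dotenv()` does this for you):

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: put KEY=value lines into os.environ.

    Hypothetical sketch; the python-dotenv package is the usual choice.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and lines without an assignment.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip surrounding quotes so OPEN_API_KEY='XXXX' yields XXXX.
            os.environ[key.strip()] = value.strip().strip("'\"")

if os.path.exists(".env"):
    load_env()
```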