Efficiently Use LangChain for Complex Tasks
Maintained by the developers of legalyze.ai
Prerequisites:
git clone https://github.com/Haste171/langchain-chatbot.git
It is recommended to use a virtual environment.
pip install -r requirements.txt
Reference example.env to create your credentials file:
OPENAI_API_KEY=
PINECONE_API_KEY=
PINECONE_ENVIRONMENT=
PINECONE_INDEX=
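For reference, here is a minimal sketch of how these credentials could be read at runtime, assuming the python-dotenv package; the repository's own scripts may load them differently.

```python
# Minimal sketch: loading credentials with python-dotenv (assumed dependency;
# the repository's scripts may read the .env file differently).
import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from a .env file into the environment

openai_api_key = os.environ["OPENAI_API_KEY"]
pinecone_api_key = os.environ["PINECONE_API_KEY"]
pinecone_environment = os.environ["PINECONE_ENVIRONMENT"]
pinecone_index = os.environ["PINECONE_INDEX"]
```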
Run Terminal Interface
python chatbot.py
Run Chat Interface
streamlit run streamlit.py
To use the terminal interface, place the files you want ingested in the docs folder.
Once files are ingested, you can ingest additional files in future sessions or simply query the existing vector database.
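For illustration, the ingestion step roughly amounts to the sketch below, assuming LangChain's PDF loader, text splitter, OpenAI embeddings, and Pinecone vector store; the actual implementation lives in chatbot.py, and the exact imports depend on your installed LangChain and Pinecone versions.

```python
# Illustrative ingestion flow (hypothetical sketch; chatbot.py is the source of truth).
import os
import pinecone
from langchain.document_loaders import DirectoryLoader, PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Pinecone

# Load every PDF placed in the docs folder
loader = DirectoryLoader("docs", glob="**/*.pdf", loader_cls=PyPDFLoader)
documents = loader.load()

# Split the documents into overlapping chunks for embedding
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(documents)

# Embed the chunks and upsert them into the Pinecone index named in the .env file
pinecone.init(
    api_key=os.environ["PINECONE_API_KEY"],
    environment=os.environ["PINECONE_ENVIRONMENT"],
)
vectorstore = Pinecone.from_documents(
    chunks,
    OpenAIEmbeddings(openai_api_key=os.environ["OPENAI_API_KEY"]),
    index_name=os.environ["PINECONE_INDEX"],
)
```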
To use the chat interface, upload files directly through the Browse Files section.
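As a rough illustration only (the real implementation is in streamlit.py), the Browse Files section could be built along these lines with Streamlit's file uploader:

```python
# Hypothetical sketch of a "Browse Files" upload widget; streamlit.py in this
# repository is the actual implementation.
import streamlit as st

uploaded_files = st.file_uploader(
    "Browse Files", type=["pdf"], accept_multiple_files=True
)
for uploaded in uploaded_files or []:
    st.write(f"Ready to ingest: {uploaded.name}")
```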
- Temperature: How much creativity/randomness the AI uses when answering queries about your files
- Sources: The number of source chunks the AI bases its answer on and uses for context (see the sketch after the feature list below)
- Conversational answers with chat history
- Compatibility with PDF documents (more soon)
- Local and external vector database compatibility
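As a hedged illustration of how the Temperature and Sources settings might map onto a conversational retrieval chain over the existing Pinecone index (the repository's scripts are authoritative, and the exact API depends on your LangChain version):

```python
# Hypothetical sketch: wiring "Temperature" and "Sources" into a conversational
# retrieval chain. Not the repository's actual code.
import os
import pinecone
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Pinecone

temperature = 0.7  # "Temperature": higher values give more creative answers
num_sources = 4    # "Sources": how many document chunks are retrieved as context

# Connect to the index that was populated during ingestion
pinecone.init(
    api_key=os.environ["PINECONE_API_KEY"],
    environment=os.environ["PINECONE_ENVIRONMENT"],
)
vectorstore = Pinecone.from_existing_index(
    index_name=os.environ["PINECONE_INDEX"],
    embedding=OpenAIEmbeddings(),
)

chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=temperature),
    retriever=vectorstore.as_retriever(search_kwargs={"k": num_sources}),
    return_source_documents=True,
)

result = chain({"question": "Summarize the ingested documents.", "chat_history": []})
print(result["answer"])
```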
Soon:
- Externally hosted, user-accessible chat interface
- Compatibility with multiple file types (Llama Index)
- Compatibility with offline models (HuggingFace, Vicuna, Alpaca)
If you would like to contribute to the LangChain Chatbot, please follow these steps:
- Fork the repository
- Create a new branch for your feature or bug fix
- Write tests for your changes
- Implement your changes and ensure that all tests pass
- Submit a pull request
The LangChain Chatbot was developed by Haste171, with much inspiration from Mayo's GPT-4 & LangChain Chatbot for large PDF docs. This project is mainly a Python port of Mayo's chatbot.
The LangChain Chatbot is released under the MIT License.