## Overview

This repository houses the codebase for a Bhagavad Gita chatbot. The chatbot answers commonly asked questions about the Bhagavad Gita. Responses are generated with the LangChain framework using the Llama 2 model; Pinecone stores the vector embeddings for efficient retrieval, and the frontend is a Flask web application.
## Features

- **Custom Data Responses:** The chatbot is trained on custom data to provide meaningful and contextually appropriate responses to user queries.
- **LangChain Generative AI:** The LangChain framework, paired with the Llama 2 model, generates high-quality responses.
- **Efficient Vector Storage:** Pinecone stores the vector embeddings, enabling fast and accurate retrieval of information.
- **Flask Frontend:** A Flask frontend offers a user-friendly interface for interacting with the Bhagavad Gita chatbot.
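At query time, the app retrieves the stored chunks whose embeddings are closest to the embedding of the user's question. Pinecone does this at scale; the underlying idea can be sketched in plain Python with cosine similarity. The `top_k` helper and the toy two-dimensional vectors below are purely illustrative and not part of this repository:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, index, k=2):
    # Rank stored chunks by similarity to the query and return the
    # text of the k best matches.
    scored = sorted(
        index,
        key=lambda item: cosine_similarity(query_vec, item["vector"]),
        reverse=True,
    )
    return [item["text"] for item in scored[:k]]
```

In the real app the vectors come from an embedding model and the nearest-neighbour search happens inside Pinecone rather than in Python.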
## How to run?

### Clone the repository

Project repo: https://github.com/reenal/geeta-chatbot.git
### Create a conda environment after opening the repository

```bash
conda create -n geetachatbot python=3.8 -y
conda activate geetachatbot
```

### Install the requirements

```bash
pip install -r requirements.txt
```
### Create a `.env` file in the root directory and add your Pinecone credentials as follows:

```ini
PINECONE_API_KEY = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
PINECONE_API_ENV = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
```
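The application reads these credentials from the environment at startup. A minimal sketch of that lookup, assuming the variable names above (loading the `.env` file itself, e.g. with the python-dotenv package, is assumed to happen separately):

```python
import os

def get_pinecone_config():
    # Read the Pinecone credentials set via the .env file above or
    # exported in the shell, failing fast if either is missing.
    api_key = os.environ.get("PINECONE_API_KEY")
    api_env = os.environ.get("PINECONE_API_ENV")
    if not api_key or not api_env:
        raise RuntimeError("PINECONE_API_KEY and PINECONE_API_ENV must be set")
    return api_key, api_env
```

Failing fast here gives a clear error message instead of an obscure authentication failure deep inside the Pinecone client.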
Download the quantized model from the link provided below and place it in the `model` directory:
### Download the Llama 2 model

Download `llama-2-7b-chat.ggmlv3.q4_0.bin` from the following link:

https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/tree/main
### Store the embeddings in Pinecone

```bash
python store_index.py
```
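This indexing step typically loads the source text, splits it into overlapping chunks, embeds each chunk, and upserts the vectors into Pinecone. The chunking idea can be sketched in plain Python; the `split_text` function below and its parameters are illustrative, not the splitter this repo actually uses (LangChain ships its own text splitters):

```python
def split_text(text, chunk_size=500, chunk_overlap=50):
    # Slide a fixed-size window over the text. Consecutive chunks
    # overlap so a sentence cut at one boundary still appears intact
    # in the neighbouring chunk.
    if chunk_overlap >= chunk_size:
        raise ValueError("chunk_overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_size, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - chunk_overlap
    return chunks
```

Overlap matters for retrieval quality: without it, a verse split across two chunks might never be returned whole for a matching question.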
### Finally, run the application

```bash
python app.py
```
Now, open up localhost in your browser to chat with the bot.
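The Flask frontend boils down to one route that accepts the user's message and returns the chatbot's answer. A minimal sketch of that shape follows; the `/get` route, the `msg` form field, the port, and the `answer_question` placeholder are all assumptions for illustration, not necessarily what `app.py` in this repo uses:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def answer_question(question):
    # Placeholder for the real pipeline: embed the question, retrieve
    # matching chunks from Pinecone, and ask the Llama 2 model.
    return f"You asked: {question}"

@app.route("/get", methods=["POST"])
def chat():
    # The frontend posts the user's message as a form field.
    question = request.form.get("msg", "")
    return jsonify({"answer": answer_question(question)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Keeping the model call behind a single function like `answer_question` lets you swap the retrieval and generation stack without touching the web layer.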
## Techstack Used

- Python
- LangChain
- Flask
- Meta Llama2
- Pinecone