Welcome to the Langchain_QA app repository, a question-answering tool built on Google Gemini. The app implements RAG (Retrieval-Augmented Generation), combining Langchain, Chainlit, and Gemini for fast query search and response generation.
Key Features:
- RAG Model Integration: The app integrates the RAG model, a state-of-the-art approach that combines retrieval-based and generation-based methods for improved question answering (see the sketch after this list).
- Efficient Query Search: Powered by Langchain, Chainlit, and Gemini, the app delivers fast query search with accurate and relevant responses.
- Modular Coding: The repository follows modular coding practices, keeping the codebase clean and maintainable so developers can easily navigate, enhance, and modify it.
- MLOps with GitHub Actions: The deployment pipeline is automated with GitHub Actions; continuous integration and continuous deployment (CI/CD) ensure that changes are tested, integrated, and deployed seamlessly.
- Docker Support: A Dockerfile is provided to build containers, keeping dependencies consistent across environments and simplifying packaging, deployment, and scaling.
- AWS Integration: Deploy the app on AWS EC2 and ECR. EC2 provides resizable compute capacity in the cloud, while ECR securely stores and manages Docker container images.
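For orientation, here is a minimal RAG sketch using LangChain, FAISS, and Gemini. It is not the repository's exact code: the package imports, model names (`gemini-pro`, `models/embedding-001`), and sample texts are assumptions, and exact APIs may vary with library versions; the app's own chain construction lives in the repository files.

```python
# Minimal RAG sketch (assumptions noted above): embed documents into FAISS,
# retrieve the most relevant chunks, and let Gemini generate the answer.
# Requires GOOGLE_API_KEY to be set in the environment.
from langchain_community.vectorstores import FAISS
from langchain_google_genai import ChatGoogleGenerativeAI, GoogleGenerativeAIEmbeddings
from langchain.chains import RetrievalQA

# 1. Embed a small corpus and index it with FAISS (retrieval side).
embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
vector_store = FAISS.from_texts(
    [
        "LangChain helps compose LLM applications.",
        "Chainlit provides a chat UI for LLM apps.",
    ],
    embedding=embeddings,
)

# 2. Wire the retriever to a Gemini chat model (generation side).
llm = ChatGoogleGenerativeAI(model="gemini-pro", temperature=0)
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=vector_store.as_retriever(search_kwargs={"k": 2}),
)

# 3. Ask a question: relevant chunks are retrieved and passed to Gemini.
print(qa_chain.invoke({"query": "What does Chainlit provide?"}))
```

In the actual app, the indexed texts would come from the project's document corpus rather than hard-coded strings.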
I have used GitHub Actions to implement the CI/CD pipeline, AWS ECR as the container registry for the Docker image, and AWS EC2 to host it. The tech stack used in this project:
- Python
- Langchain
- Google Gemini
- Chainlit
- FAISS
- Docker
- GitHub Actions
- AWS ECR
- AWS EC2
I have commented most of the necessary information in the respective files.
To run this project locally, please follow these steps:
- Clone the repository:
git clone https://github.com/Rajarshi12321/Langchain_QA.git
- Create a Virtual Environment (Optional but recommended). It's good practice to create a virtual environment to manage project dependencies. Run the following command:
conda create -p <Environment_Name> python=<python version> -y
Example:
conda create -p venv python=3.9 -y
Note:
- It is important to use python=3.9 for Langchain to work properly, or else you may get unexpected errors.
- Activate the Virtual Environment (Optional). Activate the virtual environment based on your operating system:
conda activate <Environment_Name>/
Example:
conda activate venv/
- Install Dependencies
  - Navigate to the project directory:
  cd [project_directory]
  - Run the following command to install project dependencies:
  pip install -r requirements.txt
  Ensure you have Python installed on your system (Python 3.9 is recommended, as noted above).
  Once the dependencies are installed, you're ready to use the project.
- Create a .env file in the root directory and add your Google API key as follows:
GOOGLE_API_KEY = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
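For reference, here is a minimal sketch of how such a key can be loaded from .env at startup, assuming the python-dotenv package; the repository's own loading code may differ:

```python
# Sketch: load GOOGLE_API_KEY from the .env file in the project root.
# Assumes python-dotenv is installed (the repo may load credentials differently).
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the environment
google_api_key = os.getenv("GOOGLE_API_KEY")
if not google_api_key:
    raise RuntimeError("GOOGLE_API_KEY is missing; add it to the .env file")
```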
- Run the Chainlit app: Execute the following command in your terminal:
chainlit run app.py -w
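The -w flag enables auto-reload when files change. For context, a hypothetical minimal app.py skeleton is shown below; the repository's actual app.py is more complete and wires the LangChain RAG chain into this handler:

```python
# Hypothetical minimal Chainlit entry point (not the repository's full app.py).
import chainlit as cl


@cl.on_message
async def on_message(message: cl.Message):
    # In the real app, message.content would be passed to the RAG chain
    # and the chain's answer sent back to the user.
    await cl.Message(content=f"You asked: {message.content}").send()
```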
- Access the app: Open your web browser and navigate to http://localhost:8000/ to use the Langchain_QA app.
I welcome contributions to improve the functionality and performance of the app. If you'd like to contribute, please follow these guidelines:
- Fork the repository and create a new branch for your feature or bug fix.
- Make your changes and ensure that the code is well-documented.
- Test your changes thoroughly to maintain app reliability.
- Create a pull request detailing the purpose of your contribution and the changes made.
Rajarshi Roy - royrajarshi0123@gmail.com
This project is licensed under the MIT License. Feel free to modify and distribute it as per the terms of the license.
I hope this README provides you with the necessary information to get started on the road to Generative AI with Google Gemini and Langchain.