This repository contains the code for a question-answering assistant developed for the NOMAD LLM hackathon. The assistant uses the Retrieval-Augmented Generation (RAG) approach with LlamaIndex to provide accurate responses to user queries within the NOMAD toolkit domain.
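The app itself delegates retrieval and generation to LlamaIndex, but the core RAG idea can be illustrated with a library-free sketch: score documents against the query, keep the best matches, and prepend them as context to the prompt sent to the LLM. The corpus and the word-overlap scoring below are illustrative assumptions, not the project's actual retrieval logic.

```python
# Minimal, stdlib-only sketch of the RAG idea: retrieve the most relevant
# document(s) for a query, then prepend them to the prompt for the LLM.
# (The real app uses LlamaIndex for retrieval and generation; the toy
# corpus and scoring here are for illustration only.)

def score(query: str, doc: str) -> int:
    """Count query words that also appear in the document (toy relevance)."""
    doc_words = set(doc.lower().split())
    return sum(1 for w in query.lower().split() if w in doc_words)

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k highest-scoring documents for the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the user query with retrieved context before calling the LLM."""
    context = "\n---\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

if __name__ == "__main__":
    corpus = [
        "NOMAD stores materials-science data and exposes a query API.",
        "Streamlit turns Python scripts into shareable web apps.",
    ]
    print(build_prompt("How do I query NOMAD data?", corpus))
```

A production pipeline replaces the word-overlap score with vector-embedding similarity (which is what LlamaIndex provides), but the prompt-assembly step is structurally the same.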
- Clone the repository:

  ```shell
  git clone <repository_url>
  cd <repository_name>
  ```

- Install the required dependencies:

  ```shell
  pip install -r requirements.txt
  ```

- Ensure the documents needed for RAG retrieval are in the `data` folder. If they are missing, add the required documents to the `data` folder before starting the app.

- Run the Streamlit app:

  ```shell
  streamlit run my_app.py
  ```

- Access the app in your web browser at `http://localhost:8501`.
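The contents of `requirements.txt` are not shown in this README; a minimal dependency list consistent with the steps above would include at least the two packages the app visibly uses. Exact package set and version pins are an assumption — consult the repository's actual `requirements.txt`.

```
# Assumed minimal dependencies; the repository's actual requirements.txt
# may pin versions or include additional packages.
streamlit
llama-index
```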
- `rag.ipynb`: Jupyter Notebook containing the code for building the LlamaIndex model and performing question-answering tasks.
- `my_app.py`: Python script for the Streamlit web application, providing a user interface for querying the LlamaIndex model.
- `requirements.txt`: List of Python dependencies required to run the code.
- `readme.md`: This file, providing an overview of the project and setup instructions.
- `data/`: Folder containing the documents used for RAG retrieval.
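In LlamaIndex, ingesting a folder like `data/` is typically done with `SimpleDirectoryReader`; as a self-contained sketch of the same step, the function below reads every text-like file in the folder into memory. The folder name mirrors the repository layout, while the accepted file extensions are an assumption.

```python
from pathlib import Path

def load_documents(data_dir: str = "data") -> dict[str, str]:
    """Read every text-like file under the data folder into memory.

    Stdlib stand-in for LlamaIndex's SimpleDirectoryReader; the
    extension filter below is an assumption about the corpus format.
    """
    docs = {}
    for path in Path(data_dir).glob("**/*"):
        if path.is_file() and path.suffix in {".txt", ".md"}:
            docs[path.name] = path.read_text(encoding="utf-8")
    return docs
```

Whatever loader is used, the result feeds the index-building step in `rag.ipynb`, so the `data/` folder must be populated before the index is built.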