LLM Powered Efficiency: Smart Workflow, Make It Yours!
This is a user-friendly interface designed to streamline document comprehension and retrieval. Using Retrieval Augmented Generation (RAG), users can upload their documents and ask the chatbot questions about their content. Whether it's extracting specific information, summarizing key points, or seeking clarification, this project helps users navigate their documents quickly and efficiently.
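At a glance, the flow is retrieve-then-generate: the document is split into chunks, the chunks most relevant to a question are retrieved, and the LLM answers using only those chunks. The sketch below is purely illustrative; the chunking, scoring, and `ask_llm` helper are simplified placeholders, not this project's actual implementation.

```python
# Illustrative retrieve-then-generate sketch (not the project's actual pipeline).

def split_into_chunks(text: str, size: int = 500) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def retrieve(chunks: list[str], question: str, top_k: int = 3) -> list[str]:
    """Rank chunks by naive keyword overlap with the question (a stand-in for embedding search)."""
    words = set(question.lower().split())
    return sorted(chunks, key=lambda c: len(words & set(c.lower().split())), reverse=True)[:top_k]

def ask_llm(prompt: str) -> str:
    """Placeholder for the call to the configured LLM backend."""
    return "(model answer would appear here)"

def answer(document_text: str, question: str) -> str:
    """Build a context-only prompt from the retrieved chunks and ask the model."""
    context = "\n\n".join(retrieve(split_into_chunks(document_text), question))
    return ask_llm(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
```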
- Upload Documents: Simply upload your documents to the chatbot interface.
- Natural Language Interaction: Interact with the chatbot using natural language queries.
- Retrieval Augmented Generation (RAG): Answers are grounded in passages retrieved from your uploaded documents rather than the model's memory alone, improving accuracy.
- Supports Common Document Formats: Works with PDF, Microsoft Word, PowerPoint, Excel, and ODT files.
- Efficient Workflow: Save time and effort by quickly accessing relevant information within your documents.
- User-Friendly Interface: Built with Streamlit for an intuitive and interactive user experience.
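For a feel of how such an interface fits together in Streamlit, here is a minimal upload-and-chat sketch. The `extract_text` and `answer_question` helpers are hypothetical stubs standing in for the app's document parsing and RAG pipeline, not its real code.

```python
# Minimal Streamlit upload-and-chat sketch; helper functions are illustrative stubs.
import streamlit as st

def extract_text(uploaded_file) -> str:
    """Stub: the real app parses PDF, Word, PowerPoint, Excel and ODT files."""
    return uploaded_file.read().decode("utf-8", errors="ignore")

def answer_question(text: str, question: str) -> str:
    """Stub: the real app runs the RAG pipeline against the LLM backend."""
    return f"(an answer grounded in the {len(text)}-character document would appear here)"

st.title("Document Chat")
uploaded = st.file_uploader("Upload a document", type=["pdf", "docx", "pptx", "xlsx", "odt"])
question = st.chat_input("Ask a question about your document")

if uploaded and question:
    with st.chat_message("user"):
        st.write(question)
    with st.chat_message("assistant"):
        st.write(answer_question(extract_text(uploaded), question))
```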
To get started with LMPoC Frontend, follow these simple steps:
- Clone the repository to your local machine.
- Make sure Python 3.x is installed.
- Run the frontend start script. It creates a virtual environment inside the project folder, installs the requirements, and starts the application:

```bash
bash ./start-ui.sh
```

- To preserve the session after you disconnect, you can also run the frontend UI inside a `screen` session:

```bash
screen -S streamlit_session -d -m bash -c 'sudo bash ./start-ui.sh'
```
Contributions are welcome! If you have any ideas for improvements, feature requests, or bug reports, please open an issue or submit a pull request.
This project is licensed under the MIT License.