Description: A chatbot for answering questions concerning Siriraj doctors and clinics as well as scheduling hospital visits. This chatbot was built using Vertex AI models.
- Data extraction using web scraping
- Index creation and data querying using LlamaIndex and LangChain
- Web service development using Flask
- Web deployment and UI development using React.js (See Front-end section)
See `readme.md` in the folder `scripts`.

Run every cell in `tutorial_notebooks/index_creation.ipynb` to create the indices.
Create a `.env` file in the folder `webserver`. The file content must consist of the following:

```
SERVICE_ACCOUNT_PATH="<path to GCP credential with Vertex AI admin permission>"
MERGING_INDEX_DIR="./tutorial_notebooks/merging_index"
CLINIC_INDEX_DIR="./tutorial_notebooks/clinic_index"
CLINIC_DOCTOR_INDEX_DIR="./tutorial_notebooks/clinic_doctor_index"
PORT=8876
```
This `.env` file provides the webservice settings. Every entry except `PORT` specifies the location of an index directory or the GCP credential.
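As an illustration of how these settings might be consumed (the real app presumably uses a dotenv-style loader; `parse_env` below is a hypothetical stdlib-only stand-in):

```python
def parse_env(text):
    """Parse simple KEY=value or KEY="value" lines into a dict."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip().strip('"')
    return settings

# Example using two of the settings documented above
example = '''
MERGING_INDEX_DIR="./tutorial_notebooks/merging_index"
PORT=8876
'''
settings = parse_env(example)
port = int(settings.get("PORT", "8876"))  # PORT is the only non-path setting
```

All other keys are plain strings (paths), so only `PORT` needs conversion to an integer.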
The authors used Python 3.11.7 on Windows 11. The Python environment is specified in `environment.yml`, which can be installed easily with conda:

```shell
conda env create -f environment.yml  # install the environment
conda activate llama_env             # activate the environment
```
However, one of our teammates on macOS was unable to install this Python environment. We suspect that some library needs `pywin32`, which cannot be installed anywhere besides Windows (since it wraps Windows APIs), and we did not have time to fix this, so we leave it as is. Good luck hunting this bug 🤦♂️🤦♀️.
After activating the Python environment mentioned in the previous section, the command below will start the webservice on port 8876 (change this in the `.env` file):

```shell
python webserver/app.py
```
Send an HTTP POST request to `/chatquery` to start chatting. The request body must be JSON consisting of:
```json
{
  "id": "id of the message; we use uuid4 but it can be anything",
  "timestamp": "datetime in ISO format",
  "text": "text to send to the chatbot",
  "sender": "name identifying the sender; keep it the same and the bot can remember who it just talked with"
}
```
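For illustration, a request body matching this schema can be built with the standard library (`build_message` is a hypothetical helper, not part of the repo):

```python
import json
import uuid
from datetime import datetime, timezone

def build_message(text, sender):
    """Build a /chatquery request body matching the schema above."""
    return {
        "id": str(uuid.uuid4()),  # we use uuid4, but any unique id works
        "timestamp": datetime.now(timezone.utc).isoformat(),  # ISO datetime
        "text": text,
        "sender": sender,  # keep constant so the bot can remember the sender
    }

body = json.dumps(build_message("What clinics are open today?", "alice"))
```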
It will return JSON of the following form:
```json
{
  "id": "id of the message; we use uuid4 but it can be anything",
  "timestamp": "datetime in ISO format",
  "text": "the chatbot's answer",
  "sender": "chat bot" // this value is fixed
}
```
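To sanity-check this contract end to end, here is a hypothetical minimal Flask stub that echoes the documented response shape. The real handler lives in `webserver/app.py` and queries the indices; this stub only illustrates the request/response schema and the fixed `"chat bot"` sender:

```python
import uuid
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/chatquery", methods=["POST"])
def chatquery():
    msg = request.get_json()
    # A real handler would run the index query engines here;
    # this stub just echoes a canned answer in the documented shape.
    return jsonify({
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "text": f"You said: {msg['text']}",
        "sender": "chat bot",  # fixed value, as documented above
    })
```

With `app.test_client()`, a POST to `/chatquery` with a valid body returns a JSON response whose `sender` is always `"chat bot"`.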
We also developed a front-end UI meant to be integrated with this repository. See github.com/makorn645/muai-2024-majestic-mustangs-frontend.
The members of Majestic Mustangs