Companion Reading: Creating a (mostly) Autonomous HR Assistant with ChatGPT and LangChain’s Agents and Tools
I made this prototype using Azure deployments as my company is an Azure customer.
I created a backend file called 'hr_agent_backend_local.py' for those who do not want to use Azure.
It does not use any Azure components - the API is from platform.openai.com, and the csv file is stored locally (i.e. on your own computer)
- Install Python 3.10 (Windows, Mac)
- Clone the repo to a local directory.
- Navigate to the local directory and run this command in your terminal to install all prerequisite modules - 'pip install -r requirements.txt'
- Input your own API keys in the hr_agent_backend_local.py file (or hr_agent_backend_azure.py if you want to use the Azure version; just uncomment it in the frontend.py file)
- Run 'streamlit run hr_agent_frontent.py' in your terminal
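Instead of pasting keys directly into the backend file, one option is to read them from environment variables so they never land in source control. A minimal sketch, assuming a hypothetical helper name that is not part of the repo:

```python
import os

def get_openai_api_key() -> str:
    """Read the OpenAI API key from the environment (hypothetical helper).

    Raises a clear error instead of letting the app run with a blank key.
    """
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key:
        raise RuntimeError("Set OPENAI_API_KEY before starting the app.")
    return key
```

You would then export `OPENAI_API_KEY` in your terminal before running the `streamlit run` command.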
- Create a Pinecone account at pinecone.io - there is a free tier. Take note of the Pinecone API key and environment values.
- Run the notebook 'store_embeddings_in_pinecone.ipynb'. Replace the Pinecone and OpenAI API keys (for the embedding model) with your own.
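Before documents are embedded and upserted into Pinecone, they are typically split into overlapping chunks so each embedding stays within the model's context window. A library-free sketch of that kind of chunking step (the function name and sizes are illustrative, not taken from the notebook):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size character chunks with a small overlap.

    The overlap keeps sentences that straddle a chunk boundary retrievable
    from either neighboring chunk.
    """
    chunks = []
    step = chunk_size - overlap
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks
```

Each chunk would then be embedded (e.g. with an OpenAI embedding model) and upserted into the Pinecone index along with its metadata.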
Azure OpenAI Service - the OpenAI service offering for Azure customers.
LangChain - development framework for building apps around LLMs.
Pinecone - the vector database for storing the embeddings.
Streamlit - used for the front end. A lightweight framework for deploying Python web apps.
Azure Data Lake - for landing the employee data csv files. Any other cloud storage should work just as well (blob, S3 etc).
Azure Data Factory - used to create the data pipeline.
SAP HCM - the source system for employee data.
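At runtime, the agent's job is essentially to route each question to the right tool: a lookup against the employee csv, or a similarity search over the policy embeddings in Pinecone. A toy, library-free sketch of that routing decision (all names and keywords are illustrative; the real app delegates this choice to the LLM via LangChain's agent machinery):

```python
def route_question(question: str) -> str:
    """Return which (hypothetical) tool a question would be sent to."""
    # Terms that suggest a personal employee-data lookup in the csv.
    personal_data_terms = ("leave", "salary", "manager", "tenure")
    if any(term in question.lower() for term in personal_data_terms):
        return "employee_csv_tool"   # pandas-style lookup over the csv
    return "policy_search_tool"      # similarity search over Pinecone
```

In the actual prototype the LLM reads each tool's description and picks one itself, which is what makes the assistant "mostly autonomous".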
Feel free to connect with me on:
Linkedin: https://www.linkedin.com/in/stephenbonifacio/
Twitter: https://twitter.com/Stepanogil