This package contains a service API template that deploys a LangChain-based generative model API backed by AWS Lambda and Amazon API Gateway. The package also contains a Streamlit demo web app that can connect to the deployed API to test it from the browser.
- Node.js 18+
- Python 3.9+
- AWS CDK Toolkit (`npm install -g aws-cdk`)
- AWS account configured with credentials (https://docs.aws.amazon.com/cdk/v2/guide/getting_started.html#getting_started_prerequisites)
- OpenAI API key saved in Secrets Manager in your AWS account
  - Expected secret name is `api-keys`
  - The OpenAI key is expected to be stored under the `openai-api-key` key
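The expected secret shape above can be sketched with boto3; `secret_payload` and `store_secret` are hypothetical helper names, and `store_secret` assumes AWS credentials are already configured:

```python
import json


def secret_payload(openai_key: str) -> str:
    # The stack expects a secret named "api-keys" whose value is a JSON
    # object holding the OpenAI key under the "openai-api-key" field.
    return json.dumps({"openai-api-key": openai_key})


def store_secret(openai_key: str) -> None:
    # Requires AWS credentials; boto3 is imported lazily so the payload
    # helper above stays usable without it.
    import boto3

    client = boto3.client("secretsmanager")
    client.create_secret(Name="api-keys", SecretString=secret_payload(openai_key))
```

Equivalently, the secret can be created from the CLI with `aws secretsmanager create-secret --name api-keys --secret-string '{"openai-api-key": "<your-key>"}'`.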
Clone the repository:

```sh
git clone https://github.com/3coins/langchain-aws-template.git
```

Install the dependencies; this creates a conda env named `langchain-aws-service` and activates it:

```sh
conda env create -f service/environment.yml
conda activate langchain-aws-service
```
Deploy the stack to your AWS account:

```sh
make deploy
```

Note the `api-id` and `resource-id` from the deployment output, then test the API:

```sh
aws apigateway test-invoke-method --rest-api-id <api-id> \
    --http-method POST \
    --body '{"prompt": "explain code: print(\"Hello world\")", "session_id": ""}' \
    --resource-id <resource-id> \
    --output json
```
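Once deployed, the API can also be invoked over HTTPS with the same request shape as the `test-invoke-method` call above. A minimal client sketch — the `API_URL` is a placeholder for your own API Gateway endpoint, and `ask` assumes the third-party `requests` package:

```python
import json

# Placeholder; substitute your deployed API Gateway invoke URL.
API_URL = "https://<api-id>.execute-api.<region>.amazonaws.com/prod/"


def build_payload(prompt: str, session_id: str = "") -> str:
    # Same body shape used by the test-invoke-method command.
    return json.dumps({"prompt": prompt, "session_id": session_id})


def ask(prompt: str, session_id: str = "") -> dict:
    # Network call; requires `requests` and a live endpoint.
    import requests

    resp = requests.post(API_URL, data=build_payload(prompt, session_id))
    resp.raise_for_status()
    return resp.json()
```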
You can also run the Streamlit app to test the API in a web app.

Install the dependencies; this creates a conda env named `langchain-aws-streamlit` and activates it:

```sh
conda env create -f streamlit_app/environment.yml
conda activate langchain-aws-streamlit
```

Make sure to update `<your-api-endpoint>` in `streamlit_app/api.py` to your API Gateway endpoint.

Run the Streamlit app; this will open the web app in the browser:

```sh
make run
```