This demo shows how to deploy LiteLLM-Proxy on AWS Lambda using Lambda Web Adapter to provide an OpenAI-compatible API for Amazon Bedrock.

You need the following tools installed:
- AWS CLI
- AWS SAM CLI
- Docker
Run the following commands to build and deploy this demo:

```bash
sam build
sam deploy --guided
```
During the guided deployment, enter a random string for the ApiMasterKey parameter. Take note of the litellmProxyFunctionUrl value in the stack outputs; you will use both values for testing.
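If you need a quick way to generate a random string for the master key, a short Python one-off such as the following works; the 32-byte length here is an arbitrary but reasonable choice, not a requirement of the demo.

```python
# Generate a URL-safe random string to use as the ApiMasterKey.
import secrets

print(secrets.token_urlsafe(32))
```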
Run the following command to test the deployment. Replace <litellmProxyFunctionUrl> and <ApiMasterKey> with the values from the deployment output. You should see the response stream back.
```bash
curl <litellmProxyFunctionUrl>chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer <ApiMasterKey>" \
-d '{
  "model": "bedrock/anthropic.claude-v2",
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant."
    },
    {
      "role": "user",
      "content": "tell me a bedtime story about lambda and sqs"
    }
  ],
  "stream": true
}'
```
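Because the proxy exposes an OpenAI-compatible API, you can also call it from the OpenAI Python SDK (v1+). The sketch below mirrors the curl request above; it is an illustration, not part of the demo, and the placeholder base_url and api_key must be replaced with your own litellmProxyFunctionUrl and ApiMasterKey values.

```python
# Sketch: calling the deployed proxy with the OpenAI Python SDK (openai>=1.0).
from openai import OpenAI

client = OpenAI(
    base_url="<litellmProxyFunctionUrl>",  # Lambda Function URL from the stack output
    api_key="<ApiMasterKey>",              # master key chosen at deploy time
)

# Request a streaming chat completion, same payload as the curl example.
stream = client.chat.completions.create(
    model="bedrock/anthropic.claude-v2",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "tell me a bedtime story about lambda and sqs"},
    ],
    stream=True,
)

# Print each streamed chunk as it arrives.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)

print()
```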