pip install aurora-data-api
Set up an AWS Aurora Serverless cluster and enable Data API access for it. If you have previously set up an Aurora Serverless cluster, you can enable Data API with the following AWS CLI command:
aws rds modify-db-cluster --db-cluster-identifier DB_CLUSTER_NAME --enable-http-endpoint --apply-immediately
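If you prefer to do this from Python, the same change can be made with boto3; this is a sketch, and the cluster identifier below is a placeholder:

import boto3

# Enable the Data API (HTTP endpoint) on an existing Aurora Serverless cluster.
# "DB_CLUSTER_NAME" is a placeholder; substitute your own cluster identifier.
rds = boto3.client("rds")
rds.modify_db_cluster(
    DBClusterIdentifier="DB_CLUSTER_NAME",
    EnableHttpEndpoint=True,
    ApplyImmediately=True,
)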
Save the database credentials in AWS Secrets Manager using a format expected by the Data API (a JSON object with the keys username and password):

aws secretsmanager create-secret --name rds-db-credentials/MY_DB
aws secretsmanager put-secret-value --secret-id rds-db-credentials/MY_DB --secret-string "$(jq -n '.username=env.PGUSER | .password=env.PGPASSWORD')"
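The same secret can be created with boto3; this is a sketch, and the secret name and the PGUSER/PGPASSWORD environment variables are placeholders:

import json
import os
import boto3

# Store the credentials as a JSON object with "username" and "password" keys,
# the format the Data API expects. Secret name and credential sources are placeholders.
secrets = boto3.client("secretsmanager")
secrets.create_secret(
    Name="rds-db-credentials/MY_DB",
    SecretString=json.dumps({
        "username": os.environ["PGUSER"],
        "password": os.environ["PGPASSWORD"],
    }),
)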
Configure your AWS command line credentials using standard AWS conventions. You can verify that everything works correctly by running a test query via the AWS CLI:
aws rds-data execute-statement --resource-arn RESOURCE_ARN --secret-arn SECRET_ARN --sql "select * from pg_catalog.pg_tables"
- Here, RESOURCE_ARN refers to the Aurora RDS database ARN, which can be found in the AWS RDS Console (click on your database, then "Configuration") or in the CLI by running aws rds describe-db-clusters. SECRET_ARN refers to the AWS Secrets Manager secret created above.
- When running deployed code (on an EC2 instance, ECS/EKS container, or Lambda), you can use the managed IAM policy AmazonRDSDataFullAccess to grant your IAM role permissions to access the RDS Data API. While this policy is convenient for testing, we recommend that you create your own scoped-down, least-privilege policy for production applications.
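If you would rather run the same smoke test from Python, a boto3 sketch looks roughly like this; the cluster identifier and secret ARN below are placeholders, and describe_db_clusters is used to look up the cluster ARN:

import boto3

# Look up the cluster ARN (RESOURCE_ARN); "DB_CLUSTER_NAME" is a placeholder.
rds = boto3.client("rds")
cluster = rds.describe_db_clusters(DBClusterIdentifier="DB_CLUSTER_NAME")["DBClusters"][0]
resource_arn = cluster["DBClusterArn"]

# SECRET_ARN is the Secrets Manager secret created above (placeholder value here).
secret_arn = "arn:aws:secretsmanager:us-east-1:123456789012:secret:rds-db-credentials/MY_DB"

# Run the same test query that the CLI example above runs.
rds_data = boto3.client("rds-data")
response = rds_data.execute_statement(
    resourceArn=resource_arn,
    secretArn=secret_arn,
    sql="select * from pg_catalog.pg_tables",
)
print(response["records"])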
Use this module as you would use any DB-API compatible driver module. The aurora_data_api.connect() method is the standard main entry point, and accepts two implementation-specific keyword arguments:

- aurora_cluster_arn (also referred to as resourceArn in the Data API documentation)
  - If not given as a keyword argument, this can also be specified using the AURORA_CLUSTER_ARN environment variable
- secret_arn (the database credentials secret)
  - If not given as a keyword argument, this can also be specified using the AURORA_SECRET_ARN environment variable
import aurora_data_api
cluster_arn = "arn:aws:rds:us-east-1:123456789012:cluster:my-aurora-serverless-cluster"
secret_arn = "arn:aws:secretsmanager:us-east-1:123456789012:secret:rds-db-credentials/MY_DB"
with aurora_data_api.connect(aurora_cluster_arn=cluster_arn, secret_arn=secret_arn, database="my_db") as conn:
    with conn.cursor() as cursor:
        cursor.execute("select * from pg_catalog.pg_tables")
        print(cursor.fetchall())
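If the ARNs are not passed as keyword arguments, they are read from the environment variables listed above. A minimal sketch, assuming AURORA_CLUSTER_ARN and AURORA_SECRET_ARN are already set (the values below are placeholders):

import os
import aurora_data_api

# Placeholders; in practice these would be exported in the environment rather than set here.
os.environ.setdefault("AURORA_CLUSTER_ARN", "arn:aws:rds:us-east-1:123456789012:cluster:my-aurora-serverless-cluster")
os.environ.setdefault("AURORA_SECRET_ARN", "arn:aws:secretsmanager:us-east-1:123456789012:secret:rds-db-credentials/MY_DB")

# With the environment variables set, connect() only needs the database name.
with aurora_data_api.connect(database="my_db") as conn:
    with conn.cursor() as cursor:
        cursor.execute("select count(*) from pg_catalog.pg_tables")
        print(cursor.fetchone())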
The cursor supports iteration (and automatically wraps the query in a server-side cursor and paginates it if required):
with conn.cursor() as cursor:
    for row in cursor.execute("select * from pg_catalog.pg_tables"):
        print(row)
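Parameterized queries work as with any DB-API driver. The sketch below assumes the module's paramstyle is "named" (check aurora_data_api.paramstyle to confirm); the schema name is just an example value:

with conn.cursor() as cursor:
    # Named-style placeholder (:schema), assuming aurora_data_api.paramstyle == "named".
    cursor.execute(
        "select * from pg_catalog.pg_tables where schemaname = :schema",
        {"schema": "public"},
    )
    print(cursor.fetchall())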
The RDS Data API is the link between the AWS Lambda serverless environment and the sophisticated features provided by PostgreSQL and MySQL. The Data API tunnels SQL over HTTP, which has several advantages in the context of AWS Lambda (a minimal handler sketch follows this list):
- It eliminates the need to open database ports to the AWS Lambda public IP address pool
- It uses stateless HTTP connections instead of stateful internal TCP connection pools used by most database drivers (the stateful pools become invalid after going through AWS Lambda freeze-thaw cycles, causing connection errors and burdening the database server with abandoned invalid connections)
- It uses AWS role-based authentication, eliminating the need for the Lambda to handle database credentials directly
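A minimal Lambda handler sketch, under these assumptions: the function's execution role has Data API permissions (for example AmazonRDSDataFullAccess while testing), the AURORA_CLUSTER_ARN and AURORA_SECRET_ARN environment variables are set on the function, and the database name is a placeholder:

import aurora_data_api

def lambda_handler(event, context):
    # Authentication uses the function's IAM role; the cluster and secret ARNs are read
    # from the AURORA_CLUSTER_ARN and AURORA_SECRET_ARN environment variables.
    with aurora_data_api.connect(database="my_db") as conn:
        with conn.cursor() as cursor:
            cursor.execute("select * from pg_catalog.pg_tables")
            return {"table_count": len(cursor.fetchall())}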
- Project home page (GitHub)
- Documentation (Read the Docs)
- Package distribution (PyPI)
- Change log
- sqlalchemy-aurora-data-api, a SQLAlchemy dialect that uses aurora-data-api
Please report bugs, issues, feature requests, etc. on GitHub.
Licensed under the terms of the Apache License, Version 2.0.