AWS Lambda Rust docker builder
This docker image extends the lambci provided builder docker image, a faithful reproduction of the actual AWS "provided" Lambda runtime environment, and installs rustup and the stable Rust toolchain.
Tags for this docker image follow the naming convention softprops/lambda-rust:{version}-rust-{rust-stable-version}, where {rust-stable-version} is a stable version of Rust.
You can find a list of available docker tags here
💡 If you don't find the version you're looking for, please open a new GitHub issue requesting that one be published.
You can also depend directly on softprops/lambda-rust:latest
for the most recently published version.
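For example, pulling a pinned tag might look like the following (the version numbers here are only illustrative; check the published tags for real ones).
# pin to a specific builder and rust release (hypothetical tag)
$ docker pull softprops/lambda-rust:0.2.0-rust-1.43.0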
The default docker entrypoint will build a packaged, release-optimized version of your Rust artifact under target/lambda/release to isolate the lambda-specific build artifacts from your host-local build artifacts.
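After a successful build you should find a zip named after your binary in that directory, roughly like the following sketch.
# verify the packaged artifact exists
$ ls target/lambda/release/{your-binary-name}.zip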
⚠️ Note: you can switch from the release profile to a custom profile like dev by providing a PROFILE environment variable set to the name of the desired profile, e.g. -e PROFILE=dev in your docker run.
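For example, a dev-profile build might look like the following.
$ docker run --rm \
-e PROFILE=dev \
-v ${PWD}:/code \
softprops/lambda-rust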
⚠️ Note: you can include debug symbols in optimized release build binaries by setting the DEBUGINFO environment variable. By default, debug symbols will be stripped from the release binary and set aside in a separate .debug file.
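For example, to keep debug symbols in the release binary (setting the variable to 1 is an assumption here; the note above only says DEBUGINFO must be set).
$ docker run --rm \
-e DEBUGINFO=1 \
-v ${PWD}:/code \
softprops/lambda-rust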
You will want to volume mount /code
to the directory containing your cargo project.
You can pass additional flags to cargo, the Rust build tool, by setting the CARGO_FLAGS docker environment variable.
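For example, to forward extra arguments through to cargo (the feature name below is hypothetical).
$ docker run --rm \
-e CARGO_FLAGS="--features my-feature" \
-v ${PWD}:/code \
softprops/lambda-rust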
A typical docker run might look like the following.
$ docker run --rm \
-v ${PWD}:/code \
-v ${HOME}/.cargo/registry:/root/.cargo/registry \
-v ${HOME}/.cargo/git:/root/.cargo/git \
softprops/lambda-rust
💡 The -v (volume mount) flags for /root/.cargo/{registry,git} are optional, but when supplied they provide a much faster turnaround when doing iterative development.
If you are using Windows, the command above may need to be modified to include a BIN environment variable set to the name of the binary to be built and packaged.
$ docker run --rm \
-e BIN={your-binary-name} \
-v ${PWD}:/code \
-v ${HOME}/.cargo/registry:/root/.cargo/registry \
-v ${HOME}/.cargo/git:/root/.cargo/git \
softprops/lambda-rust
If you want to set up ad hoc lambda functions, or have another reason not to go with full-blown devops orchestration tools, there's a cargo subcommand to compile your code into a zip file and deploy it to an existing function. Its only dependencies are Rust and Docker.
Setup
$ cargo install cargo-aws-lambda
To compile and deploy in your project directory
$ cargo aws-lambda {your aws function's full ARN} {your-binary-name}
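For example, with a hypothetical function ARN and binary name, that might look like the following.
$ cargo aws-lambda arn:aws:lambda:us-east-1:123456789012:function:my-function my-binary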
To list all options
$ cargo aws-lambda --help
More instructions can be found here.
Once you've built a Rust lambda function artifact, the provided runtime expects deployments of that artifact to be named "bootstrap". The lambda-rust docker image builds a zip file, named after the binary, containing your binary file renamed to "bootstrap".
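You can confirm this by listing the zip's contents; it should contain your binary under the name "bootstrap".
# list the contents of the packaged artifact
$ unzip -l target/lambda/release/{your-binary-name}.zip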
You can invoke this bootstrap executable with the lambci docker image for the provided AWS lambda runtime.
# start a docker container replicating the "provided" lambda runtime
# awaiting an event to be provided via stdin
$ unzip -o \
target/lambda/release/{your-binary-name}.zip \
-d /tmp/lambda && \
docker run \
-i -e DOCKER_LAMBDA_USE_STDIN=1 \
--rm \
-v /tmp/lambda:/var/task \
lambci/lambda:provided
# provide an event payload via stdin (typically a json blob)
# Ctrl-D to yield control back to your function
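For example, you can also pipe a payload into the container instead of typing it interactively (the JSON body below is arbitrary).
$ echo '{"message": "hello"}' | docker run \
-i -e DOCKER_LAMBDA_USE_STDIN=1 \
--rm \
-v /tmp/lambda:/var/task \
lambci/lambda:provided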
Doug Tangren (softprops) 2018