This repository contains a set of scripts that are useful when developing AWS Lambda-backed microservices that rely on AWS CloudFormation and AWS API Gateway.
Install the tools via npm; this makes the following commands available in the directory where you ran the install command (optionally, pass -g to install the commands globally).
npm install lambda-tools
All scripts may make use of a .lambda-tools-rc.json file in the root of the project (that is, the location of the package.json that is closest to the Lambda functions). This allows defining meaningful defaults for the scripts, such as a default stage, region and project name. An example of the contents of said file:
{
    "project": {
        "name": "Project Name"
    },
    "lambda": {
        "runtime": "nodejs4.3"
    },
    "aws": {
        "region": "us-east-1",
        "stage": "dev"
    }
}
These defaults are used for deployment and for running the service locally, which is useful, for example, when creating dynamic resource names that rely on the stage and project names.
In order for the scripts to work properly, the following structure is assumed for a service:
.
├── api.json - Swagger API definition (optional), used by lambda deploy and run
├── cf.json - CloudFormation template, shouldn't include Lambda functions, API Gateway or IAM roles
├── lambda_policies.json - Additional AWS IAM policies for the Lambda functions (optional)
├── package.json - By default all services are assumed to be NPM packages
├── .lambda-tools-rc.json - Configuration file for lambda-tools, can contain default values for scripts to use
└── lambdas
    └── lambda_name
        ├── cf.json - Overrides for Lambda function properties (such as memory size or timeout length)
        ├── index.js - Default entrypoint for the Lambda function (can be overridden by specifying a handler in cf.json)
        └── package.json
As all Lambda functions are bundled and compressed during deployment, it is safe to share common code between Lambda functions at the top level of the microservice, for example in a directory called common or lib. This structure is easier to achieve using Yeoman and the generator-lambda-tools generators.
A minimal example of a service is implemented under examples/microservice.
All scripts assume that AWS credentials have been configured in a way that is reachable by the AWS Node.js SDK. Lambda Tools does not offer a way to provide custom credentials.
The user executing the scripts should be able to perform the following actions:
setup
- iam:GetRole
- iam:CreateRole
- iam:PutRolePolicy
- lambda:GetFunction
- lambda:CreateFunction
- lambda:UpdateFunctionCode
- lambda:UpdateFunctionConfiguration
- lambda:GetAlias
- lambda:UpdateAlias
- lambda:CreateAlias

deploy
- s3:CreateBucket
- s3:PutObject
- cloudformation:DescribeStacks
- cloudformation:UpdateStack
- cloudformation:CreateStack
- lambda:* - Required transitively by CloudFormation for creating the Lambda functions
- apigateway:* - Required transitively by CloudFormation for creating the API Gateway instance
- Any permissions that are required by resources in the CloudFormation template

deploy-single
- lambda:UpdateFunctionCode

run
- N/A

execute
- N/A
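As a sketch, the setup permissions listed above could be granted with an IAM policy along these lines. The wildcard resource is a placeholder; in practice you should scope it down to the roles and functions in your account.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "iam:GetRole",
                "iam:CreateRole",
                "iam:PutRolePolicy",
                "lambda:GetFunction",
                "lambda:CreateFunction",
                "lambda:UpdateFunctionCode",
                "lambda:UpdateFunctionConfiguration",
                "lambda:GetAlias",
                "lambda:UpdateAlias",
                "lambda:CreateAlias"
            ],
            "Resource": "*"
        }
    ]
}
```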
This step should only ever be run once per AWS account and region combination. The step creates the necessary Lambda function that acts as the CloudFormation resource for all stacks created by lambda-tools. If no region is defined, us-east-1 is assumed. If this step is skipped, services with an api.json file will fail to deploy.
lambda setup [options]
Deploying a service to AWS
lambda deploy [options]
Deploying a service goes through multiple steps:
- Locally processes the Lambda functions, using browserify and uglify to optimise the performance of the resulting functions
- Completes the CloudFormation template in cf.json, which is used for raising/updating the stack on AWS
- Uploads the Lambda function code, the API definition (if any) and the compiled CloudFormation template to S3
- Creates/updates the CloudFormation stack using the template and assets in S3
A single Lambda function can be deployed without using CloudFormation via lambda deploy-single. This simply updates the Lambda function code; the script assumes that the Lambda function already exists.
lambda deploy-single function-name [options]
Deploying a single Lambda function directly to AWS Lambda. The function is processed as described in deploy, reducing its size, but is not uploaded to S3. The handler is assumed to be index.handler; the entrypoint file can be changed via the -f option.
Both deploy and deploy-single implement caching logic to avoid the costly transpiling of Lambda functions. The cache generates a manifest for the Lambda function by bundling all of its code into a single file and generating a checksum of it. The manifest also includes a dependency tree, which is used for --exclude. If the manifest matches the previous deployment, the ZIP file is reused. To circumvent this reuse policy, use the --clean flag, which forces rebundling/transpiling.
lambda execute [options] lambda-function
Execute a single Lambda function. The lambda-function argument can be specified in multiple ways:
- As a file path, in which case the file is assumed to be the module that exports the handler function
- As a file name, in which case the file is expected to exist in the current directory
- If executed inside of a service, lambda-function can be the name of the function to execute, i.e. the name of the subdirectory in lambdas where the Lambda function is

In any of these cases, an event file is located as follows:
- Relative to the Lambda handler file location, if there is an event.json in the same directory
- If the -e option is used, the given path is checked relative to the current working directory
- If neither of those exists, an empty event is used as a fallback; the same applies if either file fails to parse as valid JSON
By default, the event file is assumed to be event.json and the timeout is set to 6 seconds. The environment is empty (i.e. the running environment is not mirrored).
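For instance, a hypothetical event.json placed next to the handler could contain any JSON payload the function expects; the fields below are purely illustrative:

```json
{
    "name": "test",
    "source": "local"
}
```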
lambda run [options]
Running a service locally. This should be used strictly for development purposes, as the code that simulates AWS is imperfect (at best) and is not guaranteed to respond exactly like the actual Lambda environment. It does, however, do its best to allow locally debugging Lambda functions sitting behind API Gateway.
The command starts a local server, which parses the API spec (defaults to ./api.json) and creates the appropriate routes; all invalid routes return 404. The server also mimics AWS's logic in creating the integration (i.e. it maps the incoming HTTP request into an AWS Lambda integration), as well as in mapping the result of the Lambda function into an appropriate HTTP response.
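Conceptually, the two mapping directions can be sketched as below. This is an assumption-laden illustration of the idea, not the actual integration templates: the field names on the event and the error handling are made up for the example.

```javascript
'use strict';

// Illustrative sketch of what `lambda run` does conceptually: an incoming
// HTTP request is mapped into a Lambda event, the handler is invoked, and
// the handler's result is mapped back into an HTTP response.
function requestToEvent(req) {
    // Field names here are assumptions, not the real integration mapping
    return {
        path: req.path,
        method: req.method,
        query: req.query || {},
        body: req.body || null
    };
}

function resultToResponse(err, result) {
    if (err) {
        // A failed invocation becomes a server error response
        return { statusCode: 500, body: JSON.stringify({ message: String(err) }) };
    }
    return { statusCode: 200, body: JSON.stringify(result) };
}
```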