Serverless with AWS Lambda in action

Application logic

  1. User uploads a csv file to an AWS S3 bucket.
  2. Upon upload, the S3 bucket invokes the lambda function we're about to create.
  3. Our lambda function reads the csv file content, then logs it to the console. In a real-world scenario, this service might do something like validating or formatting the csv data (a sketch of that follows the handler code below).
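
Concretely, the only pieces of the S3 trigger event our function will need are the bucket name and the object key (the full event shape appears in the Testing section below):

// the two fields the handler reads off the event record
const bucket = event.Records[0].s3.bucket.name // e.g. "csv-bucket"
const key = event.Records[0].s3.object.key     // e.g. "uploaded/test.csv"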

Local environment

  • Serverless framework version 1.15.2 (installed globally)
  • Node.js version 6.10.1
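
Both are easy to confirm from the command line:

$ sls --version
$ node --version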

Getting started

Create a new project using Serverless framework

$ sls create --template aws-nodejs --path csv-service
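
This scaffolds a csv-service directory containing, roughly, a stub handler.js and a serverless.yml — the two files we'll edit next.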

Update serverless.yml

service: csv-service
provider:
  name: aws
  runtime: nodejs6.10
  region: ap-northeast-1 # specify region
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:*
      Resource: "arn:aws:s3:::csv-bucket/*"
functions:
  hello:
    handler: handler.hello
    timeout: 20
    events:
      - s3: # instead of an http call, s3 will trigger the event that invokes our lambda function
          bucket: csv-bucket
          event: s3:ObjectCreated:* # the hello function will be invoked automatically upon file upload
          rules:
            - prefix: uploaded/ # only keys under this folder will trigger the event
            - suffix: .csv
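
Note that with this configuration the Serverless framework creates the csv-bucket bucket as part of its CloudFormation stack, so the name has to be globally unique and must not already exist in your account. After deploying, you can double-check that the notification was wired up:

$ aws s3api get-bucket-notification-configuration --bucket csv-bucket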

Write our lambda function

// handler.js
'use strict'

const aws = require('aws-sdk');
const s3 = new aws.S3();

module.exports.hello = (event, context, callback) => {
  console.log(JSON.stringify(event)) // we will need this later for testing locally

  const Bucket = event.Records[0].s3.bucket.name
  const Key = event.Records[0].s3.object.key

  s3.getObject({ Bucket, Key }, (err, data) => {
    if (err) {
      callback(new Error(`error getting file ${err}`))
    } else {
      console.log(String(data.Body)) // log the csv content
      callback(null, {}) // signal success only after the file has been read
    }
  })
}
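
Step 3 of the application logic mentioned validating the csv data. As a minimal sketch of what that could look like (the validateCsv helper and the expected column count are invented for illustration, not part of this service), the success branch above could call something like:

// hypothetical validation step: checks every row has the expected number of columns
const validateCsv = (body, expectedColumns) => {
  const rows = String(body).trim().split('\n')
  const invalid = rows.filter(row => row.split(',').length !== expectedColumns)
  return { total: rows.length, invalid: invalid.length }
}

// e.g. inside the getObject callback, instead of just logging:
// const report = validateCsv(data.Body, 3)
// console.log(`${report.invalid} of ${report.total} rows are malformed`)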

Deploy

As easy as pie

$ sls deploy

Now our csv service should be working. After a csv file is uploaded (to the uploaded folder of the csv-bucket bucket), go to CloudWatch and you should see the csv file content in the logs.
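
For example, assuming you have the aws CLI installed and configured, you can trigger the function by uploading a file:

$ aws s3 cp test.csv s3://csv-bucket/uploaded/test.csv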

Testing

To test our function locally without deploying on every code change, we'll first create an event.json file and paste in the event object that was logged when the lambda function was invoked. It looks like this:
{
  "Records": [
      {
          "eventVersion": "2.0",
          "eventSource": "aws:s3",
          "awsRegion": "ap-northeast-1",
          "eventTime": "2017-06-12T07:55:21.559Z",
          "eventName": "ObjectCreated:Put",
          "userIdentity": {
              "principalId": "AWS:RNDSTRING:user"
          },
          "requestParameters": {
              "sourceIPAddress": "*.*.*.*"
          },
          "responseElements": {
              "x-amz-request-id": "RNDSTRING",
              "x-amz-id-2": "RNDSTRING"
          },
          "s3": {
              "s3SchemaVersion": "1.0",
              "configurationId": "*-*-*-*-*",
              "bucket": {
                  "name": "csv-bucket",
                  "ownerIdentity": {
                      "principalId": "RNDSTRING"
                  },
                  "arn": "arn:aws:s3:::csv-bucket"
              },
              "object": {
                  "key": "uploaded/test.csv",
                  "size": 659215,
                  "eTag": "RNDSTRING",
                  "sequencer": "00593E48E971F07485"
              }
          }
      }
  ]
}

With that in place, invoke the function locally:

$ sls invoke local -f hello -p event.json

Debug

Set a debugger statement just before the data is logged to the console.

debugger
console.log(String(data.Body))
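
In context, the success branch of the getObject callback becomes:

} else {
  debugger // the inspector will pause execution here
  console.log(String(data.Body))
  callback(null, {})
}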

Then run the sls binary through node with the inspector attached

$ node --inspect --debug-brk /usr/local/bin/sls invoke local -f hello -p event.json
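
(On newer Node.js versions, the --inspect --debug-brk pair has been replaced by the single --inspect-brk flag.)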

Open the chrome-devtools:// url printed in the terminal in a new Chrome tab. That's it, happy hacking :)