sample-mnist-digit-recognition-sagemaker


MNIST handwritten digit recognition model running on a local SageMaker endpoint

Key            Value
Environment    LocalStack
Services       S3, SageMaker, Lambda
Integrations   AWS SDK
Categories     Serverless, S3 website, Lambda function URLs, SageMaker, Machine Learning, JavaScript, Python
Level          Intermediate

Introduction

This is a sample application that demonstrates how to use SageMaker on LocalStack. A simple web frontend allows users to draw a digit and submit it to a locally running SageMaker endpoint. The endpoint returns a prediction of the digit, which is then displayed in the web frontend. Request handling is performed by a Lambda function, accessible via a function URL, that uses the SageMaker SDK to invoke the endpoint.
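
To give a sense of the moving parts, the core of such a handler might look roughly like the sketch below. This is illustrative only: the endpoint name, payload shape, and response handling are assumptions, and the handler shipped with this sample may differ.

# Illustrative sketch of the request-handling Lambda (not the repository's
# actual handler; endpoint name and payload format are assumptions).
import boto3

sagemaker_runtime = boto3.client("sagemaker-runtime")
ENDPOINT_NAME = "mnist-endpoint"  # hypothetical; the deploy script defines the real name

def handler(event, context):
    # The frontend POSTs the drawn digit to the function URL; the request body
    # is forwarded to the SageMaker endpoint as-is.
    response = sagemaker_runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=event["body"],
    )
    # The endpoint's prediction is returned to the frontend.
    prediction = response["Body"].read().decode("utf-8")
    return {"statusCode": 200, "body": prediction}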

Here's a short summary of AWS service features we use:

  • S3 website
  • Lambda function URLs
  • SageMaker endpoint

Here's the web application in action:

(Screen recording: Screen.Recording.2023-04-27.at.16.04.43.mov)

Architecture overview

Architecture Diagram

Prerequisites

Dev environment

Create a virtualenv and install all the development dependencies there:

python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

If you'd like to perform training locally, you'll need to install the ml dev dependencies as well:

pip install -r ml/requirements.txt
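
The repository's training code presumably lives under the ml directory; as a rough orientation, a minimal PyTorch training loop for MNIST typically looks like the sketch below. This is not the actual training script, and the model architecture shown is just an example.

# Minimal MNIST training sketch with PyTorch (illustrative only; see the
# ml directory for the repository's actual training code).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

train_data = datasets.MNIST("data", train=True, download=True, transform=transforms.ToTensor())
loader = DataLoader(train_data, batch_size=64, shuffle=True)

# A small example network; the real model architecture may differ.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(2):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch} loss {loss.item():.4f}")

# The trained weights would then be packaged (e.g. as model.tar.gz) for the
# SageMaker endpoint to serve.
torch.save(model.state_dict(), "model.pt")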

You'll also need Node.js and npm installed to build the web application. You can install them via nvm: https://github.com/nvm-sh/nvm

Download the PyTorch container image

For inference, we use the PyTorch inference container image from Amazon ECR:

aws ecr get-login-password --region eu-central-1 | docker login --username AWS --password-stdin 763104351884.dkr.ecr.eu-central-1.amazonaws.com
docker pull 763104351884.dkr.ecr.eu-central-1.amazonaws.com/pytorch-inference:1.10.2-cpu-py38-ubuntu20.04-sagemaker

LocalStack

Start LocalStack Pro with your Auth Token:

PERSISTENCE=1 LOCALSTACK_AUTH_TOKEN=... localstack start

Instructions

First, we install the dependencies for the web application in the web directory:

(cd web; npm install)

You can then create the AWS infrastructure on LocalStack by running the deploy/deploy_app.py script (make sure to have the virtual environment activated):

source .venv/bin/activate
python deploy/deploy_app.py

This script uploads the model to an S3 bucket and creates a SageMaker endpoint that serves it. It also creates a Lambda function that is used to invoke the endpoint. Finally, it builds the web application and creates an S3 website to host it.
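
In outline, the SageMaker-related part of that deployment boils down to a handful of API calls. The following is a simplified sketch against LocalStack; the resource names, role ARN, and instance type are assumptions, and deploy/deploy_app.py remains the authoritative implementation.

# Simplified sketch of the SageMaker deployment steps (illustrative names;
# see deploy/deploy_app.py for the actual implementation).
import boto3

ENDPOINT_URL = "http://localhost.localstack.cloud:4566"  # LocalStack
s3 = boto3.client("s3", endpoint_url=ENDPOINT_URL, region_name="us-east-1")
sagemaker = boto3.client("sagemaker", endpoint_url=ENDPOINT_URL, region_name="us-east-1")

# 1. Upload the packaged model artifact to S3.
s3.create_bucket(Bucket="mnist-model-bucket")
s3.upload_file("model.tar.gz", "mnist-model-bucket", "model.tar.gz")

# 2. Register the model, pointing at the PyTorch inference image pulled earlier.
image = "763104351884.dkr.ecr.eu-central-1.amazonaws.com/pytorch-inference:1.10.2-cpu-py38-ubuntu20.04-sagemaker"
sagemaker.create_model(
    ModelName="mnist-model",
    ExecutionRoleArn="arn:aws:iam::000000000000:role/sagemaker-role",
    PrimaryContainer={"Image": image, "ModelDataUrl": "s3://mnist-model-bucket/model.tar.gz"},
)

# 3. Create an endpoint configuration and the endpoint itself.
sagemaker.create_endpoint_config(
    EndpointConfigName="mnist-endpoint-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "mnist-model",
        "InstanceType": "ml.m5.large",
        "InitialInstanceCount": 1,
    }],
)
sagemaker.create_endpoint(
    EndpointName="mnist-endpoint",
    EndpointConfigName="mnist-endpoint-config",
)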

Using the application

Once deployed, visit http://mnist-website.s3-website.localhost.localstack.cloud:4566

Draw a digit on the canvas and click the Predict button.

After a few moments the resulting prediction should be displayed in the box to the right.

Demo Picture

Serverless SageMaker Endpoint

To switch to a serverless SageMaker endpoint, execute the deployment script with the additional -s or --serverless flag:

python deploy/deploy_app.py --serverless
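
In SageMaker terms, a serverless endpoint is expressed through the endpoint configuration: the production variant carries a ServerlessConfig instead of an instance type and count. The sketch below shows what the flag plausibly changes; the memory size and concurrency values are examples, not necessarily what the script uses.

# With --serverless, the endpoint configuration would use a ServerlessConfig
# instead of provisioned instances (values below are examples only).
import boto3

sagemaker = boto3.client(
    "sagemaker",
    endpoint_url="http://localhost.localstack.cloud:4566",
    region_name="us-east-1",
)

sagemaker.create_endpoint_config(
    EndpointConfigName="mnist-endpoint-config-serverless",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "mnist-model",
        "ServerlessConfig": {
            "MemorySizeInMB": 2048,
            "MaxConcurrency": 5,
        },
    }],
)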

License

The code of this sample application is published under the Apache 2.0 license (see LICENSE).

Contributing

We appreciate your interest in contributing to our project and are always looking for new ways to improve the developer experience. We welcome feedback, bug reports, and even feature ideas from the community. Please refer to the contributing file for more details on how to get started.