[badges: Python application test with GitHub Actions | Python application test with AWS Code Build]

Python MLOps Cookbook

This is an example of a Containerized Flask Application that can be the core ingredient in many "recipes", i.e. deployment targets.

[image: mlops-color]

GitHub Container Registry

Feel free to test my ML project: docker pull ghcr.io/noahgift/python-mlops-cookbook:latest
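Once the image is pulled, a quick local run might look like the following; whether the published image listens on port 8080 like the Flask app shown later in this README is an assumption:

docker run -p 127.0.0.1:8080:8080 ghcr.io/noahgift/python-mlops-cookbook:latest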

Assets in repo

Course2-Duke-Flask-Containerized

CLI Tools

There are two CLI tools. First, the main cli.py is the endpoint that serves out predictions. To predict the height of an MLB player, use the following: ./cli.py --weight 180

[image: predict-height-weight]
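For reference, here is a minimal sketch of what a prediction CLI like cli.py could look like. The click library, the model.joblib artifact name, and the scikit-learn model are illustrative assumptions, not necessarily what this repo ships:

#!/usr/bin/env python
# Hypothetical sketch of a height-prediction CLI (not the repo's exact code)
import click
from joblib import load

@click.command()
@click.option("--weight", type=float, required=True, help="Player weight in pounds")
def predict(weight):
    """Predict MLB player height from weight using a pre-trained model."""
    model = load("model.joblib")  # assumed name of the serialized model
    height = float(model.predict([[weight]])[0])  # one sample, one feature
    feet, inches = divmod(round(height), 12)
    click.echo(f"Predicted height: {feet} foot, {inches} inches ({height:.2f} inches)")

if __name__ == "__main__":
    predict()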

The second CLI tool is utilscli.py, which performs model retraining and could serve as the entry point to do more things. For example, this version doesn't change the default model_name, but you could add that as an option by forking this repo.

./utilscli.py retrain --tsize 0.4

Here is an example of retraining the model:

[image: model-retraining]
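A minimal sketch of what that retrain command could look like follows; the CSV data file, the Ridge model, and the joblib artifact are illustrative assumptions:

#!/usr/bin/env python
# Hypothetical sketch of a retraining CLI (not the repo's exact code)
import click
import pandas as pd
from joblib import dump
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

@click.group()
def cli():
    """Utility commands for the height/weight model."""

@cli.command()
@click.option("--tsize", default=0.1, type=float, help="Test split size")
def retrain(tsize):
    """Retrain the model and overwrite the serving artifact."""
    df = pd.read_csv("mlb_htwt.csv")  # assumed data file with Weight/Height columns
    X, y = df[["Weight"]], df["Height"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=tsize)
    model = Ridge().fit(X_train, y_train)
    click.echo(f"R^2 on held-out data: {model.score(X_test, y_test):.3f}")
    dump(model, "model.joblib")

if __name__ == "__main__":
    cli()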

Additionally, you can query the API via the CLI, which allows you to change both the host and the value passed into the API. This is accomplished through the requests library.

./utilscli.py predict --weight 400

[image: predict-cli]
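A sketch of how that query could be wired up with requests follows; the /predict route, the JSON key, and the default host mirror the Flask example below and are assumptions:

# Hypothetical sketch of querying the microservice (not the repo's exact code)
import click
import requests

@click.command()
@click.option("--weight", type=float, required=True, help="Player weight in pounds")
@click.option("--host", default="http://localhost:8080/predict", help="Prediction endpoint")
def predict(weight, host):
    """Send a weight to the running microservice and print the prediction."""
    result = requests.post(url=host, json={"Weight": weight})
    click.echo(result.json())

if __name__ == "__main__":
    predict()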

Flask Microservice

The Flask ML Microservice can be run in many ways.

Flask Microservice Locally

You can run the Flask Microservice as follows with the command: python app.py

(.venv) ec2-user:~/environment/Python-MLOps-Cookbook (main) $ python app.py 
 * Serving Flask app "app" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: on
INFO:werkzeug: * Running on http://127.0.0.1:8080/ (Press CTRL+C to quit)
INFO:werkzeug: * Restarting with stat
WARNING:werkzeug: * Debugger is active!
INFO:werkzeug: * Debugger PIN: 251-481-511
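For reference, a minimal sketch of what an app.py behind this output could look like; the /predict route, the payload key, and the model.joblib artifact are assumptions consistent with the sketches above:

# Hypothetical sketch of the Flask microservice (not the repo's exact code)
from flask import Flask, jsonify, request
from joblib import load

app = Flask(__name__)

@app.route("/")
def home():
    return "<h3>Predict MLB player height from weight</h3>"

@app.route("/predict", methods=["POST"])
def predict():
    """Return a height prediction for the posted weight (pounds)."""
    payload = request.json            # e.g. {"Weight": 180}
    model = load("model.joblib")      # assumed serialized model
    height = float(model.predict([[payload["Weight"]]])[0])
    feet, inches = divmod(round(height), 12)
    return jsonify({"prediction": {
        "height_inches": round(height, 2),
        "height_human_readable": f"{feet} foot, {inches} inches"}})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080, debug=True)  # matches the log above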

To serve a prediction against the application, run the predict.sh script.

(.venv) ec2-user:~/environment/Python-MLOps-Cookbook (main) $ ./predict.sh                             
Port: 8080
{
  "prediction": {
    "height_human_readable": "6 foot, 2 inches", 
    "height_inches": 73.61
  }
}

Containerized Flask Microservice

Here is an example of how to build the container and run it locally; this is the contents of predict.sh:

#!/usr/bin/env bash

# Build image
#change tag for a new container registry, e.g. gcr.io/bob
docker build --tag=noahgift/mlops-cookbook . 

# List docker images
docker image ls

# Run flask app
docker run -p 127.0.0.1:8080:8080 noahgift/mlops-cookbook

Automatically Build Container via GitHub Actions and Push to GitHub Container Registry

To set up the container build process, do the following. This is also covered in greater detail by Alfredo Deza in the book Practical MLOps.

  build-container:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Login to GitHub registry
      uses: docker/login-action@v1
      with:
        registry: ghcr.io
        username: ${{ github.repository_owner }}
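        # BUILDCONTAINERS is a repository secret; pushing to ghcr.io needs a
        # token allowed to write packages (assumption: a personal access token
        # with the write:packages scope)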
        password: ${{ secrets.BUILDCONTAINERS }}
    - name: build flask app
      uses: docker/build-push-action@v2
      with:
        context: ./
        #tags: alfredodeza/flask-roberta:latest
        tags: ghcr.io/noahgift/python-mlops-cookbook:latest
        push: true 
    

[image: container-registry]

Automatically Build Container via GitHub Actions and Push to Docker Hub Container Registry

Build Targets

Because this project uses DevOps/MLOps best practices, including linting, testing, and automated deployment, it can serve as the base for many deployment targets.

[In progress....]

Other Tools and Frameworks

[In progress....]

FastAPI
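As a starting point, a hypothetical FastAPI port of the Flask /predict route could look like the sketch below; the route, payload key, and model artifact mirror the assumptions used earlier:

# Hypothetical sketch of a FastAPI port of the prediction endpoint
from fastapi import FastAPI
from joblib import load
from pydantic import BaseModel

app = FastAPI()

class Measurement(BaseModel):
    Weight: float  # player weight in pounds

@app.post("/predict")
def predict(measurement: Measurement):
    """Return a height prediction for the posted weight."""
    model = load("model.joblib")  # assumed serialized model
    height = float(model.predict([[measurement.Weight]])[0])
    feet, inches = divmod(round(height), 12)
    return {"prediction": {"height_inches": round(height, 2),
                           "height_human_readable": f"{feet} foot, {inches} inches"}}

You could serve it with, for example, uvicorn app:app --port 8080 (module name assumed).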

AWS

Elastic Beanstalk

AWS Lambda Recipes

Install SAM as documented here; AWS Cloud9 has it installed already.

You can find the recipes here.

AWS Lambda-SAM Local

[image: sam-directory-layout]
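For orientation, a minimal sketch of what a SAM-deployed Lambda handler for this prediction could look like; the event shape and the stand-in linear coefficients are purely illustrative assumptions, not the repo's model:

import json

def lambda_handler(event, context):
    """Hypothetical handler: predict height from a weight posted via API Gateway."""
    body = json.loads(event.get("body") or "{}")
    weight = float(body.get("Weight", 180))
    # A real deployment would load a trained model (e.g. from the container
    # image); a stand-in linear formula keeps this sketch self-contained.
    height = 0.08 * weight + 59.0  # placeholder coefficients
    feet, inches = divmod(round(height), 12)
    return {
        "statusCode": 200,
        "body": json.dumps({
            "height_inches": round(height, 2),
            "height_human_readable": f"{feet} foot, {inches} inches",
        }),
    }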

AWS Lambda-SAM Containerized Deploy

Follow the recipe in the recipes section.

[image: sam-guided-deploy]

Once deployed, an easy way to verify the image is via the AWS Console.

[image: invoke-lambda-console]

A great way to test the API Endpoint is with the Cloud9 Environment:

[image: invoke-api-gateway]

Another way is with the tool "Postman":

[image: post-man]

AWS App Runner

Watch a YouTube Walkthrough on AWS App Runner for this repo here: https://www.youtube.com/watch?v=zzNnxDTWtXA

[image: mlops]

AWS Copilot

Follow the setup here and then deploy the project using the CLI: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/getting-started-aws-copilot-cli.html

GCP

Cloud Run (CaaS: Container as a Service)

It is trivial (once you select a project):

gcloud config set project <yourprojectname>

A. Get a GCP account
B. Check out the project
C. Run gcloud run deploy inside the project
D. Verify it works by using ./utilscli.py, as shown below
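For example, pointing the CLI at the deployed service might look like this; the --host flag name and the Cloud Run URL are hypothetical placeholders:

./utilscli.py predict --weight 200 --host https://<your-cloud-run-service>.run.app/predict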

[image: gcp-cloud-run]

App Engine

GKE (Kubernetes)

Azure App Services

Production Patterns

[In progress....]

  • Cached model (deploy)
  • Load-testing (see the locustfile sketch below)
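For the load-testing item, a minimal sketch using the locust framework could look like this; the endpoint path and payload key are assumptions matching the earlier examples:

# Hypothetical locustfile for load-testing the prediction endpoint
from locust import HttpUser, task

class PredictionUser(HttpUser):
    @task
    def predict(self):
        # Post a sample weight to the assumed /predict route
        self.client.post("/predict", json={"Weight": 180})

Run it with, for example, locust -f locustfile.py --host http://localhost:8080.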

Data Science Workflow

[image: mlb-ht-wt]

This repository is focused on MLOps. To see more about data storytelling, you can go to this GitHub repo on Data Storytelling.

Next Steps: Take Coursera MLOps Course

[image: cloud-specialization]