This repository contains the source code for my Medium article titled
How to Deploy a Machine Learning API on AWS Lightsail
Medium Link: https://aws.plainenglish.io/how-to-deploy-a-machine-learning-api-on-aws-lightsail-151052470b7d
The repository file structure is given below:
├── app.py
├── docker-compose.yml
├── Dockerfile
├── download.py
├── LICENSE
├── README.md
├── requirements.txt
└── utils.py
The app.py file contains the source code for building and running our FastAPI application. The API has three endpoints (a sketch of the file follows the list below):
- The GET endpoint '/', which is just the homepage.
- The '/document-classifier' endpoint, which takes a PDF file as input and returns a JSON response with the classes the PDF file belongs to.
- The '/classify-image' endpoint, which takes an image file as input and returns a JSON response with the classes the image file belongs to.
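A minimal sketch of how such an app.py could be laid out is shown below. The helper names `classify_pdf` and `classify_image` are assumptions for illustration; the real implementation lives in the repository.

```python
# app.py (sketch) -- helper names are hypothetical; see the repository for the real code
from fastapi import FastAPI, File, UploadFile

from utils import classify_pdf, classify_image  # assumed helper names from utils.py

app = FastAPI(title="Document and Image Classifier API")


@app.get("/")
def home():
    # Simple homepage endpoint
    return {"message": "Welcome to the ML classification API"}


@app.post("/document-classifier")
async def document_classifier(file: UploadFile = File(...)):
    # Read the uploaded PDF and return its predicted classes as JSON
    pdf_bytes = await file.read()
    return {"filename": file.filename, "classes": classify_pdf(pdf_bytes)}


@app.post("/classify-image")
async def image_classifier(file: UploadFile = File(...)):
    # Read the uploaded image and return its predicted classes as JSON
    image_bytes = await file.read()
    return {"filename": file.filename, "classes": classify_image(image_bytes)}
```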
The docker-compose.yml file is a config file for Docker Compose. It allows you to deploy, combine, and configure multiple Docker containers at the same time. In this case, I used it to run the project's Docker container locally.
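A minimal docker-compose.yml for running the container locally could look like the sketch below; the service name and the host port mapping (8080 to the container's port 8000) are assumptions based on the ports mentioned elsewhere in this README.

```yaml
# docker-compose.yml (sketch) -- service name and port mapping are assumptions
version: "3.8"
services:
  api:
    build: .           # build the image from the Dockerfile in this directory
    ports:
      - "8080:8000"    # map host port 8080 to the container's exposed port 8000
```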
The Dockerfile contains all the commands needed to build and run our Docker image.
The Dockerfile performs the following steps (a sketch of such a Dockerfile follows the list):
- Grabs the python:3.8.13-slim-bullseye image from Docker Hub. This is called the base image.
- Creates a work directory called app.
- Installs the poppler-utils package. This package is needed on Linux in order for us to work with and manipulate PDF files.
- Upgrades the setuptools library.
- Installs the CPU version of PyTorch and the other requirements.
- Copies all the files from our current directory to the Docker builder.
- Exposes port 8000. This is an important step that allows our container to be reached on AWS Lightsail.
- Runs the download.py file.
- Launches the FastAPI application using uvicorn.
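A Dockerfile following those steps might look like the sketch below. The exact package versions, index URLs, and file layout are assumptions; only the overall sequence mirrors the list above.

```dockerfile
# Dockerfile (sketch) -- follows the steps listed above; exact commands may differ
FROM python:3.8.13-slim-bullseye

WORKDIR /app

# poppler-utils is required for working with and manipulating PDF files
RUN apt-get update && apt-get install -y poppler-utils && rm -rf /var/lib/apt/lists/*

RUN pip install --upgrade setuptools

# CPU-only PyTorch plus the remaining requirements
COPY requirements.txt .
RUN pip install torch --extra-index-url https://download.pytorch.org/whl/cpu \
    && pip install -r requirements.txt

# Copy the rest of the project files into the image
COPY . .

# Port that AWS Lightsail (and Docker Compose locally) will reach
EXPOSE 8000

# Download the pre-trained model at build time
RUN python download.py

# Launch the FastAPI application with uvicorn
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8000"]
```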
The download.py file downloads the pre-trained model from Hugging Face.
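One plausible shape for download.py is sketched below: instantiating Hugging Face pipelines at build time caches the pre-trained weights inside the image, so the API does not have to fetch them at request time. The checkpoint names here are placeholders, not necessarily the ones used in the article.

```python
# download.py (sketch) -- model checkpoints are placeholders, not the article's actual models
from transformers import pipeline

# Instantiating the pipelines downloads and caches the pre-trained weights
pipeline("zero-shot-classification", model="typeform/distilbert-base-uncased-mnli")
pipeline("image-classification", model="google/vit-base-patch16-224")
```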
The requirements.txt file lists the dependencies needed by our API.
The utils.py file contains the functions that generate the classifications for the PDF and image files.
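A sketch of what such helpers could look like is shown below, using pdf2image (which is why poppler-utils is installed in the Dockerfile) and a Hugging Face pipeline. The function names match the app.py sketch above and the model checkpoint is a placeholder; the repository's actual approach may differ.

```python
# utils.py (sketch) -- function names and model are assumptions matching the app.py sketch above
import io

from pdf2image import convert_from_bytes  # needs poppler-utils installed on the system
from PIL import Image
from transformers import pipeline

# Placeholder checkpoint; the article's actual model may differ
image_classifier = pipeline("image-classification", model="google/vit-base-patch16-224")


def classify_image(image_bytes: bytes) -> list:
    """Return the predicted classes (labels and scores) for an uploaded image."""
    image = Image.open(io.BytesIO(image_bytes)).convert("RGB")
    return image_classifier(image)


def classify_pdf(pdf_bytes: bytes) -> list:
    """Render each PDF page to an image and classify every page."""
    pages = convert_from_bytes(pdf_bytes)
    return [image_classifier(page) for page in pages]
```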
Running on Local Machine with Docker Compose
You can also run the application in a Docker container using Docker Compose (if you have it installed):
- Clone the repository:
git clone https://github.com/Nneji123/Deploy-ML-Models-using-FastAPI-and-AWS-Lightsail.git
- Change the directory:
cd Deploy-ML-Models-using-FastAPI-and-AWS-Lightsail
- Run the Docker Compose command:
docker compose up --build
You should then be able to view the API on port 8080.
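Once the container is up, a quick way to exercise the endpoints from Python is sketched below (assuming the requests library is installed). The form field name "file" and the local file name are assumptions matching the app.py sketch above.

```python
# Quick smoke test against the locally running container (port 8080 as above)
import requests

# Homepage
print(requests.get("http://localhost:8080/").json())

# Classify an image (replace example.jpg with any image on your machine)
with open("example.jpg", "rb") as f:
    resp = requests.post(
        "http://localhost:8080/classify-image",
        files={"file": ("example.jpg", f, "image/jpeg")},
    )
print(resp.json())
```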
Running in a Gitpod Cloud Environment
Click the button below to start a new development environment:
For a detailed explanation of the deployment, read my article on Medium: https://aws.plainenglish.io/how-to-deploy-a-machine-learning-api-on-aws-lightsail-151052470b7d