My repo for the fourth nanodegree project - Operationalize a Machine Learning Microservice API.
This project operationalises a Python Flask app that serves predictions (inference) about housing prices through API calls.
It contains the following files/folders:
- `.circleci` folder - CircleCI configuration
- `output_txt_files` folder - Two `.txt` files with terminal results from running `run_docker.sh` and `run_kubernetes.sh`
- `templates` folder - frontend templates for Flask app
- `app.py` - Flask app
- `run_docker.sh` - run Docker container locally
- `run_kubernetes.sh` - run Docker container with Kubernetes (minikube)
- `upload_docker.sh` - upload Docker image to repository (DockerHub)
- `make_prediction.sh` - run prediction POST requests
- Others - `Dockerfile`, `Makefile`, `requirements.txt`
- Set up a Python virtual environment - run these commands:
python3 -m venv ~/.[name] # create the virtual environment
source ~/.[name]/bin/activate # activate it
(Replace `name` with your preferred environment name)
- Set up the other requirements - Docker, Hadolint, Minikube
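A quick way to confirm those tools are installed and on your `PATH` (the version-check flags below are the standard ones for each tool):

```bash
docker --version      # Docker engine/CLI
hadolint --version    # Dockerfile linter
minikube version      # local Kubernetes
```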
- Install project dependencies and run lint checks for errors in the `Dockerfile` -
make install
make lint
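For reference, the two targets usually boil down to a few shell commands like the sketch below (this assumes the `Makefile` uses pip for dependencies, hadolint for the Dockerfile and pylint for `app.py` - check the `Makefile` itself for the exact commands and flags):

```bash
# make install - install/upgrade the Python dependencies
pip install --upgrade pip
pip install -r requirements.txt

# make lint - check the Dockerfile and the Flask app for errors
hadolint Dockerfile
pylint --disable=R,C app.py
```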
- Run a local container and check that the app launches successfully -
./run_docker.sh
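If you want to see what the script does before running it, a typical `run_docker.sh` looks roughly like this (the image name `housing-prediction` and the 8000-to-80 port mapping are assumptions - use the values in the actual script):

```bash
#!/usr/bin/env bash
dockerpath="housing-prediction"        # hypothetical image name

docker build --tag="${dockerpath}" .   # build the image from the Dockerfile
docker image ls                        # confirm the image was created
docker run -p 8000:80 "${dockerpath}"  # map host port 8000 to the container
```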
- Run prediction -
./make_prediction.sh
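Under the hood this is a plain JSON POST request. A minimal hand-rolled equivalent is sketched below (the port, the `/predict` route and the feature names are illustrative assumptions - `make_prediction.sh` and `app.py` define the real ones):

```bash
#!/usr/bin/env bash
PORT=8000   # assumed host port from the Docker/Kubernetes port mapping

# POST a JSON payload of housing features and print the predicted price
curl -s -X POST "http://localhost:${PORT}/predict" \
  -H "Content-Type: application/json" \
  -d '{"CHAS": {"0": 0}, "RM": {"0": 6.575}, "TAX": {"0": 296.0}}'
```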
- Upload Docker image to remote repository (DockerHub) - run script
./upload_docker.sh
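The upload step is a standard tag-and-push; a sketch of what the script typically does (the DockerHub username and image name are placeholders):

```bash
#!/usr/bin/env bash
dockerpath="<dockerhub-user>/housing-prediction"  # hypothetical repository path

docker login                                      # authenticate with DockerHub
docker tag housing-prediction "${dockerpath}"     # tag the local image
docker push "${dockerpath}"                       # push it to the remote repository
```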
- Start minikube cluster - run
minikube start
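A couple of optional sanity checks to confirm the local cluster is up before deploying:

```bash
minikube status     # cluster components should report Running
kubectl get nodes   # the minikube node should be Ready
```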
- Deploy container with Minikube - run script
./run_kubernetes.sh
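A typical `run_kubernetes.sh` boils down to something like the sketch below (the image path, pod name and ports are assumptions - adjust them to your DockerHub repository and your minikube setup, as noted further down):

```bash
#!/usr/bin/env bash
dockerpath="<dockerhub-user>/housing-prediction"      # hypothetical image path

kubectl run housing-prediction \
  --image="${dockerpath}" --port=80                   # start a pod from the image
kubectl get pods                                      # wait until the pod is Running
kubectl port-forward pod/housing-prediction 8000:80   # forward host 8000 to the pod
```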
- Run prediction -
./make_prediction.sh
- (Extras) Run the Python script alone -
python app.py
All steps are the same as above; modify the `./run_kubernetes.sh` script to match the minikube configuration installed on your machine.
- Amazon AWS - Cloud services
- AWS CLI - AWS Command-line tool
- Cloud9 - Cloud-based IDE
- CircleCI - Cloud-based CI/CD service
- Docker - Containerisation tool
- Kubernetes - Container Orchestration tool
- Minikube - Run Kubernetes clusters locally