MLOps Demo

This repository showcases the MLOps use case of fast and efficient incorporation of production data.

How to set up

First of all, you need the Google Cloud credentials JSON for the Compute Engine default service account. If you don't have one, create it here. The key needs to be named key.json and placed in the backend directory.
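
To sanity-check that the key is valid, a minimal sketch like the one below can load it with the google-auth library. This is an assumption-laden illustration, not part of the repository: it assumes google-auth is installed and that key.json sits in backend/ as described above.

```python
# Minimal sketch: load the service-account key to verify it is readable and valid.
# Assumes the google-auth package is installed and key.json is in backend/.
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "backend/key.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
print(credentials.service_account_email)
```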

Create a Vertex AI Endpoint

  1. Go to Vertex AI Model Registry
  2. Select the newest version
  3. Go to 'Deploy & Test'
  4. Click on 'Deploy to Endpoint'
  5. Enter a name for the new endpoint
  6. Go to 'Model Settings' and select 'n1-standard-2' as the machine type
  7. Click on 'Deploy'
  8. Go to Vertex AI Endpoints and copy the ID of your newly created endpoint (it is needed for the Dockerfile below; a quick test call is sketched after this list).
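
Once the endpoint is deployed, a quick way to test it is the Vertex AI Python SDK. The following is only a sketch: PROJECT_ID, REGION, and the instance payload are placeholders (not taken from this repository), and the actual payload format depends on the deployed model.

```python
# Hedged sketch: send a test prediction to the newly deployed endpoint.
# Assumes the google-cloud-aiplatform package is installed.
from google.cloud import aiplatform

PROJECT_ID = "your-gcp-project"   # placeholder: your GCP project ID
REGION = "europe-west1"           # placeholder: region of the endpoint
ENDPOINT_ID = "1234567890"        # the endpoint ID copied in step 8

aiplatform.init(project=PROJECT_ID, location=REGION)
endpoint = aiplatform.Endpoint(ENDPOINT_ID)

# The instance format below is illustrative; use your model's feature schema.
response = endpoint.predict(instances=[{"feature": 1.0}])
print(response.predictions)
```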

Create Kubeflow Pipelines Instance

  1. Go to Kubeflow Pipelines
  2. Select 'Configure'
  3. Create a new cluster by selecting europe-west1-c as the zone and ticking the box at 'Allow access to the following Cloud APIs'
  4. When the cluster is created, click on 'Deploy'
  5. After deployment, go to AI Platform Pipelines
  6. Click 'Open Pipelines Dashboard' on your newly created instance
  7. Copy the URL (it is needed for the Dockerfile below; a connection sketch follows this list)
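
To verify that the copied URL works, a hedged sketch with the Kubeflow Pipelines SDK (kfp, assumed to be installed) could look like this; the host value is a placeholder for the URL you just copied, and it assumes you are authenticated to Google Cloud:

```python
# Hedged sketch: connect to the Kubeflow Pipelines instance and list its pipelines.
# Assumes the kfp package is installed and Google Cloud credentials are available.
import kfp

KUBEFLOW_URL = "https://<your-instance>.pipelines.googleusercontent.com"  # placeholder

client = kfp.Client(host=KUBEFLOW_URL)
print(client.list_pipelines())
```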

Prepare Dockerfile

  1. Go to backend/Dockerfile
  2. Set ENDPOINT_ID to your Vertex AI endpoint ID
  3. Set KUBEFLOW_URL to your Kubeflow Pipelines instance URL (a hypothetical excerpt is shown after this list)
  4. Start the Docker container
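
For orientation, the relevant lines in backend/Dockerfile might look like the hypothetical excerpt below. Whether these settings are plain ENV entries is an assumption about the file, and the values are placeholders for your own endpoint ID and pipelines URL:

```dockerfile
# Hypothetical excerpt of backend/Dockerfile; values are placeholders, not real IDs.
ENV ENDPOINT_ID=<your-vertex-ai-endpoint-id>
ENV KUBEFLOW_URL=<your-kubeflow-pipelines-url>
```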

After the preparation, you can run the web app by running npm start in the frontend directory.