This project tests the deployment of a hypothetical ML model built with LightGBM.
The model is served as an API with Flask. The service is containerized with Docker and hosted on AWS Elastic Beanstalk. The API endpoint accepts parameters via GET requests from an independent front-end application.
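
A minimal sketch of what the serving script might look like, assuming a saved LightGBM model file `model.txt` and two illustrative numeric features (`feature1`, `feature2`); none of these names are specified in this repo:

```python
# Minimal Flask service sketch: load a LightGBM model and score GET requests.
# Assumptions (not from this repo): the model is saved as model.txt and
# expects two numeric features, here called feature1 and feature2.
import lightgbm as lgb
import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)
booster = lgb.Booster(model_file="model.txt")  # hypothetical saved model

@app.route("/predict", methods=["GET"])
def predict():
    # Read the illustrative feature values from the query string.
    try:
        features = np.array([[
            float(request.args["feature1"]),
            float(request.args["feature2"]),
        ]])
    except (KeyError, ValueError):
        return jsonify(error="feature1 and feature2 must be numeric query parameters"), 400
    prediction = booster.predict(features)
    return jsonify(prediction=float(prediction[0]))

if __name__ == "__main__":
    # Listen on all interfaces so the container port can be mapped by Docker.
    app.run(host="0.0.0.0", port=5000)
```

A front end could then call the endpoint with a plain GET request, e.g. `GET /predict?feature1=1.2&feature2=3.4`.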
- The web app that uses this model as a service can be found here.
A secondary goal is to try out various MLOps model-monitoring tools to keep track of the model's performance over time.
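
Before wiring in a dedicated monitoring tool, a lightweight starting point is to log every request and prediction so drift and latency can be inspected later. A sketch, with the log path and field names chosen arbitrarily here:

```python
# Sketch: append each prediction with its inputs and latency to a JSON-lines log,
# which a monitoring tool (or a simple notebook) can later analyse for drift.
# The log path and field names are illustrative, not from this repo.
import json
import time
from datetime import datetime, timezone

LOG_PATH = "prediction_log.jsonl"  # hypothetical location

def log_prediction(features: dict, prediction: float, started: float) -> None:
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "features": features,
        "prediction": prediction,
        "latency_ms": round((time.perf_counter() - started) * 1000, 2),
    }
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")
```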