Welcome to the Azure Machine Learning examples repository!
- An Azure subscription. If you don't have an Azure subscription, create a free account before you begin.
- A terminal and Python >=3.6,<3.9.
Clone this repository and install required packages:

    git clone https://github.com/Azure/azureml-examples --depth 1
    cd azureml-examples
    pip install --upgrade -r requirements.txt
To create or set up a workspace with the assets used in these examples, run the setup script.

If you do not have an Azure ML workspace, run `python setup-workspace.py --subscription-id $ID`, where `$ID` is your Azure subscription ID. A resource group, Azure ML workspace, and other necessary resources will be created in the subscription.

If you have an Azure ML workspace, install the Azure ML CLI and run `az ml folder attach -w $WS -g $RG`, where `$WS` and `$RG` are the workspace and resource group names.

Run `python setup-workspace.py -h` to see other arguments.
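Under the hood, the setup script amounts to parsing a subscription ID and creating (or fetching) a workspace through the SDK. A minimal sketch of that pattern, assuming `azureml-core` is installed; the argument names and defaults below are illustrative, not the script's actual ones, and the workspace calls are commented out because they need Azure credentials:

```python
import argparse

def parse_args(argv):
    # illustrative arguments; the real script may define more
    parser = argparse.ArgumentParser()
    parser.add_argument("--subscription-id", required=True)
    parser.add_argument("--workspace-name", default="main")          # assumed default
    parser.add_argument("--resource-group", default="azureml-examples")  # assumed default
    return parser.parse_args(argv)

# With credentials available, workspace creation looks roughly like:
#
#   from azureml.core import Workspace
#   args = parse_args(None)
#   ws = Workspace.create(
#       name=args.workspace_name,
#       subscription_id=args.subscription_id,
#       resource_group=args.resource_group,
#       create_resource_group=True,
#       location="eastus",   # assumption
#       exist_ok=True,       # no-op if the workspace already exists
#   )
#   ws.write_config()  # lets later code use Workspace.from_config()
```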
To get started, see the introductory tutorial, which uses Azure ML to:

- run a "hello world" job on cloud compute, demonstrating the basics
- run a series of PyTorch training jobs on cloud compute, demonstrating MLflow tracking and using cloud data
These concepts are sufficient to understand all examples in this repository, which are listed below.
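The "hello world" job reduces to running a trivial script on a compute target. A minimal sketch, assuming `azureml-core`; the submission calls are commented out because they need an attached workspace, and `cpu-cluster` is an assumed compute name:

```python
# hello.py -- the entire "training" script for the introductory job
def main():
    return "hello world"

if __name__ == "__main__":
    print(main())

# Submitting it from a control script looks roughly like this
# (requires a workspace; names here are assumptions):
#
#   from azureml.core import Workspace, Experiment, ScriptRunConfig
#   ws = Workspace.from_config()
#   config = ScriptRunConfig(source_directory=".", script="hello.py",
#                            compute_target="cpu-cluster")
#   run = Experiment(ws, "hello-world").submit(config)
#   run.wait_for_completion(show_output=True)
```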
A lightweight template repository for automating the ML lifecycle can be found here.
| directory | description |
| --- | --- |
| `.cloud` | cloud templates |
| `.github` | GitHub-specific files like Actions workflow YAML definitions and issue templates |
| `notebooks` | interactive Jupyter notebooks for iterative ML development |
| `tutorials` | self-contained directories of end-to-end tutorials |
| `workflows` | self-contained directories of jobs to be run, organized by scenario, then tool, then project |
## Tutorials

## Notebooks
| path | status | description |
| --- | --- | --- |
| notebooks/train-lightgbm-local.ipynb | | use mlflow for tracking local notebook experimentation in the cloud |
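The notebook's tracking pattern, sketched under the assumption that `azureml-mlflow` is installed: point MLflow at the workspace tracking URI, then log as usual. The Azure-specific lines are commented out so the logging loop reads on its own, and the loss values below are dummies, not LightGBM output:

```python
# Azure ML side (assumptions; requires a workspace):
#
#   import mlflow
#   from azureml.core import Workspace
#   ws = Workspace.from_config()
#   mlflow.set_tracking_uri(ws.get_mlflow_tracking_uri())
#   mlflow.set_experiment("train-lightgbm-local")

def training_loop(num_rounds):
    """Stand-in for the LightGBM boosting loop; yields (round, loss)."""
    for i in range(num_rounds):
        loss = 1.0 / (i + 1)  # dummy decreasing loss
        # inside a `with mlflow.start_run():` block, this is where
        # mlflow.log_metric("loss", loss, step=i) would go
        yield i, loss

history = dict(training_loop(3))
```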
## Train
| path | status | description |
| --- | --- | --- |
| workflows/train/deepspeed/cifar/job.py | | train CIFAR-10 using DeepSpeed and PyTorch |
| workflows/train/fastai/mnist-mlproject/job.py | | train fastai resnet18 model on mnist data via mlflow mlproject |
| workflows/train/fastai/mnist/job.py | | train fastai resnet18 model on mnist data |
| workflows/train/fastai/pets/job.py | | train fastai resnet34 model on pets data |
| workflows/train/lightgbm/iris/job.py | | train a lightgbm model on iris data |
| workflows/train/pytorch/mnist-mlproject/job.py | | train a pytorch CNN model on mnist data via mlflow mlproject |
| workflows/train/pytorch/mnist/job.py | | train a pytorch CNN model on mnist data |
| workflows/train/scikit-learn/diabetes-mlproject/job.py | | train sklearn ridge model on diabetes data via mlflow mlproject |
| workflows/train/scikit-learn/diabetes/job.py | | train sklearn ridge model on diabetes data |
| workflows/train/tensorflow/mnist-distributed-horovod/job.py | | train tensorflow CNN model on mnist data distributed via horovod |
| workflows/train/tensorflow/mnist-distributed/job.py | | train tensorflow CNN model on mnist data distributed via tensorflow |
| workflows/train/tensorflow/mnist/job.py | | train tensorflow NN model on mnist data |
| workflows/train/transformers/glue/1-aml-finetune-job.py | | submit GLUE finetuning with the Hugging Face transformers library on Azure ML |
| workflows/train/transformers/glue/2-aml-comparison-of-sku-job.py | | compare training performance of the GLUE finetuning task on differing hardware |
| workflows/train/transformers/glue/3-aml-hyperdrive-job.py | | automatic hyperparameter optimization with the Azure ML HyperDrive library |
| workflows/train/xgboost/iris/job.py | | train xgboost model on iris data |
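The HyperDrive job builds on the plain training job by sampling a hyperparameter search space. A sketch of the idea with a local stand-in for random sampling; the real `RandomParameterSampling`/`HyperDriveConfig` calls, which need a workspace, are commented out, and the parameter and metric names are assumptions:

```python
import random

def random_sample(space, rng):
    """Local stand-in for RandomParameterSampling: draw one trial config."""
    return {name: draw(rng) for name, draw in space.items()}

space = {
    "--learning_rate": lambda rng: rng.uniform(1e-6, 1e-4),
    "--batch_size": lambda rng: rng.choice([16, 32]),
}
trial = random_sample(space, random.Random(0))

# The real thing with the SDK looks roughly like this (names are assumptions):
#
#   from azureml.train.hyperdrive import (
#       HyperDriveConfig, RandomParameterSampling, PrimaryMetricGoal,
#       choice, uniform,
#   )
#   sampling = RandomParameterSampling({
#       "--learning_rate": uniform(1e-6, 1e-4),
#       "--batch_size": choice(16, 32),
#   })
#   hd_config = HyperDriveConfig(
#       run_config=script_run_config,         # the plain GLUE training job
#       hyperparameter_sampling=sampling,
#       primary_metric_name="eval_loss",      # assumption
#       primary_metric_goal=PrimaryMetricGoal.MINIMIZE,
#       max_total_runs=8,
#   )
#   run = Experiment(ws, "glue-hyperdrive").submit(hd_config)
```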
## Deploy
| path | status | description |
| --- | --- | --- |
| workflows/deploy/pytorch/mnist/job.py | | deploy pytorch CNN model trained on mnist data to AKS |
| workflows/deploy/scikit-learn/diabetes/job.py | | deploy sklearn ridge model trained on diabetes data to AKS |
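The AKS deployments wrap the trained model in a scoring script exposing the two entry points the Azure ML inference server expects, `init()` and `run()`. A minimal sketch: the ridge model is replaced by a fixed stand-in so the script shape is readable without a workspace, and the deployment calls, which need one, are commented out:

```python
import json

model = None

def init():
    # real script (assumption): model = joblib.load(Model.get_model_path("sklearn-diabetes"))
    global model
    model = lambda x: 2.0 * x  # hypothetical stand-in for the trained ridge model

def run(raw_data):
    # the inference server passes the request body as a JSON string
    data = json.loads(raw_data)["data"]
    return [model(x) for x in data]

# Deploying it (requires a workspace, a registered model, and an AKS target;
# all names here are assumptions):
#
#   from azureml.core.model import Model, InferenceConfig
#   from azureml.core.webservice import AksWebservice
#   inference_config = InferenceConfig(entry_script="score.py", environment=env)
#   deployment_config = AksWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
#   service = Model.deploy(ws, "diabetes-svc", [registered_model],
#                          inference_config, deployment_config,
#                          deployment_target=aks_target)
#   service.wait_for_deployment(show_output=True)
```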
We welcome contributions and suggestions! Please see the contributing guidelines for details.
This project has adopted the Microsoft Open Source Code of Conduct. Please see the code of conduct for details.