Lightning-Hydra-Template with the CIFAR Dataset, Optuna, and TensorBoard

Tags: python, pytorch, lightning, hydra

Experiment Results

Training was stopped partway through because it ran on CPU (with fewer epochs), and the Colab instance failed.

Best hyperparameters from the experiment

name: optuna
best_params:
   model.optimizer._target_: torch.optim.Adam
   model.optimizer.lr: 0.080057
   datamodule.batch_size: 8

DVC Steps

dvc add data
dvc add logs
dvc push -r gdrive
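
The push above assumes a Google Drive remote named gdrive has already been configured. A minimal sketch of the corresponding .dvc/config (the folder ID is a placeholder; the actual remote is presumably the Drive folder linked below):

```ini
# .dvc/config -- sketch, assuming a Google Drive remote named "gdrive"
[core]
    remote = gdrive
['remote "gdrive"']
    url = gdrive://<folder-id>
```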

Link: https://drive.google.com/drive/folders/1xV0qTuvvboYnyRspw5aeMos06WOKs5Mg?usp=sharing

Hyperparameter Search with Optuna

  • Set hyperparameters for experiment tracking
  • Find the best batch_size, learning rate, and optimizer
  • The optimizer must be one of Adam, SGD, or RMSprop
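
The search above can be expressed as a Hydra Optuna sweeper config in the Lightning-Hydra-Template style. This is a sketch: the filename, metric name, trial count, and the concrete ranges/choices are assumptions, while the three swept keys mirror the best_params listed earlier:

```yaml
# configs/hparams_search/cifar_optuna.yaml -- hypothetical filename
defaults:
  - override /hydra/sweeper: optuna

# metric to optimize (assumed name; must match a logged metric)
optimized_metric: "val/acc_best"

hydra:
  sweeper:
    direction: maximize
    n_trials: 20  # assumed trial budget
    params:
      # constrain the optimizer to Adam, SGD, or RMSprop
      model.optimizer._target_: choice(torch.optim.Adam, torch.optim.SGD, torch.optim.RMSprop)
      model.optimizer.lr: interval(0.0001, 0.1)
      datamodule.batch_size: choice(8, 16, 32, 64)
```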

Tensorboard Results

https://tensorboard.dev/experiment/HZTVAzfkSDC8EfVfAW99NQ/
