
BO-early-stopping

This is the code repository to reproduce the figures and tables in the paper Automatic Termination for Hyperparameter Optimization.

Install

We recommend using Python 3 and creating a virtual environment first. Then run the following to install all dependencies:

./install.sh
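For example, assuming a POSIX shell with python3 on the PATH (these commands are an illustration; any virtual environment tooling works):

python3 -m venv .venv
source .venv/bin/activate
./install.sh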

Generating artifacts

To generate the figures and tables in the paper, open the notebook:

jupyter notebook table_plot.ipynb
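If you prefer a non-interactive run (assuming the notebook needs no manual input), nbconvert can execute it in place:

jupyter nbconvert --to notebook --execute --inplace table_plot.ipynb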

To generate the raw results as CSV files:

PYTHONPATH="." python bin/gen_csv.py

The included termination results are:

  • result_identity_cv_top0.5.csv: for the XGB and RF algorithms with cross-validation on 19 tabular datasets.
  • bore_results_xgboost_identity_top0.5_hpobench.csv: for the BORE optimizer on HPO-Bench.
  • rs_results_identity_top0.5_hpobench.csv: for Random Search on HPO-Bench.
  • tpe_results_identity_top0.5_hpobench.csv: for the TPE optimizer on HPO-Bench.
  • bore_results_xgboost_identity_top0.5_nasbench201.csv: for the BORE optimizer on NAS-Bench-201.
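For a quick look at any of these files, a standard pandas load is enough; this sketch only assumes the file names listed above (adjust the path if the file lives elsewhere in the repo):

# Inspect one of the shipped termination-result files.
import pandas as pd

df = pd.read_csv("result_identity_cv_top0.5.csv")
print(df.shape)
print(df.columns.tolist())  # fields exposed by the termination results
print(df.head())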

Description

The bin folder contains the scripts that compute the regret-bound estimates:

  • gen_csv.py for termination results with the different automatic termination methods.
  • start_gap_estimation.py for using BO to tune the XGB and RF algorithms on 19 tabular datasets with cross-validation.
  • start_gap_estimation_baselines.py for using Random Search, TPE, BO, and BORE on HPO-Bench and NAS-Bench-201.

The src folder contains:

  • The GP used to estimate the upper bound on the regret (enhanced_gp.py).
  • The stopping rules used in the paper (stop_methods.py); a minimal sketch of the regret-bound rule follows this list.
  • Other utility code (constant.py and util.py).
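To make the regret-bound idea concrete, here is a minimal, self-contained sketch. It is not the repo's enhanced_gp.py or stop_methods.py; the Matern kernel, the beta scaling, and the threshold value are illustrative assumptions. With high probability the objective lies inside the GP confidence band, so the gap between the incumbent's upper confidence bound and the smallest lower confidence bound over the candidate set upper-bounds the simple regret; the paper stops once this bound drops below a noise-based threshold (e.g. estimated from the variance of the cross-validation fold scores).

# Hedged sketch of a regret-bound stopping rule for minimization.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def regret_upper_bound(X_obs, y_obs, X_cand, beta=2.0):
    """Upper-bound the simple regret using GP confidence bounds."""
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_obs, y_obs)
    mu_obs, sd_obs = gp.predict(X_obs, return_std=True)
    mu_cand, sd_cand = gp.predict(X_cand, return_std=True)
    best_ucb = np.min(mu_obs + beta * sd_obs)    # pessimistic value of the incumbent
    min_lcb = np.min(mu_cand - beta * sd_cand)   # optimistic value of the optimum
    return best_ucb - min_lcb

# Toy usage on a synthetic noisy quadratic.
rng = np.random.default_rng(0)
X_obs = rng.uniform(size=(20, 2))
y_obs = np.sum((X_obs - 0.5) ** 2, axis=1) + rng.normal(scale=0.01, size=20)
X_cand = rng.uniform(size=(500, 2))
threshold = 0.05  # illustrative; the paper ties this to the estimated CV noise
if regret_upper_bound(X_obs, y_obs, X_cand) < threshold:
    print("stop: estimated regret bound below the noise threshold")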

The regrets folder contains the estimated upper bounds on the regret, which are used by our termination method:

  • regrets/regrets_identity_cv_top0.5 for using BO to tune XGB and RF algorithms on 19 tabular datasets with cross validation.
  • regrets/regrets_identity_rs_top0.5 and regrets/regrets_identity_tpe_top0.5 for using Random Search and TPE on HPO-Bench.
  • regrets/regrets_identity_bore_xgboost_top0.5 for using BORE (with XGBoost) on HPO-Bench.
  • regrets/regrets_identity_bore_xgboost_top0.5_nasbench for using BORE (with XGBoost) on NAS-Bench-201.

The bo_runs folder contains:

  • The BO tuning results for two algorithms (RF and XGB) on 19 small datasets with 10 replicates (bo_runs_cv).
  • The tuning results of four HPO optimizers (RS, TPE, BO, and BORE) on two benchmarks (HPO-Bench and NAS-Bench-201).

Citing

@inproceedings{makarova2022automatic,
  title={Automatic Termination for Hyperparameter Optimization},
  author={Makarova, Anastasia and Shen, Huibin and Perrone, Valerio and Klein, Aaron and Faddoul, Jean Baptiste and Krause, Andreas and Seeger, Matthias and Archambeau, Cedric},
  booktitle={First Conference on Automated Machine Learning (Main Track)},
  year={2022}
}

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.