Automated Deep Learning (AutoDL-Projects) is an open-source, lightweight, yet useful project for researchers. It implements several neural architecture search (NAS) and hyper-parameter optimization (HPO) algorithms.
## Who should consider using AutoDL-Projects
- Beginners who want to try different AutoDL algorithms
- Engineers who want to try AutoDL to investigate whether it works on their projects
- Researchers who want to easily implement and experiment with new AutoDL algorithms
## Why should we use AutoDL-Projects
- Simple library dependencies
- All algorithms are in the same codebase
- Active maintenance
At the moment, this project provides the following algorithms and scripts to run them. Please see the details via the links in the Description column.
| Type | ABBRV | Algorithms | Description |
|------|-------|------------|-------------|
| NAS | TAS | Network Pruning via Transformable Architecture Search | NIPS-2019-TAS.md |
| NAS | DARTS | DARTS: Differentiable Architecture Search | ICLR-2019-DARTS.md |
| NAS | GDAS | Searching for A Robust Neural Architecture in Four GPU Hours | CVPR-2019-GDAS.md |
| NAS | SETN | One-Shot Neural Architecture Search via Self-Evaluated Template Network | ICCV-2019-SETN.md |
| NAS | NAS-Bench-201 | NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search | NAS-Bench-201.md |
| NAS | ... | ENAS / REA / REINFORCE / BOHB | NAS-Bench-201.md |
| HPO | HPO-CG | Hyperparameter optimization with approximate gradient | coming soon |
| Basic | ResNet | Deep Learning-based Image Classification | BASELINE.md |
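As an example of what these benchmarks expose, NAS-Bench-201 can be queried programmatically. Below is a minimal sketch, assuming the `nas_201_api` package is installed and the benchmark file has been downloaded (the file name is an example):

```python
# Minimal sketch of querying NAS-Bench-201; assumes `pip install nas-bench-201`
# and a downloaded benchmark file (the file name below is an example).
from nas_201_api import NASBench201API as API

api = API('NAS-Bench-201-v1_0-e61699.pth')
print(len(api))  # number of architectures in the search space (15625)
# Query the stored results of the architecture at index 123.
info = api.query_meta_info_by_index(123)
```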
At first, this repo was `GDAS`, which was used to reproduce the results in Searching for A Robust Neural Architecture in Four GPU Hours. After that, more functions and more NAS algorithms were continually added. Once it supported more than five algorithms, it was upgraded from `GDAS` to `NAS-Project`. Now, since both HPO and NAS are supported in this repo, it has been upgraded from `NAS-Project` to `AutoDL-Projects`.
Please install `Python>=3.6` and `PyTorch>=1.3.0`. (You could also run this project with lower versions of Python and PyTorch, but there may be bugs.) Some visualization code may require `opencv`.
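A quick way to confirm that the environment meets these requirements (a minimal sketch; the thresholds mirror the text above):

```python
# Minimal environment check; the thresholds mirror the requirements above.
import sys
import torch

assert sys.version_info >= (3, 6), "this project expects Python>=3.6"
print(torch.__version__)  # expect >= 1.3.0
```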
CIFAR and ImageNet should be downloaded and extracted into `$TORCH_HOME`.
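For example, CIFAR-10 can be fetched into `$TORCH_HOME` with `torchvision` (a sketch; the exact sub-directory layout the training scripts expect is an assumption, so check the dataset documentation):

```python
# Hedged sketch: download CIFAR-10 under $TORCH_HOME via torchvision.
# The sub-directory layout expected by the training scripts is an assumption.
import os
import torchvision

root = os.environ.get("TORCH_HOME", os.path.expanduser("~/.torch"))
torchvision.datasets.CIFAR10(root=root, download=True)
```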
Some methods use knowledge distillation (KD), which requires pre-trained models. Please download these models from Google Drive (or train them yourself) and save them into `.latent-data`.
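Once downloaded, a teacher checkpoint can be loaded as usual with PyTorch (a sketch; the file name under `.latent-data` is hypothetical):

```python
# Hedged sketch: load a pre-trained teacher model for knowledge distillation.
# The checkpoint file name is hypothetical; use the actual file from Google Drive.
import torch

checkpoint = torch.load('.latent-data/teacher-checkpoint.pth', map_location='cpu')
```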
If you find that this project helps your research, please consider citing some of the following papers:
@inproceedings{dong2020nasbench201,
title = {NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {International Conference on Learning Representations (ICLR)},
url = {https://openreview.net/forum?id=HJxyZkBKDr},
year = {2020}
}
@inproceedings{dong2019tas,
title = {Network Pruning via Transformable Architecture Search},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Neural Information Processing Systems (NeurIPS)},
year = {2019}
}
@inproceedings{dong2019one,
title = {One-Shot Neural Architecture Search via Self-Evaluated Template Network},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
pages = {3681--3690},
year = {2019}
}
@inproceedings{dong2019search,
title = {Searching for A Robust Neural Architecture in Four GPU Hours},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
pages = {1761--1770},
year = {2019}
}
- Awesome-NAS: A curated list of neural architecture search and related resources.
- AutoML Freiburg-Hannover: A website maintained by Frank Hutter's team, containing many AutoML resources.
The entire codebase is under the MIT license.