AG-Net

Code for "Learning Where To Look – Generative NAS is Surprisingly Efficient"


Learning Where To Look - Generative NAS is Surprisingly Efficient [PDF]

Jovita Lukasik, Steffen Jung, Margret Keuper

Generative Model using Latent Space Optimization

  • Sample-Efficient: We propose a simple model that learns to focus on promising regions of the architecture space. It can thus learn to generate high-scoring architectures from only a few queries.
  • Novel generative design: We learn architecture representation spaces via a novel generative design that is able to generate architectures stochastically while being trained with a simple reconstruction loss.
  • SOTA: Our model allows sample-efficient search and achieves state-of-the-art results on several NAS benchmarks as well as on ImageNet. It allows joint optimization w.r.t. hardware properties in a straightforward way.

Installation

Clone this repo and install requirements:

pip install -r requirements.txt

Also needed:

Usage

Preliminary

Define the directory path in Settings.py.

Generation

bash scripts/Train_G_NB101.sh
bash scripts/Train_G_NB201.sh
bash scripts/Train_G_NBNLP.sh
bash scripts/Train_G_NB301.sh

To train the generator model in the NAS-Bench-301 search space, first run datasets/NASBench301/create_random_data.py to generate 500k random data points. The pretrained generation model state dicts are in state_dicts/.
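As a sketch, the NAS-Bench-301 workflow described above can be run as follows (assuming the scripts take no additional arguments; check each script for configurable options):

```shell
# Step 1: generate the 500k random architectures used for training
python datasets/NASBench301/create_random_data.py

# Step 2: train the generator on the NAS-Bench-301 search space
bash scripts/Train_G_NB301.sh
```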

Search using AG-Net on CIFAR

bash scripts/Search_NB101.sh 
bash scripts/Search_NB201.sh 
bash scripts/Search_NB301.sh 
bash scripts/Search_NBNLP.sh 
bash scripts/Search_HW.sh 

Search on ImageNet

Follow TENAS for the initial steps and architecture evaluations:

bash scripts/Search_TENAS.sh

Search using XGB

bash scripts/Search_NB101_XGB_XGBranking.sh

Citation

@article{lukasik2022,
  author    = {Jovita Lukasik and
               Steffen Jung and
               Margret Keuper},
  title     = {Learning Where To Look - Generative {NAS} is Surprisingly Efficient},
  journal   = {CoRR},
  volume    = {abs/2203.08734},
  year      = {2022},
}

Acknowledgement

Code base from