This is an implementation of our paper on Loss-Agnostic and Model-Agnostic Meta Neural Architecture Search (LAMANAS), which uses a self-supervised loss parameterized by a neural network. Our base implementation is derived from Meta-Learning of Neural Architectures for Few-Shot Learning, located here.
Run
conda env create -f environment.yml
to create a new conda environment named metanas
with all required packages, then activate it.
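The setup steps above can be sketched as the following shell commands (the environment name metanas comes from environment.yml; adjust if your file names it differently):

```shell
# Create the conda environment from the provided specification file.
conda env create -f environment.yml

# Activate the newly created environment (named "metanas").
conda activate metanas
```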
Download the datasets you want to use (Omniglot or miniImagenet). Alternatively, set download=True
for the data loaders in torchmeta_loader.py
to use the automatic data download provided by Torchmeta.
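As a hedged sketch of what enabling the Torchmeta download looks like (the exact loader construction in torchmeta_loader.py may differ; the helper name and arguments below follow Torchmeta's public API, and the data folder path is an assumption):

```python
# Sketch only: construct a Torchmeta few-shot dataset with automatic download.
# "data" is an assumed local folder; ways/shots are example values.
from torchmeta.datasets.helpers import miniimagenet

dataset = miniimagenet(
    "data",            # root folder where the dataset will be stored
    ways=5,            # number of classes per task
    shots=1,           # number of support examples per class
    meta_train=True,   # use the meta-training split
    download=True,     # fetch the data automatically if not present
)
```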
Please refer to the scripts
folder for examples of how to use this code. For example, for experiments on miniImagenet:
- Running meta training for MetaNAS:
run_in_meta_train.sh
- Running meta testing for a checkpoint from the above meta training experiment:
run_in_meta_testing.sh
- Scaling up an optimized architecture from the above meta-training experiment and retraining it:
run_in_upscaled.sh
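A typical end-to-end workflow with the scripts listed above might look like this (this assumes the scripts live in the scripts folder and that any dataset paths inside them have been adjusted to your setup):

```shell
# Meta-train MetaNAS on miniImagenet.
bash scripts/run_in_meta_train.sh

# Meta-test a checkpoint produced by the meta-training run.
bash scripts/run_in_meta_testing.sh

# Scale up the optimized architecture and retrain it.
bash scripts/run_in_upscaled.sh
```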