
LAMANAS: Loss Agnostic and Model Agnostic Meta Neural Architecture Search for Few-Shot Learning

This is an implementation of our paper on Loss Agnostic and Model Agnostic Meta Neural Architecture Search (LAMANAS), which uses a self-supervised loss parameterized by a neural network. Our base implementation is derived from Meta-Learning of Neural Architectures for Few-Shot Learning, located here.
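As a rough illustration of what "a loss parameterized by a neural network" can look like, here is a minimal sketch. It is not the module used in this repository; the `LearnedLoss` name, its MLP architecture, and its inputs are assumptions made for the example.

```python
import torch
import torch.nn as nn


class LearnedLoss(nn.Module):
    """Hypothetical sketch of a loss parameterized by a small MLP.

    It maps per-sample model outputs to a scalar loss, and its weights
    can be meta-learned. This is NOT the module used in this repository.
    """

    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        # One scalar per sample, averaged over the batch.
        per_sample = self.net(logits).squeeze(-1)
        return per_sample.mean()


# Usage sketch: the learned loss stands in for a hand-crafted criterion.
loss_fn = LearnedLoss(in_dim=5)   # e.g., 5-way classification logits
logits = torch.randn(16, 5)       # dummy batch of model outputs
loss = loss_fn(logits)            # scalar, differentiable w.r.t. the loss parameters
loss.backward()
```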

Requirements and Setup

Install required packages.

Run

conda env create -f environment.yml

to create a new conda environment named metanas with all required packages, then activate it.

Download the data

Download the data sets you want to use (Omniglot or miniImagenet). You can also set download=True for the data loaders in torchmeta_loader.py to use the data download provided by Torchmeta.
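For reference, using Torchmeta's built-in download typically looks like the sketch below. This is a standalone example of the Torchmeta helper API, not the repository's own loader code in torchmeta_loader.py; the data root path and the 5-way/1-shot configuration are assumptions.

```python
from torchmeta.datasets.helpers import miniimagenet
from torchmeta.utils.data import BatchMetaDataLoader

# Hypothetical standalone example of Torchmeta's data download.
dataset = miniimagenet(
    "path/to/data",   # assumed data root; adjust to your setup
    ways=5,           # 5-way classification tasks
    shots=1,          # 1 support example per class
    test_shots=15,    # 15 query examples per class
    meta_train=True,
    download=True,    # let Torchmeta fetch the data set
)
dataloader = BatchMetaDataLoader(dataset, batch_size=4, num_workers=2)

for batch in dataloader:
    support_inputs, support_targets = batch["train"]
    query_inputs, query_targets = batch["test"]
    break  # just demonstrate the batch structure
```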

How to Use

Please refer to the scripts folder for examples of how to use this code, e.g., for experiments on miniImagenet: