
Awesome Neuromodulated Meta-Learning


Official code for "Exploring Flexible Structure in Meta-Learning" PDF

🥇🌈This repository contains not only the code of our NeuronML but also several self-made application cases in neuroscience.

Note: the code is the PyTorch version of MAML (the original TensorFlow version is maml)

Create Environment

For easier use and to avoid conflicts with your existing Python setup, we recommend working in a virtual environment via virtualenv. Now, let's start:

Step 1: Install virtualenv

pip install --upgrade virtualenv

Step 2: Create a virtual environment and activate it:

virtualenv venv
source venv/bin/activate

Step 3: Install the requirements from requirements.txt:

pip install -r requirements.txt

Data Availability

All datasets used in this work are open source. Download and deployment instructions are as follows:

  • miniImageNet, Omniglot, and tieredImageNet will be downloaded automatically upon running the scripts, with the help of pytorch-meta.

  • For 'meta-dataset', follow these steps: download ILSVRC2012 (by creating an account here and downloading ILSVRC2012.tar) and Cu_birds2012 (from http://www.vision.caltech.edu/visipedia-data/CUB-200-2011/CUB_200_2011.tgz) separately. Then run sbatch scripts/download_meta_dataset/install_meta_dataset_parallel.sh to download and prune all datasets in parallel. All ten datasets should be placed in a single directory.

  • For the few-shot regression setting, the Sinusoid, Sinusoid & Line, and Harmonic datasets are toy examples and require no downloads; tasks are generated on the fly following the implementation in the paper.

  • For the reinforcement learning setting, we use the Khazad Dum and MuJoCo environments.
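For reference, sampling a sinusoid regression task can be sketched in a few lines of plain Python. The amplitude and phase ranges below follow the common MAML sinusoid benchmark and are assumptions, not necessarily the exact values used in our paper:

```python
import math
import random

def sample_sinusoid_task(num_points=10, amp_range=(0.1, 5.0),
                         phase_range=(0.0, math.pi)):
    """Sample one few-shot regression task: y = A * sin(x - phase).

    The amplitude/phase ranges follow the standard MAML sinusoid
    benchmark and are an assumption; adjust to match the paper.
    """
    amplitude = random.uniform(*amp_range)
    phase = random.uniform(*phase_range)
    # Inputs are drawn uniformly from [-5, 5], as in the MAML setup.
    xs = [random.uniform(-5.0, 5.0) for _ in range(num_points)]
    ys = [amplitude * math.sin(x - phase) for x in xs]
    return xs, ys, (amplitude, phase)

# Example: draw a 10-point support set for one task.
xs, ys, (amp, ph) = sample_sinusoid_task()
```

Each call yields a new task (a new amplitude/phase pair), so support and query sets for meta-training can be drawn by calling the sampler twice with the same task parameters.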

Now that the setup is complete, you can train and test as you like :)

Train

We offer two ways to run our code (taking MAML with meta-dataset as an example); these will be provided in a few days, once the arXiv version is open-sourced.

Citation

If you find our work and code useful, please consider citing our paper and starring our repository (🥰🎉Thanks!!!):

@misc{wang2024neuromodulatedmetalearning,
      title={Neuromodulated Meta-Learning}, 
      author={Jingyao Wang and Huijie Guo and Wenwen Qiang and Jiangmeng Li and Changwen Zheng and Hui Xiong and Gang Hua},
      year={2024},
      eprint={2411.06746},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2411.06746}, 
}