
AMAES: Masked Autoencoder Pretraining on 3D Brain MRI

Official PyTorch implementation of AMAES from the paper

AMAES: Augmented Masked Autoencoder Pretraining on Public Brain MRI Data for 3D-Native Segmentation
ADSMI @ MICCAI 2024
Asbjørn Munk*, Jakob Ambsdorf*, Sebastian Llambias, Mads Nielsen

Pioneer Centre for AI & University of Copenhagen

* Equal Contribution

Efficient pretraining for 3D segmentation models using MAE and augmentation reversal on a large domain-specific dataset.

For more information on the paper, see amaes.asbn.dk.

*(Figure: overview of results)*

*(Figure: method overview)*
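As a rough illustration of the idea behind AMAES — the model receives an augmented, masked volume but is trained to reconstruct the original volume, so it must implicitly reverse the augmentation — here is a minimal NumPy sketch. The flip augmentation, masking scheme, and loss here are illustrative only, not the paper's exact implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
original = rng.normal(size=(16, 16, 16)).astype(np.float32)

# Augment the input (illustrative: a flip along the first axis).
augmented = original[::-1].copy()

# Mask the augmented view (illustrative: zero out the first half).
masked_input = augmented.copy()
masked_input[:8] = 0.0

def reconstruction_loss(prediction, target):
    """MSE between the model output and the ORIGINAL (un-augmented) volume."""
    return float(np.mean((prediction - target) ** 2))

# A model that merely copies its input scores worse than one that also
# undoes the flip, because the target is the un-augmented volume.
copy_loss = reconstruction_loss(augmented, original)
ideal_loss = reconstruction_loss(original, original)
```

The key point is that the reconstruction target is the pre-augmentation image, which is what makes the objective "augmentation reversal" rather than plain masked reconstruction.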

🧠BRAINS-45K dataset

All models are pretrained on 🧠BRAINS-45K, the largest pretraining dataset available for brain MRI.

All code necessary to reproduce the dataset will be made available as soon as possible.

Model checkpoints

All checkpoints have been pretrained on 🧠BRAINS-45K for 100 epochs using AMAES.

Checkpoints are hosted on Zenodo, 🤗 Hugging Face, and Kaggle.

| Model | Parameters (M) | Checkpoint |
| --- | --- | --- |
| U-Net XL | 90 | Download |
| U-Net B | 22 | Download |
| MedNeXt-L | 55 | Download |
| MedNeXt-M | 21 | Download |

All models were pretrained on 2xH100 GPUs with 80GB of memory.

Running the code

  1. Install Poetry.
  2. Create the environment by running `poetry install`.

Setup data

AMAES uses the Yucca library for handling 3D medical data.

A guide on how to set up the data is coming soon.

Pretraining

To pretrain using AMAES, run:

```shell
poetry run python src/pretrain.py --base_path=<path to base data directory>
```
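The masking step at the heart of MAE pretraining can be sketched as follows. This is a hedged NumPy illustration of random, non-overlapping 3D patch masking; the patch size and mask ratio are made up for the example and are not AMAES's actual hyperparameters:

```python
import numpy as np

def mask_volume(volume, patch=8, ratio=0.6, rng=None):
    """Zero out a random subset of non-overlapping 3D patches.

    Returns the masked volume and a boolean grid over patches
    (True = patch hidden from the encoder, to be reconstructed).
    """
    rng = rng or np.random.default_rng(0)
    d, h, w = (s // patch for s in volume.shape)
    n_patches = d * h * w
    n_masked = int(round(ratio * n_patches))
    flat = np.zeros(n_patches, dtype=bool)
    flat[rng.choice(n_patches, size=n_masked, replace=False)] = True
    grid = flat.reshape(d, h, w)

    masked = volume.copy()
    for i, j, k in zip(*np.nonzero(grid)):
        masked[i * patch:(i + 1) * patch,
               j * patch:(j + 1) * patch,
               k * patch:(k + 1) * patch] = 0.0
    return masked, grid

volume = np.random.default_rng(1).normal(size=(32, 32, 32)).astype(np.float32)
masked, grid = mask_volume(volume)
```

The boolean grid is what the loss would be restricted to: reconstruction error is typically computed only over masked patches.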

Finetuning

To finetune using AMAES, run:

```shell
poetry run python src/train.py --base_path=<path to base data directory> \
    --pretrained_weights_path="<path_to_checkpoint>" --model=<model_to_instantiate>
```

Note that the checkpoint must match the model provided. For instance, to finetune `unet_xl_lw_dec_fullaug.pth`, use `--model=unet_xl`.
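Loading pretrained weights into a matching architecture typically looks like the sketch below. This is a common PyTorch convention, not necessarily the repository's exact loading code; the `state_dict` nesting and the use of `strict=False` are assumptions:

```python
import torch
import torch.nn as nn

def load_pretrained(model: nn.Module, checkpoint_path: str) -> nn.Module:
    """Load pretraining weights, tolerating keys (e.g. an MAE decoder)
    that are absent from the finetuning model."""
    state = torch.load(checkpoint_path, map_location="cpu")
    # Some checkpoints nest the weights under a "state_dict" key.
    if isinstance(state, dict) and "state_dict" in state:
        state = state["state_dict"]
    missing, unexpected = model.load_state_dict(state, strict=False)
    print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
    return model
```

With `strict=False`, parameters whose names and shapes match are restored while pretraining-only parameters are skipped; if every key comes back as missing, the checkpoint was saved for a different architecture (e.g. a `unet_xl` checkpoint loaded into a `unet_b` model).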

Citation

Please cite:

```bibtex
@article{munk2024amaes,
  title={AMAES: Augmented Masked Autoencoder Pretraining on Public Brain MRI Data for 3D-Native Segmentation},
  author={Munk, Asbjørn and Ambsdorf, Jakob and Llambias, Sebastian and Nielsen, Mads},
  journal={arXiv preprint arXiv:2408.00640},
  year={2024}
}
```