Dual Meta-Learning with Longitudinally Generalized Regularization for One-Shot Brain Tissue Segmentation Across the Human Lifespan
by Yongheng Sun, Fan Wang, Jun Shu, Haifeng Wang, Li Wang, Deyu Meng, Chunfeng Lian.
This repository is for our ICCV 2023 paper 'Dual Meta-Learning with Longitudinally Generalized Regularization for One-Shot Brain Tissue Segmentation Across the Human Lifespan'.
1.2 Process your data as required by nnUNetv1 using the command below:

nnUNet_plan_and_preprocess -t XXX --verify_dataset_integrity
1.3 Run the nnUNetv1 baseline on your data, and save the hyperparameters of your Generic_UNet, DataLoader3D, and get_moreDA_augmentation.
nnUNet_train 3d_fullres nnUNetTrainerV2 TASK_NAME_OR_ID FOLD (additional options)
1.4 Replace your nnUNet folder with this repository (we rewrote the files in nnunet/network_architecture and nnunet/training/network_training).
1.5 Import Generic_UNet, DataLoader3D, and get_moreDA_augmentation with the hyperparameters saved in step 1.3, as done in main.py.
python main.py
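One simple way to carry the step-1.3 hyperparameters over to main.py is to pickle them after the baseline run and reload them before rebuilding the network and data loaders. The sketch below is illustrative only: the key names and values (patch_size, base_num_features, and so on) are hypothetical examples, not the repository's actual configuration.

```python
import pickle

# Hypothetical hyperparameters recorded from the baseline run (step 1.3);
# the exact keys and values depend on your nnUNetv1 plans and trainer.
baseline_hparams = {
    "patch_size": (128, 128, 128),          # used by DataLoader3D
    "base_num_features": 32,                # used by Generic_UNet
    "pool_op_kernel_sizes": [[2, 2, 2]] * 5,
    "conv_kernel_sizes": [[3, 3, 3]] * 6,
}

# Save them once after the baseline run ...
with open("baseline_hparams.pkl", "wb") as f:
    pickle.dump(baseline_hparams, f)

# ... and restore them in main.py (step 1.5) before constructing
# Generic_UNet, DataLoader3D, and get_moreDA_augmentation.
with open("baseline_hparams.pkl", "rb") as f:
    restored = pickle.load(f)

assert restored == baseline_hparams  # round-trip preserves the values
```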
You can download the pretrained model from https://drive.google.com/file/d/1t6nCM376LBVHXjktr52k8KeDwuZZTLy2/view?usp=drive_link.
Process your data as required by nnUNetv1 using the command below:
nnUNet_plan_and_preprocess -t XXX --verify_dataset_integrity
Then edit nnUNetPlansv2.1_plans_3D.pkl to change the input patch size:
python change_plans.py
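In essence, this step edits the pickled plans file produced by nnUNetv1 preprocessing. The sketch below shows the general idea, assuming the standard nnUNetv1 plans layout (`plans_per_stage` entries each holding a `patch_size`); it is a minimal stand-in, not the repository's actual change_plans.py.

```python
import pickle

def set_patch_size(plans_path, new_patch_size):
    """Load an nnUNetv1 plans file, overwrite the patch size of every
    stage, and write the file back in place."""
    with open(plans_path, "rb") as f:
        plans = pickle.load(f)
    for stage in plans["plans_per_stage"].values():
        stage["patch_size"] = list(new_patch_size)
    with open(plans_path, "wb") as f:
        pickle.dump(plans, f)

# Demo on a synthetic plans file mimicking the nnUNetv1 layout:
demo = {"plans_per_stage": {0: {"patch_size": [96, 96, 96]}}}
with open("demo_plans.pkl", "wb") as f:
    pickle.dump(demo, f)

set_patch_size("demo_plans.pkl", (128, 128, 128))

with open("demo_plans.pkl", "rb") as f:
    assert pickle.load(f)["plans_per_stage"][0]["patch_size"] == [128, 128, 128]
```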
Fine-tune your model using the command below:
nnUNet_train 3d_fullres nnUNetTrainerV2_FT TASK_NAME_OR_ID FOLD (additional options)
The code is based on nnUNetv1 (https://github.com/MIC-DKFZ/nnUNet/tree/nnunetv1).
If you find this project useful for your research, please consider citing:
@inproceedings{sun2023dual,
title={Dual Meta-Learning with Longitudinally Consistent Regularization for One-Shot Brain Tissue Segmentation Across the Human Lifespan},
author={Sun, Yongheng and Wang, Fan and Shu, Jun and Wang, Haifeng and Wang, Li and Meng, Deyu and Lian, Chunfeng},
booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
pages={21118--21128},
year={2023}
}