TransMorph: Transformer for Unsupervised Medical Image Registration (PyTorch)

keywords: Vision Transformer, Swin Transformer, convolutional neural networks, image registration

This is a PyTorch implementation of my paper:

Chen, Junyu, et al. "TransMorph: Transformer for Unsupervised Medical Image Registration." arXiv, 2021.

03/24/2022 - TransMorph is currently ranked 1st on the TEST set of task03 (brain MR) @ MICCAI 2021 L2R challenge (results obtained from the Learn2Reg challenge organizers). The training scripts, dataset, and pre-trained models are available here: TransMorph on OASIS
02/03/2022 - TransMorph is currently ranked 1st on the VALIDATION set of task03 (brain MR) @ MICCAI 2021 L2R challenge.
12/29/2021 - Our preprocessed IXI dataset and the pre-trained models are now publicly available! Check out this page for more information: TransMorph on IXI

TransMorph DIR Variants:

There are four TransMorph variants: TransMorph, TransMorph-diff, TransMorph-bspl, and TransMorph-Bayes.
Training and inference scripts are in TransMorph/, and the models are contained in TransMorph/model/; a minimal usage sketch follows the list below.

  1. TransMorph: A hybrid Transformer-ConvNet network for image registration.
  2. TransMorph-diff: A probabilistic TransMorph that ensures a diffeomorphism.
  3. TransMorph-bspl: A B-spline TransMorph that ensures a diffeomorphism.
  4. TransMorph-Bayes: A Bayesian TransMorph that produces registration uncertainty estimates.
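
As a rough illustration of how a deformable TransMorph variant is used at inference time, the sketch below builds a model, feeds it a concatenated moving/fixed pair, and reads back the warped image and the displacement field. The import path, config key, input shape, and forward signature are assumptions for illustration; check TransMorph/model/ and the inference scripts for the actual interfaces.

```python
import torch

# Assumed import path and config key; the actual module names live in TransMorph/model/.
from models.TransMorph import CONFIGS, TransMorph

config = CONFIGS['TransMorph']
model = TransMorph(config).eval()

# Single-channel 3D volumes of shape (B, 1, D, H, W); the spatial size must
# match the image size the model was configured and trained with.
moving = torch.rand(1, 1, 160, 192, 224)
fixed = torch.rand(1, 1, 160, 192, 224)

with torch.no_grad():
    # The network takes the channel-wise concatenated pair and predicts the
    # warped moving image together with the dense displacement (flow) field.
    warped, flow = model(torch.cat((moving, fixed), dim=1))
```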

TransMorph Affine Model:

The scripts for the TransMorph affine model are in the TransMorph_affine/ folder.

04/27/2022 - We provide a Jupyter notebook for training and testing TransMorph-affine here. The pre-trained weights can be downloaded here, along with two sample data files here.

train_xxx.py and infer_xxx.py are the training and inference scripts for TransMorph models.

Loss Functions:

TransMorph supports both mono- and multi-modal registration. We provide the following loss functions for measuring image similarity (the links take you directly to the code):

  1. Mean squared error (MSE)
  2. Normalized cross correlation (NCC)
  3. Structural similarity index (SSIM)
  4. Mutual information (MI)
  5. Local mutual information (LMI)
  6. Modality independent neighbourhood descriptor with self-similarity context (MIND-SSC)

and the following deformation regularizers (a minimal composite-loss sketch follows this list):

  1. Diffusion
  2. L1
  3. Anisotropic diffusion
  4. Bending energy
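
Unsupervised registration is typically trained with a composite objective: an image-similarity term between the fixed image and the warped moving image, plus a weighted deformation regularizer on the predicted displacement field. The sketch below is a minimal, self-contained illustration using MSE similarity and a diffusion (squared-gradient) regularizer written from scratch; it is not the repository's losses module, and the regularization weight is only a placeholder.

```python
import torch

def mse_similarity(fixed, warped):
    # Mean squared intensity difference between the fixed and warped moving images.
    return torch.mean((fixed - warped) ** 2)

def diffusion_regularizer(flow):
    # Mean squared forward difference of a 3D displacement field of shape
    # (B, 3, D, H, W) along each spatial axis.
    dz = flow[:, :, 1:, :, :] - flow[:, :, :-1, :, :]
    dy = flow[:, :, :, 1:, :] - flow[:, :, :, :-1, :]
    dx = flow[:, :, :, :, 1:] - flow[:, :, :, :, :-1]
    return (dz.pow(2).mean() + dy.pow(2).mean() + dx.pow(2).mean()) / 3.0

def registration_loss(fixed, warped, flow, reg_weight=0.02):
    # Total objective = similarity(fixed, warped) + lambda * regularizer(flow);
    # reg_weight is a placeholder and is tuned per dataset and loss combination.
    return mse_similarity(fixed, warped) + reg_weight * diffusion_regularizer(flow)
```

In a training step, warped and flow come from the model's forward pass on the moving/fixed pair, and the scalar returned here is back-propagated as usual.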

Baseline Models:

We compared TransMorph with eight baseline registration methods and four Transformer architectures.
The links will take you to their official repositories.

Baseline registration methods:
Training and inference scripts are in Baseline_registration_models/

  1. SyN/ANTsPy (Official Website)
  2. NiftyReg (Official Website)
  3. LDDMM (Official Website)
  4. deedsBCV (Official Website)
  5. VoxelMorph-1 & -2 (Official Website)
  6. CycleMorph (Official Website)
  7. MIDIR (Official Website)

Baseline Transformer architectures:
Training and inference scripts are in Baseline_Transformers/

  1. PVT (Official Website)
  2. nnFormer (Official Website)
  3. CoTr (Official Website)
  4. ViT-V-Net (Official Website)

JHU Brain MRI & Duke CT Dataset:

Due to restrictions, we cannot distribute our brain MRI and CT data. However, several brain MRI datasets are publicly available online: ADNI, OASIS, ABIDE, etc. Note that those datasets may not come with labels (segmentations). To generate labels, you can use FreeSurfer, an open-source software suite for processing and segmenting brain MRI. Here are some useful commands in FreeSurfer: Brain MRI preprocessing and subcortical segmentation using FreeSurfer.
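
As a rough example of what label generation looks like in practice (assuming FreeSurfer is installed and SUBJECTS_DIR is set; the subject ID and file names below are hypothetical), the full recon-all pipeline produces a subcortical segmentation (aseg) that can then be converted to NIfTI with mri_convert:

```python
import os
import subprocess

# Hypothetical subject ID and input T1-weighted scan; adjust to your data.
subject_id = "subj01"
t1_path = "subj01_T1w.nii.gz"

# Run the full FreeSurfer pipeline (intensity normalization, skull stripping,
# subcortical segmentation); this takes several hours per subject.
subprocess.run(["recon-all", "-s", subject_id, "-i", t1_path, "-all"], check=True)

# Convert the resulting subcortical segmentation (aseg) from .mgz to NIfTI.
subjects_dir = os.environ["SUBJECTS_DIR"]  # FreeSurfer subjects directory
subprocess.run(
    ["mri_convert",
     os.path.join(subjects_dir, subject_id, "mri", "aseg.mgz"),
     f"{subject_id}_aseg.nii.gz"],
    check=True,
)
```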

You may find our preprocessed IXI dataset in the next section.

Reproducible Results on IXI Dataset:

You may find the preprocessed IXI dataset, the pre-trained baseline and TransMorph models, and the training and inference scripts for the IXI dataset here 👉 TransMorph on IXI

Citation:

If you find this code useful in your research, please consider citing:

@article{chen2021transmorph,
  title={TransMorph: Transformer for unsupervised medical image registration},
  author={Chen, Junyu and Frey, Eric C and He, Yufan and Segars, William P and Li, Ye and Du, Yong},
  journal={arXiv preprint arXiv:2111.10480},
  year={2021}
}

TransMorph Architecture:

(See the architecture diagram in the repository.)

Example Results:

(See the repository for qualitative comparisons and for uncertainty estimates produced by TransMorph-Bayes.)

Quantitative Results:

(See the repository for quantitative results on inter-patient brain MRI and XCAT-to-CT registration.)

Reference:

Swin Transformer
easyreg
MIDIR
VoxelMorph
