
TransMorph: Transformer for Unsupervised Medical Image Registration


keywords: Vision Transformer, Swin Transformer, convolutional neural networks, image registration

This is a PyTorch implementation of my paper:

Chen, Junyu, et al. "TransMorph: Transformer for unsupervised medical image registration," Medical Image Analysis, p. 102615, 2022.

Here is the errata, which fixes several typos in the paper.

09/03/2022 - The TransMorph paper has been accepted for publication in Medical Image Analysis! Some changes will follow based on the reviewers' comments.
03/24/2022 - TransMorph is currently ranked 1st place on the TEST set of task03 (brain MR) @ MICCAI 2021 L2R challenge (results obtained from the Learn2Reg challenge organizers). The training scripts, dataset, and the pretrained models are available here: TransMorph on OASIS
02/03/2022 - TransMorph is currently ranked 1st place on the VALIDATION set of task03 (brain MR) @ MICCAI 2021 L2R challenge.
12/29/2021 - Our preprocessed IXI dataset and the pre-trained models are now publicly available! Check out this page for more information: TransMorph on IXI

TransMorph DIR Variants:

There are four TransMorph variants: TransMorph, TransMorph-diff, TransMorph-bspl, and TransMorph-Bayes.
Training and inference scripts are in TransMorph/, and the models are contained in TransMorph/model/ (a minimal usage sketch follows the list below).

  1. TransMorph: A hybrid Transformer-ConvNet network for image registration.
  2. TransMorph-diff: A probabilistic TransMorph that ensures a diffeomorphism.
  3. TransMorph-bspl: A B-spline TransMorph that ensures a diffeomorphism.
  4. TransMorph-Bayes: A Bayesian TransMorph that produces registration uncertainty estimates.
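
Below is a minimal sketch of how one of the deformable variants is typically instantiated and called. The import path, config dictionary, and volume size are assumptions rather than the repository's guaranteed interface; check the train_xxx.py scripts for the exact names.

```python
import torch
# Assumed import path; the model package is expected to ship a config dictionary per variant.
from models.TransMorph import CONFIGS, TransMorph

config = CONFIGS['TransMorph']      # default hybrid Transformer-ConvNet configuration
model = TransMorph(config).cuda()

# The deformable variants take the moving and fixed volumes concatenated along
# the channel dimension and return the warped moving image together with the
# predicted displacement field (3 channels, one per spatial axis).
moving = torch.randn(1, 1, 160, 192, 224).cuda()  # illustrative volume size
fixed  = torch.randn(1, 1, 160, 192, 224).cuda()
warped, flow = model(torch.cat((moving, fixed), dim=1))
print(warped.shape, flow.shape)
```

The other variants (TransMorph-diff, TransMorph-bspl, TransMorph-Bayes) follow the same pattern but decode the deformation differently, so their configs and outputs differ slightly; the corresponding train_xxx.py script is the authoritative reference.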

TransMorph Affine Model:

The scripts for the TransMorph affine model are in the TransMorph_affine/ folder.

04/27/2022 - We provide a Jupyter notebook for training and testing TransMorph-affine here. The pre-trained weights can be downloaded here, along with two sample data files here.
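
As a quick illustration of what the affine stage produces, an affine registration network generally predicts a 3x4 transformation matrix that is applied to the moving volume by resampling. The snippet below is a hedged sketch of that resampling step using PyTorch's affine_grid/grid_sample, not the repository's exact interface.

```python
import torch
import torch.nn.functional as F

def apply_affine(moving: torch.Tensor, theta: torch.Tensor) -> torch.Tensor:
    """Warp a (B, C, D, H, W) volume with a (B, 3, 4) affine matrix given in
    PyTorch's normalized coordinate convention."""
    grid = F.affine_grid(theta, moving.size(), align_corners=False)
    return F.grid_sample(moving, grid, mode='bilinear', align_corners=False)

# Sanity check: the identity transform should reproduce the input volume.
theta_id = torch.tensor([[[1., 0., 0., 0.],
                          [0., 1., 0., 0.],
                          [0., 0., 1., 0.]]])
vol = torch.randn(1, 1, 64, 64, 64)
assert torch.allclose(apply_affine(vol, theta_id), vol, atol=1e-4)
```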

train_xxx.py and infer_xxx.py are the training and inference scripts for TransMorph models.

Loss Functions:

TransMorph supports both mono- and multi-modal registration. We provide the following loss functions for measuring image similarity (the links take you directly to the code):

  1. Mean squared error (MSE)
  2. Normalized cross correlation (NCC)
  3. Structural similarity index (SSIM)
  4. Mutual information (MI)
  5. Local mutual information (LMI)
  6. Modality independent neighbourhood descriptor with self-similarity context (MIND-SSC)

and the following deformation regularizers (a minimal sketch of combining a similarity term with a regularizer follows the list):

  1. Diffusion
  2. L1
  3. Anisotropic diffusion
  4. Bending energy
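
For illustration, here is a minimal sketch of how these two ingredients are usually combined into an unsupervised training objective: an image-similarity term (MSE here, standing in for any of the losses listed above) plus a diffusion (squared finite-difference) regularizer on the predicted displacement field. The function names and the regularizer weight are illustrative rather than the repository's exact API.

```python
import torch
import torch.nn.functional as F

def diffusion_regularizer(flow: torch.Tensor) -> torch.Tensor:
    """Diffusion (squared-gradient) penalty for a displacement field of shape (B, 3, D, H, W)."""
    dz = flow[:, :, 1:, :, :] - flow[:, :, :-1, :, :]
    dy = flow[:, :, :, 1:, :] - flow[:, :, :, :-1, :]
    dx = flow[:, :, :, :, 1:] - flow[:, :, :, :, :-1]
    return (dz.pow(2).mean() + dy.pow(2).mean() + dx.pow(2).mean()) / 3.0

def registration_loss(warped, fixed, flow, reg_weight=0.02):
    """Unsupervised objective: image similarity plus a weighted deformation regularizer."""
    similarity = F.mse_loss(warped, fixed)   # swap in NCC, MI, MIND-SSC, etc. as needed
    return similarity + reg_weight * diffusion_regularizer(flow)
```

During training, the loss is evaluated on the model's outputs, e.g. `loss = registration_loss(warped, fixed, flow)` followed by the usual backward pass and optimizer step; the regularizer weight is a hyperparameter that trades off similarity against deformation smoothness.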

Baseline Models:

We compared TransMorph with eight baseline registration methods and four Transformer architectures.
The links will take you to their official repositories.

Baseline registration methods:
Training and inference scripts are in Baseline_registration_models/

  1. SyN/ANTsPy (Official Website)
  2. NiftyReg (Official Website)
  3. LDDMM (Official Website)
  4. deedsBCV (Official Website)
  5. VoxelMorph-1 & -2 (Official Website)
  6. CycleMorph (Official Website)
  7. MIDIR (Official Website)

Baseline Transformer architectures:
Training and inference scripts are in Baseline_Transformers/

  1. PVT (Official Website)
  2. nnFormer (Official Website)
  3. CoTr (Official Website)
  4. ViT-V-Net (Official Website)

JHU Brain MRI & Duke CT Dataset:

Due to restrictions, we cannot distribute our brain MRI and CT data. However, several brain MRI datasets are publicly available online: ADNI, OASIS, ABIDE, etc. Note that those datasets may not come with labels (segmentations). To generate labels, you can use FreeSurfer, an open-source software suite for normalizing and segmenting brain MRI images. Here are some useful commands in FreeSurfer: Brain MRI preprocessing and subcortical segmentation using FreeSurfer.

You may find our preprocessed IXI dataset in the next section.

Reproducible Results on IXI Dataset:

You may find the preprocessed IXI dataset, the pre-trained baseline and TransMorph models, and the training and inference scripts for IXI dataset here 👉 TransMorph on IXI

Reproducible Results on OASIS Dataset:

You may find the preprocessed OASIS dataset, the pre-trained baseline and TransMorph models, and the training and inference scripts for OASIS dataset here 👉 TransMorph on OASIS

Citation:

If you find this code useful in your research, please consider citing:

@article{chen2022transmorph,
title = {TransMorph: Transformer for unsupervised medical image registration},
journal = {Medical Image Analysis},
pages = {102615},
year = {2022},
issn = {1361-8415},
doi = {10.1016/j.media.2022.102615},
url = {https://www.sciencedirect.com/science/article/pii/S1361841522002432},
author = {Junyu Chen and Eric C. Frey and Yufan He and William P. Segars and Ye Li and Yong Du}
}

TransMorph Architecture:

Example Results:

Qualitative comparisons:

Uncertainty Estimate by TransMorph-Bayes:

Quantitative Results:

Inter-patient Brain MRI:

XCAT-to-CT:

Reference:

Swin Transformer
easyreg
MIDIR
VoxelMorph