CAS-Transformer

This project provides a benchmark for virtual staining tasks.


Efficient Supervised Pretraining of Swin-transformer for Virtual Staining of Microscopy Images

Introduction

This project is based on the following projects:

Framework of our method

[Figure: overall framework of the proposed method]

Usage

Requirements

  • torch == 1.12.1
  • timm == 0.6.11

Install

  • Clone this repo:
git clone https://github.com/birkhoffkiki/CAS-Transformer.git
cd CAS-Transformer

Data preparation

We use three different datasets in this project; you can download them from the following addresses.

ISL dataset

Download the data and save it to the data/ISL/ directory.

cd data/scripts
python crop_patches.py --data_type test
python crop_patches.py --data_type train
python virtual_split_images.py --data_type test
python virtual_split_images.py --data_type train
# check data integrity, see check_data_integrity.py
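The cropping step above can be sketched as follows. This is an illustrative, stdlib-plus-NumPy sketch, not the actual crop_patches.py; the 256-pixel patch size and non-overlapping grid are assumptions.

```python
import numpy as np

def crop_patches(image: np.ndarray, patch_size: int = 256):
    """Split an H x W x C image into non-overlapping square patches,
    discarding any border strip that does not fill a full patch."""
    h, w = image.shape[:2]
    patches = []
    for top in range(0, h - patch_size + 1, patch_size):
        for left in range(0, w - patch_size + 1, patch_size):
            patches.append(image[top:top + patch_size, left:left + patch_size])
    return patches

# Example: a 600x520 image yields a 2x2 grid of 256x256 patches.
patches = crop_patches(np.zeros((600, 520, 3), dtype=np.uint8))
```

The real script likely also writes each patch to disk and splits by condition; see crop_patches.py for the actual behavior.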

BCI dataset

Download this dataset and unzip it to the data/BCI directory.

Aperio-Hamamatsu dataset

Download this dataset and unzip it to the data/Aperio directory.

Pretrain the model

# the config file is located at configs/pretrian.yaml
# adjust the parameters to suit your setup
# remember to update the paths set in the config file
bash pretrain.sh
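Since the config paths must be edited before launching, a small stdlib-only helper like the following can rewrite them in place. The `data_root` key name is an assumption about the YAML layout; substitute whatever keys the actual config file uses.

```python
import re

def set_config_paths(config_text: str, overrides: dict) -> str:
    """Replace top-level `key: value` lines in a YAML config with new
    values, touching only the keys listed in `overrides`."""
    for key, value in overrides.items():
        config_text = re.sub(
            rf"(?m)^{re.escape(key)}:\s*.*$", f"{key}: {value}", config_text
        )
    return config_text

# Hypothetical config fragment, rewritten to point at a local dataset.
sample = "data_root: /old/path\nepochs: 100\n"
updated = set_config_paths(sample, {"data_root": "/mnt/datasets/ISL"})
```

Editing the YAML by hand works just as well; this only helps when the same change must be applied across the several per-dataset configs.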

Train the model

Train the model on the ISL dataset

# the config file is located at configs/ISL/train.yaml
# adjust the parameters to suit your setup
# remember to update the paths set in the config file
bash train_isl.sh

Train the model on the BCI dataset

# the config file is located at configs/BCI/train.yaml
# adjust the parameters to suit your setup
# remember to update the paths set in the config file
bash train_bci.sh

Train the model on the Aperio-Hamamatsu dataset

# the config file is located at configs/AperioData/train.yaml
# adjust the parameters to suit your setup
# remember to update the paths set in the config file
bash train_aperio.sh

Evaluate and predict

Evaluate on the ISL dataset

# make sure the dataset path is set correctly before running
python predict_isl.py

Evaluate on the BCI dataset

# make sure the dataset path is set correctly before running
python predict_bci.py

Evaluate on the Aperio dataset

# make sure the dataset path is set correctly before running
python predict_aperio.py
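The results below are reported as PSNR and SSIM. As a sanity check on reported numbers, PSNR can be computed from scratch in a few lines; this is the standard definition, not necessarily the exact implementation used in the predict scripts.

```python
import numpy as np

def psnr(pred: np.ndarray, target: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two images of the same shape."""
    mse = np.mean((pred.astype(np.float64) - target.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# A constant offset of 1 on an 8-bit image gives MSE = 1,
# so PSNR = 20 * log10(255) ≈ 48.13 dB.
gt = np.zeros((64, 64), dtype=np.uint8)
noisy = gt + 1
print(round(psnr(noisy, gt), 2))  # -> 48.13
```

SSIM involves local means, variances, and covariances and is best taken from an established implementation (e.g. scikit-image's `structural_similarity`) rather than re-derived.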

Main Results

Performance on the ISL dataset

| Method | A PSNR | A SSIM | B PSNR | B SSIM | C PSNR | C SSIM | D PSNR | D SSIM | Avg PSNR | Avg SSIM |
|---|---|---|---|---|---|---|---|---|---|---|
| Ours | 24.64 | 0.888 | 28.31 | 0.891 | 33.79 | 0.972 | 23.39 | 0.761 | 28.38 | 0.888 |
| Cross et al. | 23.48 | 0.859 | 27.46 | 0.876 | 32.26 | 0.967 | 22.55 | 0.738 | 27.36 | 0.873 |
| Bai et al. | 23.61 | 0.869 | 26.97 | 0.865 | 31.97 | 0.967 | 22.46 | 0.712 | 27.03 | 0.860 |
| Liu et al. | 18.34 | 0.750 | 22.11 | 0.830 | 26.79 | 0.933 | 18.54 | 0.677 | 22.20 | 0.821 |
| Eric et al. | 24.67 | 0.886 | 28.10 | 0.870 | 34.62 | 0.967 | 22.56 | 0.708 | 28.32 | 0.868 |

Performance on the BCI dataset

| Metric | Liu et al. | Zhu et al. | Isola et al. | Bai et al. | Ours |
|---|---|---|---|---|---|
| PSNR | 18.90 | 17.57 | 19.93 | 21.45 | 22.21 |
| SSIM | 0.602 | 0.517 | 0.528 | 0.529 | 0.566 |

Performance on the Aperio-Hamamatsu dataset

| Metric | StainNet | StainGAN | Reinhard | Vahadane | Bai et al. | Ours |
|---|---|---|---|---|---|---|
| PSNR | 22.50 | 22.40 | 22.45 | 21.62 | 24.09 | 24.84 |
| SSIM | 0.691 | 0.703 | 0.638 | 0.659 | 0.754 | 0.768 |

Citation

@ARTICLE{10328980,
  author={Ma, Jiabo and Chen, Hao},
  journal={IEEE Transactions on Medical Imaging},
  title={Efficient Supervised Pretraining of Swin-transformer for Virtual Staining of Microscopy Images},
  year={2023},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TMI.2023.3337253}
}

Contact

If you have any questions, please feel free to contact me: