Trans-SVNet: hybrid embedding aggregation Transformer for surgical workflow analysis

by Yueming Jin, Yonghao Long, Xiaojie Gao, Danail Stoyanov, Qi Dou, Pheng-Ann Heng.

This work received the 1st Prize of the Best Paper Award of IJCARS-MICCAI 2021.

Introduction

Data

Setup & Training

  1. Check dependencies (a quick import check is sketched after this list):

    - pytorch 1.0+
    - opencv-python
    - numpy
    - scikit-learn (sklearn)
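
As a quick sanity check, the imports above can be verified with a short script like the one below. This is only a minimal convenience sketch, not part of the repository; it simply confirms that the packages import and reports their versions.

    # check_env.py -- quick sanity check for the dependencies listed above (illustrative only)
    import cv2            # opencv-python
    import numpy as np
    import sklearn
    import torch

    print("pytorch       :", torch.__version__)   # should be 1.0 or newer
    print("opencv-python :", cv2.__version__)
    print("numpy         :", np.__version__)
    print("scikit-learn  :", sklearn.__version__)
    print("CUDA available:", torch.cuda.is_available())
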
  2. Clone this repo

    git clone https://github.com/YuemingJin/Trans-SVNet_Journal
  3. Generate labels and prepare data path information

  • Run $ generate_phase_anticipation.py to generate the labels for workflow anticipation (a generic sketch of this kind of label is given after this list)

  • Run $ get_paths_labels.py to generate the files needed for training
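
For intuition, anticipation labels for surgical workflow are commonly defined as the remaining time until each phase next occurs, clipped to a fixed horizon. The snippet below is only a hypothetical sketch of that idea; the horizon, units, and clipping actually used by generate_phase_anticipation.py may differ.

    # anticipation_labels_sketch.py -- hypothetical sketch of remaining-time-to-phase labels.
    # The scheme implemented in generate_phase_anticipation.py may differ; the horizon,
    # units, and clipping below are assumptions made for illustration.
    import numpy as np

    def anticipation_labels(phases, num_phases, fps=1.0, horizon_min=5.0):
        """For every frame, return the time in minutes (clipped to `horizon_min`)
        until each phase next occurs. `phases` holds per-frame phase ids."""
        num_frames = len(phases)
        labels = np.full((num_frames, num_phases), horizon_min, dtype=np.float32)
        next_occurrence = np.full(num_phases, np.inf)  # next frame index of each phase
        for t in range(num_frames - 1, -1, -1):        # walk backwards through the video
            next_occurrence[phases[t]] = t
            remaining_min = (next_occurrence - t) / fps / 60.0
            labels[t] = np.minimum(remaining_min, horizon_min)
        return labels

    if __name__ == "__main__":
        demo = np.array([0] * 120 + [1] * 60 + [2] * 180)     # toy per-frame phase ids at 1 fps
        print(anticipation_labels(demo, num_phases=3).shape)  # (360, 3)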

  4. Training
  • Run $ train_embedding.py to train the ResNet50 backbone

  • Run $ generate_LFB.py to generate spatial embeddings

  • Run $ tecno.py to train the TCN for temporal modeling

  • Run $ tecno_trans.py to train the Transformer that aggregates the spatial and temporal embeddings (a simplified sketch of this aggregation follows)
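
For readers who want a feel for what the final stage trains, the snippet below is a heavily simplified sketch of the hybrid embedding aggregation idea: per-frame spatial embeddings from the ResNet50 backbone and temporal embeddings from the TCN are fused with attention before the phase prediction. It is not the repository's implementation; the module name, dimensions, query/key roles, and layer choices are assumptions made for illustration, and nn.MultiheadAttention with batch_first=True requires PyTorch 1.9+.

    # hybrid_aggregation_sketch.py -- simplified illustration of fusing spatial and
    # temporal embeddings with attention. NOT the exact Trans-SVNet architecture:
    # the dimensions, query/key roles, and layer choices are assumptions.
    import torch
    import torch.nn as nn

    class HybridAggregationSketch(nn.Module):
        def __init__(self, spatial_dim=2048, temporal_dim=64, d_model=64,
                     num_heads=8, num_classes=7):
            super().__init__()
            self.spatial_proj = nn.Linear(spatial_dim, d_model)    # project ResNet50 features
            self.temporal_proj = nn.Linear(temporal_dim, d_model)  # project TCN features
            self.attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
            self.classifier = nn.Linear(d_model, num_classes)

        def forward(self, spatial_emb, temporal_emb):
            """spatial_emb: (B, T, spatial_dim) embeddings of a window of recent frames.
            temporal_emb: (B, 1, temporal_dim) embedding of the current frame."""
            q = self.temporal_proj(temporal_emb)      # temporal embedding as the query
            kv = self.spatial_proj(spatial_emb)       # spatial embeddings as key/value
            fused, _ = self.attn(q, kv, kv)           # attend over the spatial window
            return self.classifier(fused.squeeze(1))  # (B, num_classes) phase logits

    if __name__ == "__main__":
        model = HybridAggregationSketch()
        spatial = torch.randn(2, 30, 2048)     # toy 30-frame window of spatial features
        temporal = torch.randn(2, 1, 64)       # toy current-frame temporal feature
        print(model(spatial, temporal).shape)  # torch.Size([2, 7])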

Testing

Our trained models can be downloaded from Dropbox.

  • Run $ trans_SV_output.py to generate the predictions for evaluation

We use the evaluation protocol of the M2CAI challenge to evaluate our method. Please refer to the TMRNet repository for the evaluation script.
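
As a quick, unofficial sanity check of the generated predictions, rough frame-level metrics can be computed with scikit-learn as sketched below. This does not replace the official evaluation protocol referenced above; the function and variable names are illustrative only.

    # quick_eval_sketch.py -- rough frame-level metrics for one video, computed from
    # ground-truth and predicted per-frame phase ids. This is only a sanity check and
    # does NOT reproduce the official M2CAI evaluation protocol referenced above.
    import numpy as np
    from sklearn.metrics import accuracy_score, jaccard_score, precision_score, recall_score

    def frame_level_metrics(gt, pred):
        """gt, pred: 1-D arrays of per-frame phase ids for one video."""
        return {
            "accuracy": accuracy_score(gt, pred),
            "precision": precision_score(gt, pred, average="macro", zero_division=0),
            "recall": recall_score(gt, pred, average="macro", zero_division=0),
            "jaccard": jaccard_score(gt, pred, average="macro"),
        }

    if __name__ == "__main__":
        gt = np.array([0, 0, 1, 1, 2, 2, 2])    # toy ground-truth phases
        pred = np.array([0, 1, 1, 1, 2, 2, 0])  # toy predictions
        print(frame_level_metrics(gt, pred))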

Note:
We take the training & testing procedure for the Cholec80 dataset (folder: ./code_80/) as an example.
For the M2CAI dataset, the same code can be used.
For the CATA dataset, the code is in the ./code_CATA/ folder, and the training & testing procedure is the same.

Citation

If this repository is useful for your research, please cite:

@ARTICLE{jin2022trans,  
  author={Jin, Yueming and Long, Yonghao and Gao, Xiaojie and Stoyanov, Danail and Dou, Qi and Heng, Pheng-Ann},  
  journal={International Journal of Computer Assisted Radiology and Surgery},   
  title={Trans-SVNet: hybrid embedding aggregation Transformer for surgical workflow analysis},
  volume={17},
  number={12},
  pages={2193--2202},
  year={2022},
  publisher={Springer}
}

Questions

For further questions about the code or the paper, please contact ymjin5341@gmail.com.