UDUN (ACM-MM 2023)

Official Implementation of ACM-MM 2023 paper "Unite-Divide-Unite: Joint Boosting Trunk and Structure for High-accuracy Dichotomous Image Segmentation"

Jialun Pei, Zhangjun Zhou, Yueming Jin, He Tang, and Pheng-Ann Heng

[Paper]; [Official Version]

Contact: jialunpei@cuhk.edu.hk, hetang@hust.edu.cn

Environment preparation

Requirements

  • Linux with Python ≥ 3.8
  • PyTorch ≥ 1.7 and a torchvision version that matches the PyTorch installation
  • OpenCV
  • NumPy
  • Apex
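
A quick sanity check of the environment can be run with the short script below (a minimal sketch; the optional Apex check and the script name are not part of the repository):

# check_env.py -- minimal environment sanity check (hedged sketch)
import torch
import torchvision
import cv2
import numpy as np

print("PyTorch:", torch.__version__, "| torchvision:", torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())
print("OpenCV:", cv2.__version__, "| NumPy:", np.__version__)

try:
    from apex import amp  # NVIDIA Apex (mixed-precision utilities)
    print("Apex: available")
except ImportError:
    print("Apex: not installed")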

Dataset preparation

Download the datasets and annotation files

Register datasets

  1. Download the datasets and put them in the same folder. To match the folder names expected by the dataset mappers, do not rename the folders; the expected structure is shown below (a small layout-check sketch follows the tree):
    DATASET_ROOT/
    └── DIS5K/
        ├── DIS-TR/
        │   ├── im/
        │   ├── gt/
        │   ├── trunk-origin/
        │   └── struct-origin/
        ├── DIS-VD/
        │   ├── im/
        │   └── gt/
        ├── DIS-TE1/
        │   ├── im/
        │   └── gt/
        ├── DIS-TE2/
        │   ├── im/
        │   └── gt/
        ├── DIS-TE3/
        │   ├── im/
        │   └── gt/
        └── DIS-TE4/
            ├── im/
            └── gt/
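
Before training, the layout can be verified with a short script (a minimal sketch; DATASET_ROOT and the split names are taken from the tree above and must be adapted to your local path):

# check_layout.py -- hedged sketch: verify the DIS5K folder layout described above
import os

DATASET_ROOT = "path/to/DATASET_ROOT"  # assumption: set to your local dataset root
splits = {
    "DIS-TR": ["im", "gt", "trunk-origin", "struct-origin"],
    "DIS-VD": ["im", "gt"],
    "DIS-TE1": ["im", "gt"],
    "DIS-TE2": ["im", "gt"],
    "DIS-TE3": ["im", "gt"],
    "DIS-TE4": ["im", "gt"],
}

for split, subdirs in splits.items():
    for sub in subdirs:
        folder = os.path.join(DATASET_ROOT, "DIS5K", split, sub)
        print(folder, "->", "ok" if os.path.isdir(folder) else "MISSING")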

Pre-trained models 🔥

  • Download the pre-training weights and put them into UDUN-master/pre.

Model     | UDUN pretrain weights | $F_\omega$ $\uparrow$ | MAE $\downarrow$ | HCE $\downarrow$
----------|-----------------------|-----------------------|------------------|-----------------
ResNet-18 | UDUN-R18              | 0.807                 | 0.065            | 1009
ResNet-34 | UDUN-R34              | 0.818                 | 0.060            | 999
ResNet-50 | UDUN-R50              | 0.831                 | 0.057            | 977

Download the trained UDUN model weights and store them in UDUN-master/model.
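
A downloaded checkpoint can be sanity-checked by loading it with PyTorch and listing a few parameter tensors (a hedged sketch; the file name UDUN-R50.pth and the checkpoint layout are assumptions):

# inspect_ckpt.py -- hedged sketch: inspect a downloaded checkpoint
import torch

ckpt_path = "UDUN-master/model/UDUN-R50.pth"   # assumption: adjust to the file you downloaded
state = torch.load(ckpt_path, map_location="cpu")

# Some checkpoints wrap the weights, e.g. {"state_dict": ...}; unwrap if needed.
state_dict = state.get("state_dict", state) if isinstance(state, dict) else state.state_dict()

print("Number of tensors:", len(state_dict))
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))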

Visualization results ⚡

Visual results of our UDUN (ResNet-50 backbone) on the Overall DIS-TE set.

  • Results on the Overall DIS-TE test set: Quark/ Google

Visual results of other SOTA methods on the Overall DIS-TE set.

  • Results on the Overall DIS-TE test set: Quark/ Google

Usage

Train & Test

  • To train our UDUN on a single GPU, run the following command; the trained models will be saved in the savePath folder. Modify datapath if you want to train on your own datasets.
./train.sh
  • To test our UDUN on DIS5K, run the command below; the prediction maps will be saved in the DIS5K_Pre folder.
python3 test.py

Evaluation

  • To evaluate the predicted results, run the commands below (an MAE reference sketch follows).
cd metrics
python3 test_metrics.py
python3 hce_metric_main.py
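
For reference, the MAE reported by the evaluation is the mean absolute difference between the normalized prediction and ground-truth maps; the sketch below illustrates the computation (the folder paths are assumptions):

# mae_sketch.py -- hedged sketch: MAE between prediction maps and ground truth
import os
import cv2
import numpy as np

pred_dir = "DIS5K_Pre/DIS-VD"          # assumption: folder with predicted maps
gt_dir = "path/to/DIS5K/DIS-VD/gt"     # assumption: matching ground-truth folder

maes = []
for name in sorted(os.listdir(gt_dir)):
    gt = cv2.imread(os.path.join(gt_dir, name), cv2.IMREAD_GRAYSCALE)
    pred = cv2.imread(os.path.join(pred_dir, name), cv2.IMREAD_GRAYSCALE)
    if gt is None or pred is None:
        continue
    pred = cv2.resize(pred, (gt.shape[1], gt.shape[0]))
    maes.append(np.abs(pred.astype(np.float32) - gt.astype(np.float32)).mean() / 255.0)

print("MAE over", len(maes), "images:", float(np.mean(maes)))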

Get trunk map and struct map

  • Split the ground truth into trunk maps and struct maps; they will be saved to DIS5K-TR/gt/Trunk-origin and DIS5K-TR/gt/struct-origin (see the sketch after the commands below).
cd utils
python3 utils.py
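
utils.py is the authoritative implementation of the split; the sketch below only illustrates one common morphological way to obtain a trunk (interior) map and a structure (boundary-band) map from a binary mask, with the kernel size as an assumption:

# split_sketch.py -- hedged illustration: derive trunk/structure maps from a GT mask
import cv2
import numpy as np

gt = cv2.imread("path/to/gt_mask.png", cv2.IMREAD_GRAYSCALE)   # assumption: a binary GT mask
_, mask = cv2.threshold(gt, 127, 255, cv2.THRESH_BINARY)

kernel = np.ones((15, 15), np.uint8)    # assumption: band width controlled by the kernel size
eroded = cv2.erode(mask, kernel)
dilated = cv2.dilate(mask, kernel)

trunk = eroded                           # interior of the object
struct = cv2.subtract(dilated, eroded)   # band around the object boundary

cv2.imwrite("trunk.png", trunk)
cv2.imwrite("struct.png", struct)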

Acknowledgement

This work is based on:

Thanks for their great work!

Citation

If this helps you, please cite this work:

@inproceedings{pei2023udun,
  title={Unite-Divide-Unite: Joint Boosting Trunk and Structure for High-accuracy Dichotomous Image Segmentation},
  author={Pei, Jialun and Zhou, Zhangjun and Jin, Yueming and Tang, He and Heng, Pheng-Ann},
  booktitle={Proceedings of the 31st ACM International Conference on Multimedia},
  pages={1--10},
  year={2023},
}