UNet for segmenting salt deposits from seismic images with PyTorch.
We, tugstugi and xuyuan, participated in the Kaggle competition TGS Salt Identification Challenge and finished in 9th place. This repository contains a simplified and cleaned-up version of our team's code, partially based on ideas from Heng CherKeng's posts on the Kaggle discussion board.
We used a single UNet model with a SENet154 encoder, which achieves a single-fold score of 0.882. With 10 folds using reflective padding and another 10 folds using resizing, we reached 0.890. The final private LB score of 0.892 was achieved by post-processing the model's output.
- single UNet model with a Squeeze-and-Excitation network encoder
- no ensembling, no pseudo labeling
- object context in the decoders and in the base
- symmetric extension of the Lovasz hinge loss function (+0.02 private LB improvement):
```python
def symmetric_lovasz(outputs, targets):
    # average the loss over the original problem and the inverted one (background treated as foreground)
    return (lovasz_hinge(outputs, targets) + lovasz_hinge(-outputs, 1 - targets)) / 2
```
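For intuition: `lovasz_hinge` optimizes a surrogate of the foreground IoU only, so negating the logits and inverting the targets applies the same surrogate to the background, and averaging the two terms makes the loss symmetric between the classes. A minimal sketch of calling it, assuming `lovasz_hinge` comes from one of the public Lovasz-hinge implementations and expects `(N, H, W)` logits with binary masks:

```python
import torch

# toy batch just to exercise the loss; during training the logits come from the UNet
logits = torch.randn(2, 101, 101, requires_grad=True)    # raw per-pixel logits, shape (N, H, W)
masks = (torch.rand(2, 101, 101) > 0.5).float()          # binary ground-truth masks

loss = symmetric_lovasz(logits, masks)
loss.backward()
```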
- Download and extract the dataset:
  - copy `train.csv` into `datasets/`
  - copy train images and masks into `datasets/train/`
  - copy test images into `datasets/test/`
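After these steps the data directory should look roughly like this (only the paths mentioned above; any sub-structure of the extracted Kaggle archive is left as it is):

```
datasets/
    train.csv
    train/    <- training images and masks
    test/     <- test images
```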
- Train SENet154-UNet for 250 epochs on 2x P100:
```bash
python train.py --vtf --pretrained imagenet --loss-on-center --batch-size 32 --optim adamw --learning-rate 5e-4 --lr-scheduler noam --basenet senet154 --max-epochs 250 --data-fold fold0 --log-dir runs/fold0 --resume runs/fold0/checkpoints/last-checkpoint-fold0.pth
```
- tensorboard logs, checkpoints and models are saved under `runs/`
- start tensorboard with `tensorboard --logdir runs`
- the training log of an LB 0.883 model is provided under `runs/lb0.883_fold0/`
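The `train.py` command above trains a single fold (`fold0`). The multi-fold results mentioned at the top presumably come from repeating the same command with matching `--data-fold` and `--log-dir` values; the fold names other than `fold0` are an assumption here, e.g.:

```bash
python train.py --vtf --pretrained imagenet --loss-on-center --batch-size 32 --optim adamw --learning-rate 5e-4 --lr-scheduler noam --basenet senet154 --max-epochs 250 --data-fold fold1 --log-dir runs/fold1 --resume runs/fold1/checkpoints/last-checkpoint-fold1.pth
```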
- Do SWA (stochastic weight averaging) on the best loss, accuracy and Kaggle metric models:
```bash
python swa.py --input runs/fold0/models --output fold0_swa.pth
```
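`swa.py` performs the averaging; as a rough illustration of the idea only (SWA here means averaging the weights of several saved models), here is a sketch under the assumption that the checkpoints are plain PyTorch state dicts. The helper and file names below are hypothetical and not the actual script:

```python
import torch

def average_checkpoints(paths):
    # Element-wise average of the parameters stored in several checkpoints.
    states = [torch.load(p, map_location="cpu") for p in paths]
    averaged = {}
    for key, value in states[0].items():
        if torch.is_floating_point(value):
            averaged[key] = torch.stack([s[key] for s in states]).mean(dim=0)
        else:
            # integer buffers (e.g. batch-norm step counters) are copied from the first checkpoint
            averaged[key] = value.clone()
    return averaged

# hypothetical file names -- the real checkpoints live under runs/fold0/models
swa_state = average_checkpoints(["best_loss.pth", "best_acc.pth", "best_kaggle.pth"])
torch.save(swa_state, "fold0_swa.pth")
```

Note that proper SWA also recomputes the batch-norm running statistics with a pass over the training data after averaging; the sketch above skips that step.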
- Create a Kaggle submission:
```bash
python test.py --tta fold0_swa.pth --output-prefix fold0
```
- a submission file `fold0-submission.csv` should now be created
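The `--tta` flag switches on test-time augmentation. As an illustration of the general idea only (not necessarily what `test.py` does internally), horizontal-flip TTA averages the prediction for an image with the mirrored-back prediction for its horizontal flip:

```python
import torch

@torch.no_grad()
def predict_with_hflip_tta(model, images):
    # images: (N, C, H, W); the model is assumed to return one logit per pixel
    logits = model(images)
    flipped_logits = model(torch.flip(images, dims=[3]))      # predict on the mirrored batch
    unflipped_logits = torch.flip(flipped_logits, dims=[3])   # mirror the prediction back
    return (torch.sigmoid(logits) + torch.sigmoid(unflipped_logits)) / 2
```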