First, unzip the dataset and move each sub-dataset (1-8) to ./data/train, as sketched below.
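A minimal setup sketch, assuming the release is a single zip and the sub-datasets are folders named 1-8; the archive name below is hypothetical:

```python
import shutil
import zipfile
from pathlib import Path

archive = Path("dataset.zip")  # hypothetical archive name; adjust to the actual release
train_dir = Path("data/train")
train_dir.mkdir(parents=True, exist_ok=True)

# Unzip into a staging folder, then move each sub-dataset (1-8) into ./data/train.
with zipfile.ZipFile(archive) as zf:
    zf.extractall("unzipped")
for i in range(1, 9):
    shutil.move(str(Path("unzipped") / str(i)), str(train_dir / str(i)))
```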
Problems:
- How to handle attention maps when training is paused and resumed? (See the checkpointing sketch after this list.)
- Sigmoid vs. softmax in the attention module (not yet trained).
- BatchNorm behavior in training vs. validation mode.
- (IMPORTANT) Whether to apply a softmax operation in the attention module.
- (IMPORTANT) Whether the final attention operation should combine the attention map with the features via +, *, or a conv2d with kernel size 1 (see the sketch after this list).
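To make the two IMPORTANT items concrete, here is a minimal PyTorch sketch of the design space (class and argument names are illustrative, not this repo's API): the attention logits can be normalized with a per-pixel sigmoid or a spatial softmax, and the normalized map can be fused with the features by addition, elementwise multiplication, or a conv2d with kernel size 1.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionFusion(nn.Module):
    """Illustrative module comparing the candidate attention operations."""

    def __init__(self, channels, mode="mul", normalize="sigmoid"):
        super().__init__()
        self.mode = mode
        self.normalize = normalize
        if mode == "conv":
            # Fuse concatenated features and attention map with a 1x1 conv.
            self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, features, attn_logits):
        # Assumes attn_logits has the same shape as features.
        if self.normalize == "sigmoid":
            # Independent [0, 1] weight per spatial location.
            attn = torch.sigmoid(attn_logits)
        else:
            # Softmax over spatial positions: each channel sums to 1.
            b, c, h, w = attn_logits.shape
            attn = F.softmax(attn_logits.view(b, c, -1), dim=-1).view(b, c, h, w)

        if self.mode == "add":
            return features + attn
        if self.mode == "mul":
            return features * attn
        return self.fuse(torch.cat([features, attn], dim=1))  # mode == "conv"
```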
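For the pause/resume and BatchNorm items, a hedged sketch of the standard PyTorch handling (function names and the checkpoint path are assumptions): attention maps are activations recomputed on every forward pass, so checkpointing the parameter state is normally enough, and BatchNorm switches between batch and running statistics via model.train()/model.eval().

```python
import torch

def save_checkpoint(model, optimizer, epoch, path="checkpoint.pt"):
    # state_dict() captures all learnable parameters, including the attention
    # module's; the attention maps themselves are recomputed after resuming.
    torch.save({"epoch": epoch,
                "model": model.state_dict(),
                "optimizer": optimizer.state_dict()}, path)

def load_checkpoint(model, optimizer, path="checkpoint.pt"):
    ckpt = torch.load(path)
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optimizer"])
    return ckpt["epoch"]

# BatchNorm uses batch statistics in train mode and running statistics in
# eval mode, so switch modes explicitly around the validation loop:
#   model.train()                        -> training steps
#   model.eval(); with torch.no_grad():  -> validation steps
```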
Handle:
- Dice for binary classification (see the metric sketch after this list).
- Metrics for multiclass classification (not yet debugged).
- Larger attention maps.
- Multiple config.py files (for preprocess_data.py).
- Semi-supervised model.
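For the first two items, a minimal sketch of a binary Dice metric and a per-class extension for the multiclass case (function names are assumptions, not the repo's API):

```python
import torch

def dice_binary(pred, target, eps=1e-7):
    """Dice coefficient for binary masks (tensors of 0s and 1s)."""
    pred = pred.float().view(-1)
    target = target.float().view(-1)
    intersection = (pred * target).sum()
    return (2 * intersection + eps) / (pred.sum() + target.sum() + eps)

def dice_multiclass(pred, target, num_classes, eps=1e-7):
    """Mean Dice over classes; pred and target hold class indices."""
    scores = [dice_binary(pred == c, target == c, eps) for c in range(num_classes)]
    return torch.stack(scores).mean()
```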
TRAIN_RECORD for each model (binary segmentation, not resized):

| Model  | Batch size | Learning rate | Epochs | GPU         | Validation IoU     | Validation Dice    |
|--------|------------|---------------|--------|-------------|--------------------|--------------------|
| UNet   | 8          | 1e-5          | 20     | 4 (TITAN V) | 0.6728385997379875 | 0.784111333214352  |
| UNet11 | 8          | 1e-5          | 20     | 4 (TITAN V) | 0.8175544938356124 | 0.8885956894127561 |
| UNet16 | 8          | 1e-5          | 20     | 4 (TITAN V) |                    |                    |