foolwood/SiamMask

mask_iou_mean, mask_iou_at_5, mask_iou_at_7 = 0?


Hello, I would like to ask why the metrics mask_iou_mean, mask_iou_at_5, and mask_iou_at_7 are 0 during training.
I only used the ytb_vos dataset for training. Thank you!

I ran into the same problem recently, and I traced it to iou_measure in siammask.py/siammask_sharp.py. There, mask_sum = pred.eq(1).add(label.eq(1)) can never equal 2, so in the next line, intxn = torch.sum(mask_sum == 2, dim=1).float() is always 0.
I think it is a PyTorch version issue. The PyTorch I use is 1.5.1, but the official code targets PyTorch 0.4.0. Hope this helps.
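For anyone who wants to verify this, here is a minimal sketch of the dtype behavior (the tensors are made-up toy values, not taken from the repository):

```python
import torch

pred  = torch.tensor([1, 0, 1])
label = torch.tensor([1, 1, 0])

# On recent PyTorch (>= 1.2), eq() returns torch.bool, and adding two bool
# tensors acts as a logical OR, so the result saturates at True and the
# comparison with 2 below is never satisfied.
mask_sum = pred.eq(1).add(label.eq(1))
print(mask_sum.dtype)                    # torch.bool
print(torch.sum(mask_sum == 2).item())   # 0 -> intersection always "empty"

# On PyTorch 0.4.x, eq() returned uint8, so the same sum could reach 2.
```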

Thank you! I got it and have solved it! Thanks again.

@StarrySky-SHT Hi, have you encountered the following problem? When I train with train_siammask_refine, I get WARNING:root:NaN or Inf found in input tensor.


I ran into that problem just this morning, and it confused me for a long time, but I think I have it sorted out now. I lowered the learning rate in experiments/siammask_sharp/config.json to start_lr = 0.001, end_lr = 0.00025. The problem seems to be gone, though I am not sure this is the proper fix.
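For reference, the change would look something like this in experiments/siammask_sharp/config.json. Only the start_lr and end_lr values come from the comment above; the surrounding structure is assumed, and any other keys in the lr block are omitted:

```json
"lr": {
    "start_lr": 0.001,
    "end_lr": 0.00025
}
```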


OK, thank you! I will try it. Thanks again.

@StarrySky-SHT Hi! Have you encountered the following problem? When I train train_siammask_refine, I get: ValueError: loaded state dict has a different number of parameter groups
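That ValueError is raised by torch.optim.Optimizer.load_state_dict whenever the checkpoint's optimizer state was saved with a different number of parameter groups than the optimizer it is being restored into. A minimal sketch reproducing the message (the two-group split here is a made-up example, not SiamMask's actual grouping):

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 2)

# Optimizer saved with a single parameter group.
opt_saved = torch.optim.SGD(net.parameters(), lr=0.01)
state = opt_saved.state_dict()

# Optimizer rebuilt with two parameter groups (e.g. different lrs per module).
opt_new = torch.optim.SGD([
    {"params": [net.weight], "lr": 0.01},
    {"params": [net.bias],   "lr": 0.001},
])

# Raises: ValueError: loaded state dict has a different number of parameter groups
opt_new.load_state_dict(state)
```

In this repository, that would suggest the optimizer built for the refine stage does not have the same parameter-group layout as the one stored in the snapshot being resumed.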


Thanks a lot!

This error is caused by the mask's dtype: in PyTorch 0.4.x the comparison operators return integer (uint8) tensors, but in newer PyTorch versions they return boolean tensors, so the elementwise sum can never reach 2. You can fix it by modifying line 179 in models/siammask.py and line 183 in models/siammask_sharp.py:

mask_sum = pred.eq(1).int().add(label.eq(1).int())
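In context, the corrected helper would look roughly like this. Only the mask_sum line is the confirmed fix from this thread; the surrounding lines are a sketch of the repository's iou_measure and may differ slightly from the actual file:

```python
import torch

def iou_measure(pred, label):
    pred = pred.ge(0)  # binarize predicted mask logits
    # Cast the boolean comparison results to int before adding; on newer
    # PyTorch, bool + bool stays bool and could never reach 2.
    mask_sum = pred.eq(1).int().add(label.eq(1).int())
    intxn = torch.sum(mask_sum == 2, dim=1).float()  # both masks are 1
    union = torch.sum(mask_sum > 0, dim=1).float()   # either mask is 1
    iou = intxn / union
    return torch.mean(iou), \
           torch.sum(iou > 0.5).float() / iou.shape[0], \
           torch.sum(iou > 0.7).float() / iou.shape[0]
```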