xxlong0/CNMNet

the weight of prob_loss

Xusyy opened this issue · 1 comment

Xusyy commented

Hi! I'm confused about the parameters used when adding 'prob_loss'. Why is the weight 'alpha' set to 0.2 in the paper, but to 1 in the code?
Also, since there is a function 'criterion_prob' in your code, did you use the prob_map_loss or other supervision information during training?

line 197 in train.py:

            prob_map_loss, prob_map_gt = criterion_prob(prob_map, idepth_refined, gt_disparity[:, 0, :, :, :])

            prob_loss = 5 * prob_loss_depth + prob_loss_minusmean  # + prob_map_loss

Thanks a lot!

In the code, we set:
prob_loss = 5 * prob_loss_depth + prob_loss_minusmean

The ratio of the second loss term to the first is 1/5 = 0.2, which is the alpha in our paper.
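In other words, the code's weighting and the paper's alpha differ only by an overall scale factor of 5, since 5·a + b = 5·(a + 0.2·b). A quick numeric check (using hypothetical loss values, not taken from the repository) illustrates the equivalence:

```python
# Hypothetical loss values for illustration only.
prob_loss_depth = 0.8
prob_loss_minusmean = 0.3

# Weighting as written in the code: 5 * depth term + minus-mean term.
code_form = 5 * prob_loss_depth + prob_loss_minusmean

# Equivalent paper-style form: depth term + alpha * minus-mean term,
# scaled by a constant 5 (a global scale does not change the gradient direction).
alpha = 0.2
paper_form = 5 * (prob_loss_depth + alpha * prob_loss_minusmean)

assert abs(code_form - paper_form) < 1e-12
print(code_form)  # both forms give the same total loss
```

So alpha = 0.2 in the paper and the `5 * ... + ...` weighting in the code express the same relative weighting between the two terms.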