is the loss_conf right in loss function?
Opened this issue · 2 comments
Hsintao commented
Thanks for your work.
I have a question about this line:
loss_conf = ... + 0.5 * self.bce_loss(conf * noobj_mask, 0)
Why is the target a zero? Shouldn't it be
loss_conf = ... + 0.5 * self.bce_loss(conf * noobj_mask, tconf * noobj_mask)
?
@BobLiu20
And for classification, why don't you use a CrossEntropyLoss?
beckhamchen commented
I have the same doubt.
zhaoyang10 commented
The tconf of the noobj cells is 0, so tconf * noobj_mask is a tensor of all zeros and passing 0 as the target gives the same loss.
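A minimal pure-Python sketch of the equivalence (not the repo's actual code; the values below are made up for illustration): at positions where noobj_mask is 1 the target confidence tconf is 0, so using a literal 0 as the BCE target is the same as using tconf * noobj_mask.

```python
import math

def bce(pred, target):
    """Element-wise binary cross-entropy, averaged (a simplified stand-in
    for nn.BCELoss; eps guards against log(0))."""
    eps = 1e-7
    return -sum(t * math.log(p + eps) + (1 - t) * math.log(1 - p + eps)
                for p, t in zip(pred, target)) / len(pred)

# Hypothetical per-anchor values for one grid:
conf       = [0.9, 0.2, 0.7, 0.1]   # predicted objectness
noobj_mask = [0,   1,   0,   1]     # 1 where no object is assigned
tconf      = [1,   0,   1,   0]     # target confidence: 1 for obj, 0 for noobj

masked_conf  = [c * m for c, m in zip(conf, noobj_mask)]
masked_tconf = [t * m for t, m in zip(tconf, noobj_mask)]  # all zeros

loss_with_zero  = bce(masked_conf, [0.0] * len(conf))
loss_with_tconf = bce(masked_conf, masked_tconf)

# The two targets are identical wherever the mask is nonzero, so the losses match.
assert abs(loss_with_zero - loss_with_tconf) < 1e-12
```

So the literal 0 is just a shortcut: by construction tconf is already 0 at every no-object position.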