kuangliu/torchcv

Question about focal loss implementation


```python
alpha = 0.25
gamma = 2

t = one_hot_embedding(y.data.cpu(), 1+self.num_classes)
t = t[:,1:]  # exclude background
t = Variable(t).cuda()

p = x.sigmoid()
pt = p*t + (1-p)*(1-t)         # pt = p if t > 0 else 1-p
w = alpha*t + (1-alpha)*(1-t)  # w = alpha if t > 0 else 1-alpha
w = w * (1-pt).pow(gamma)
return F.binary_cross_entropy_with_logits(x, t, w, size_average=False)
```
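
For reference, the focal loss as defined in the RetinaNet paper (Lin et al., 2017) is

$$\mathrm{FL}(p_t) = -\alpha_t \, (1 - p_t)^{\gamma} \, \log(p_t)$$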

Isn't `w` already the value of the focal loss by itself? Why is `F.binary_cross_entropy_with_logits` still applied on top of it?
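
To make the question concrete, here is a minimal sketch (my own toy code, not from the repo) of how I read the `weight` argument in that call:

```python
import torch
import torch.nn.functional as F

# Toy shapes, just for illustration (not the repo's actual data).
x = torch.randn(8, 20)                 # logits, [N, num_classes]
t = (torch.rand(8, 20) > 0.9).float()  # one-hot-style targets

alpha, gamma = 0.25, 2
p = x.sigmoid()
pt = p*t + (1-p)*(1-t)                 # pt = p if t > 0 else 1-p
w = alpha*t + (1-alpha)*(1-t)          # w = alpha if t > 0 else 1-alpha
w = w * (1-pt).pow(gamma)

# As I understand it, `weight` only rescales the per-element BCE terms,
# so this call computes sum(w * bce) rather than returning w itself.
loss_weighted = F.binary_cross_entropy_with_logits(x, t, w, reduction='sum')
bce = F.binary_cross_entropy_with_logits(x, t, reduction='none')
print(torch.allclose(loss_weighted, (w * bce).sum()))  # expected: True
```

(I am using `reduction='sum'` here, which I believe is the newer equivalent of the `size_average=False` used in the repo's code.)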