meetps/pytorch-semseg

cross entropy reshaping lines

brucemuller opened this issue · 0 comments

Hi, thank you for your efforts!

In the cross_entropy2d function in loss.py, I noticed these reshaping lines:

input = input.transpose(1, 2).transpose(2, 3).contiguous().view(-1, c)  # (n, c, h, w) -> (n*h*w, c)
target = target.view(-1)  # (n, h, w) -> (n*h*w,)

These run just before the tensors are passed to the loss calculation. Are they there for efficiency, or what is their main purpose?
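For context, here is a minimal, self-contained sketch (not from the repo) of what those two lines do to the tensor shapes, assuming the usual (n, c, h, w) logits and (n, h, w) integer-target layout; the variable names and toy sizes below are my own:

import torch
import torch.nn.functional as F

# Toy logits and target in the layout cross_entropy2d expects:
# input is (n, c, h, w), target is (n, h, w) holding class indices.
n, c, h, w = 2, 3, 4, 5
input = torch.randn(n, c, h, w)
target = torch.randint(0, c, (n, h, w))

# The two reshaping lines: move the class dimension last, then flatten
# every pixel into its own row, so each row is one c-way prediction.
input_flat = input.transpose(1, 2).transpose(2, 3).contiguous().view(-1, c)  # (n*h*w, c)
target_flat = target.view(-1)  # (n*h*w,)

# Per-pixel cross entropy over the flattened tensors.
loss_flat = F.cross_entropy(input_flat, target_flat)

# Recent PyTorch versions also accept (n, c, h, w) logits with (n, h, w)
# targets directly, which should give the same mean-reduced value.
loss_direct = F.cross_entropy(input, target)
print(loss_flat.item(), loss_direct.item())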