GengDavid/pytorch-cpn

Why is the bias at the FPN upsample conv 'True'?

Yishun99 opened this issue · 2 comments

globalNet.py

    def _upsample(self):
        layers = []
        layers.append(nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True))
        layers.append(nn.Conv2d(256, 256,
            kernel_size=1, stride=1, bias=True))
        layers.append(nn.BatchNorm2d(256))

        return nn.Sequential(*layers)

Sorry to say, this is a known bug in my code; it should be False.
However, since the conv is immediately followed by BatchNorm2d, which subtracts the per-channel mean, the bias has essentially no effect, so I never got around to fixing it.
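A minimal NumPy sketch (not the repository's code) of why the bias is harmless here: at training time, batch normalization subtracts the per-channel mean, so any constant per-channel bias added by the preceding conv is cancelled exactly (up to floating-point error). The `batch_norm` helper below is a hypothetical stand-in for `nn.BatchNorm2d` with affine parameters omitted.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each channel over batch and spatial dims,
    # as BatchNorm2d does at training time (affine params omitted).
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 256, 8, 8))     # conv output without bias
bias = rng.standard_normal((1, 256, 1, 1))  # a per-channel conv bias

out_no_bias = batch_norm(x)
out_with_bias = batch_norm(x + bias)

# The mean subtraction absorbs the constant bias, so both
# outputs match up to floating-point error.
print(np.max(np.abs(out_no_bias - out_with_bias)))
```

So with `bias=True` the conv just learns redundant parameters (and wastes a little memory); it does not change the network's output, which is why the fix is low priority.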

Thanks!