Why is the bias of the FPN upsample conv `True`?
Yishun99 opened this issue · 2 comments
Yishun99 commented
globalNet.py
import torch.nn as nn

def _upsample(self):
    # Bilinear 2x upsample -> 1x1 conv -> BatchNorm
    layers = []
    layers.append(nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True))
    layers.append(nn.Conv2d(256, 256, kernel_size=1, stride=1, bias=True))
    layers.append(nn.BatchNorm2d(256))
    return nn.Sequential(*layers)
GengDavid commented
Sorry to say, this is a known bug in my code: the bias is supposed to be `False`. However, since the influence seems minor, I forgot to fix it.
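For anyone wondering why the bias is redundant rather than harmful: the `BatchNorm2d` that follows the conv subtracts the per-channel batch mean, so any constant bias the conv adds is canceled immediately. A minimal sketch verifying this (shapes chosen to match the snippet above; not part of the repo):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(2, 256, 8, 8)

# Two 1x1 convs sharing the same weights; one carries a bias, one does not.
conv_bias = nn.Conv2d(256, 256, kernel_size=1, bias=True)
conv_nobias = nn.Conv2d(256, 256, kernel_size=1, bias=False)
conv_nobias.weight.data.copy_(conv_bias.weight.data)

# In training mode, BatchNorm normalizes each channel by its batch statistics,
# so a constant per-channel offset added by the conv is subtracted right back out.
bn = nn.BatchNorm2d(256)
bn.train()

y_bias = bn(conv_bias(x))
y_nobias = bn(conv_nobias(x))

print(torch.allclose(y_bias, y_nobias, atol=1e-5))  # prints True
```

So `bias=True` only wastes 256 parameters and a tiny amount of compute, which matches the observation that the influence is small.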
Yishun99 commented
Thanks!