The code does not match the description of the paper
USTC-Keyanjie opened this issue · 4 comments
Thanks for the awesome contribution!
When I read the code, I found that the code did not match the paper.
According to the description of your paper, I think we should change
out = self.relu(self.bn1(self.conv1(x)))
out = torch.cat((out, side_input), 1)
to
x = self.relu(self.bn1(self.conv1(x)))
out = torch.cat((x, side_input), 1)
Looking forward to your reply!
(Quoting the code in question: CSPN/cspn_pytorch/models/torch_resnet_cspn_nyu.py, line 269 at commit 24eff12)
out = self.relu(self.bn1(self.conv1(x)))
I wonder, is there any difference in the variable out between the two versions?
out is indeed the same in both versions. However, x differs.
According to the description of the paper, x must pass through the conv, bn, and relu operations before entering the shortcut section.
In my opinion, the two versions look like: [diagrams comparing the two computation graphs; images not shown]
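To make the difference concrete, here is a minimal sketch of the two forward passes. The layer names (conv1, bn1, relu) follow the snippet quoted above; the shortcut branch and all channel sizes are placeholder assumptions, not the authors' exact module. It shows that out is identical in both versions, while the tensor a later shortcut(x) would see is not.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder layers standing in for the block discussed in this issue.
conv1 = nn.Conv2d(2, 2, 3, padding=1)
bn1 = nn.BatchNorm2d(2)
relu = nn.ReLU()
shortcut = nn.Conv2d(2, 3, 1)  # assumed downstream branch that reads x
bn1.eval()  # use fixed running stats so both calls are deterministic

def forward_committed(x, side_input):
    # As committed: x is never rebound, so shortcut(x) sees the raw input.
    out = relu(bn1(conv1(x)))
    out = torch.cat((out, side_input), 1)
    return out, shortcut(x)

def forward_paper(x, side_input):
    # Proposed fix: rebinding x means the shortcut sees conv->bn->relu(x),
    # as the paper describes.
    x = relu(bn1(conv1(x)))
    out = torch.cat((x, side_input), 1)
    return out, shortcut(x)

x = torch.randn(1, 2, 8, 8)
side = torch.randn(1, 1, 8, 8)
out_a, short_a = forward_committed(x, side)
out_b, short_b = forward_paper(x, side)
# out agrees between the versions; the shortcut input does not.
```

So the one-character rename has no effect on out itself, but it changes which tensor every later use of x refers to, which is exactly the point raised in this issue.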
You're right.
Thanks for your attention and detailed review.
Yes, this is a mistake; I will update it later.
Thanks