Typo in forward of ResNetRecurrentGateSP
Closed this issue · 1 comment
volcacius commented
There is a typo in the forward method of ResNetRecurrentGateSP under CIFAR (it could be elsewhere as well; I haven't checked): grob is assigned instead of gprob. The wrong values are returned all the way up to the top-level function of the training loop, but thankfully they are not used.
for g in range(3):
    for i in range(0 + int(g == 0), self.num_layers[g]):
        ...
        mask, grob = self.control(gate_feature)
        gprobs.append(gprob)
        ...
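For reference, a sketch of how the corrected lines would read, keeping the surrounding loop from the snippet above and assuming the second value returned by self.control is indeed the gating probability:

for g in range(3):
    for i in range(0 + int(g == 0), self.num_layers[g]):
        ...
        # assign the second return value to gprob so the append below uses it
        mask, gprob = self.control(gate_feature)
        gprobs.append(gprob)
        ...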
xinw1012 commented
Thanks for pointing that out! I will update the code accordingly.
Best,
Xin