lafith/Mobile-UNet

A dangling layer in forward call?

Closed · 1 comment

I just came across this implementation and noticed the use of a non-registered (dangling) layer in InvertedResidualBlock's forward call.
Every time this line is executed, a new conv layer is created with fresh random weights, and its parameters are never registered, so the optimizer never trains them.
Is this intentional for some particular use case?

Mobile-UNet/model.py

Lines 58 to 60 in 50e159d

if self.in_c != self.out_c:
    x = nn.Conv2d(self.in_c, self.out_c, 1, 1, 0, bias=False)(x)
return x+out
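
For context, here is a minimal standalone snippet (not from this repository, just an illustration of the same pattern) showing why a layer constructed inside forward is a problem: it never shows up in model.parameters(), and each call builds a new randomly initialized conv.

```python
import torch
import torch.nn as nn

class Dangling(nn.Module):
    def __init__(self):
        super().__init__()
        # Registered submodule: its weights appear in model.parameters().
        self.body = nn.Conv2d(8, 8, 3, padding=1)

    def forward(self, x):
        out = self.body(x)
        # Created fresh on every call: new random weights each time,
        # never registered, so the optimizer never updates them.
        proj = nn.Conv2d(8, 16, 1, 1, 0, bias=False)
        return proj(out)

m = Dangling()
print(sum(p.numel() for p in m.parameters()))  # counts only self.body's weights
x = torch.randn(1, 8, 32, 32)
print(torch.allclose(m(x), m(x)))  # typically False: a different random projection per call
```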

@InnovArul Thank you for pointing it out. I have fixed it in the latest commit.
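
For anyone reading this later, the usual remedy is to register the 1x1 shortcut projection as a submodule in __init__ so the optimizer can see and update it. A minimal sketch of that pattern (class and attribute names here are illustrative, not necessarily what the fixed commit uses):

```python
import torch.nn as nn

class ResidualShortcutSketch(nn.Module):
    """Only the shortcut-handling part of an inverted residual block;
    the main branch is abstracted as self.body (an assumption, not the repo's code)."""

    def __init__(self, in_c, out_c):
        super().__init__()
        # Stand-in for the block's main (expand -> depthwise -> project) branch.
        self.body = nn.Conv2d(in_c, out_c, 3, 1, 1, bias=False)
        # Register the 1x1 projection once, so it is trained with the rest
        # of the model; use Identity when the channel counts already match.
        self.shortcut = (
            nn.Conv2d(in_c, out_c, 1, 1, 0, bias=False)
            if in_c != out_c
            else nn.Identity()
        )

    def forward(self, x):
        out = self.body(x)
        return self.shortcut(x) + out
```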