A dangling layer in forward call?
Closed this issue · 1 comment
InnovArul commented
Just came across this implementation and noticed the usage of a non-registered (dangling) layer in InvertedResidualBlock's forward call.
When this line is executed, a new conv layer will be created every time & it won't be trained by the optimizer.
Is it for any particular usecase?
Lines 58 to 60 in 50e159d
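For illustration, here is a minimal sketch of the pattern being described; the class names, channel sizes, and layer shapes are hypothetical and not the actual code at lines 58 to 60. A layer constructed inside forward() is re-created (re-initialized) on every call, and nn.Module never registers it, so its weights never reach the optimizer:

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the problematic pattern: a layer built inside
# forward() gets fresh random weights on every call and is invisible to
# self.parameters(), hence never updated by the optimizer.
class BuggyBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        out = self.conv1(x)
        # Dangling layer: constructed anew each forward pass.
        out = nn.Conv2d(out.size(1), out.size(1), kernel_size=1)(out)
        return out

# Fixed pattern: declare the layer once in __init__ so it is registered
# as a submodule and its parameters appear in self.parameters().
class FixedBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x):
        return self.conv2(self.conv1(x))

if __name__ == "__main__":
    x = torch.randn(1, 16, 8, 8)
    # BuggyBlock exposes only conv1's weight and bias (2 tensors);
    # FixedBlock exposes conv1 and conv2 (4 tensors).
    print(len(list(BuggyBlock(16).parameters())))  # 2
    print(len(list(FixedBlock(16).parameters())))  # 4
```

Running the sketch shows the dangling layer's parameters never show up in parameters(); the fix is simply to move the layer construction into __init__.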
lafith commented
@InnovArul Thank you for pointing it out. I have fixed it in the latest commit.