Maybe a flag to omit custom nn.Modules
dsantiago opened this issue · 0 comments
dsantiago commented
Hello, sometimes showing the custom module itself in the summary is confusing. For example:
import torch.nn as nn

class Convolutional(nn.Module):
    def __init__(self, in_channels, out_channels):  # "in" is a reserved word, so spelled-out names
        super(Convolutional, self).__init__()
        self.layer = nn.Sequential(
            # kernel_size/stride/padding/bias inferred from the summary
            # output below (18,432 params, 416 -> 208 spatial size)
            nn.Conv2d(in_channels, out_channels, kernel_size=3,
                      stride=2, padding=1, bias=False),
            nn.BatchNorm2d(out_channels),
            nn.LeakyReLU(0.1, inplace=True),
        )

    def forward(self, x):
        return self.layer(x)
Results in:
        Convolutional-1    [-1, 32, 416, 416]             0
               Conv2d-2    [-1, 64, 208, 208]        18,432
          BatchNorm2d-3    [-1, 64, 208, 208]           128
            LeakyReLU-4    [-1, 64, 208, 208]             0
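(For reference, a fragment like the above would come from a call along these lines. This assumes the library in question is torchsummary; the instantiation and input size are my assumptions, since the fragment shown is taken from a larger 416x416 network, so layer indices and shapes differ slightly:)

from torchsummary import summary

# Hypothetical reproduction; channel counts and input size are
# inferred from the shapes shown above. device="cpu" avoids CUDA.
model = Convolutional(32, 64)
summary(model, input_size=(32, 416, 416), device="cpu")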
That "Convolutional" Module/Layer confuses the end result, worst yet when the net gets bigger. I would expect a way to just get the Conv -> Batch -> LeakyRelu Modules/Layers.