sksq96/pytorch-summary

Double-counts parameters if the same layer is called twice.


In the SimpleConv example given in the README, the summary reports that the model has 20 parameters, but in reality it only has 10 trainable parameters.
Because self.features is called twice in forward, its parameters are double-counted.

Try the following:
print(model)
SimpleConv(
  (features): Sequential(
    (0): Conv2d(1, 1, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (1): ReLU()
  )
)
print(model.features[0].weight.numel(), model.features[0].bias.numel())
9 1
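
Counting the parameters directly confirms the real total is 10, not 20; a quick check with the standard parameters() iterator:

```python
# Count trainable parameters directly; parameters() yields each tensor once,
# no matter how many times the module is used in forward().
n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(n_params)  # 10  (9 conv weights + 1 bias)
```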

Model:

```python
import torch
import torch.nn as nn
from torchsummary import summary

class SimpleConv(nn.Module):
    def __init__(self):
        super(SimpleConv, self).__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 1, kernel_size=3, stride=1, padding=1),
            nn.ReLU(),
        )

    def forward(self, x, y):
        # The same self.features module is applied to both inputs,
        # so its parameters are shared, not duplicated.
        x1 = self.features(x)
        x2 = self.features(y)
        return x1, x2

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = SimpleConv().to(device)

summary(model, [(1, 16, 16), (1, 28, 28)])
```

```
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1            [-1, 1, 16, 16]              10
              ReLU-2            [-1, 1, 16, 16]               0
            Conv2d-3            [-1, 1, 28, 28]              10
              ReLU-4            [-1, 1, 28, 28]               0
================================================================
Total params: 20
Trainable params: 20
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.77
Forward/backward pass size (MB): 0.02
Params size (MB): 0.00
Estimated Total Size (MB): 0.78
----------------------------------------------------------------
```
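
A possible direction for a fix is to deduplicate by parameter identity while accumulating totals, so a module that runs several times in forward() is only counted once. The sketch below is only illustrative (summarize_param_count is a made-up helper, not part of torchsummary); it uses the same forward-hook style of counting but skips parameter tensors it has already seen:

```python
import torch
import torch.nn as nn

def summarize_param_count(model, *inputs):
    """Count trainable parameters via forward hooks, counting each tensor once.
    Illustrative sketch only, not torchsummary's actual implementation."""
    seen, total = set(), 0

    def hook(module, inp, out):
        nonlocal total
        # Only look at parameters owned directly by this module.
        for p in module.parameters(recurse=False):
            if p.requires_grad and id(p) not in seen:
                seen.add(id(p))
                total += p.numel()

    handles = [m.register_forward_hook(hook) for m in model.modules()]
    with torch.no_grad():
        model(*inputs)
    for h in handles:
        h.remove()
    return total

# Usage with the SimpleConv model above: the Conv2d hook fires twice,
# but its weight and bias are only counted on the first call.
x = torch.zeros(1, 1, 16, 16, device=device)
y = torch.zeros(1, 1, 28, 28, device=device)
print(summarize_param_count(model, x, y))  # 10, not 20
```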