jacobkimmel/pytorch_modelsize

would it work for bottleneck layers?

ooodragon94 opened this issue · 3 comments

I'm assuming this won't work for bottleneck layers
are there alternatives for it?

"Bottleneck layer" can have many meanings, depending on the specific context.
So long as your dimensionality changes occur through submodules that appear in model.modules() (i.e. each is an instance of torch.nn.Module), this tool should work for you.

If your dimensionality changes happen through operations that exist only inside model.forward() (e.g. x.view(y, z) or nn.functional.max_pool2d(x)), there's no way to determine what operations you applied, and the tool will fail.
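To illustrate the distinction, here is a minimal sketch (the block name and layer sizes are hypothetical, not from this repo): layers registered as attributes show up in model.modules(), while functional calls and reshapes inside forward() do not.

```python
import torch.nn as nn
import torch.nn.functional as F

# A toy "bottleneck"-style block. The conv layers are registered
# submodules and appear in model.modules(), but the F.max_pool2d call
# and the .view reshape live only inside forward(), so a tool that
# walks model.modules() cannot see them.
class ToyBottleneck(nn.Module):
    def __init__(self):
        super().__init__()
        self.reduce = nn.Conv2d(64, 16, kernel_size=1)   # visible
        self.expand = nn.Conv2d(16, 64, kernel_size=1)   # visible

    def forward(self, x):
        x = self.reduce(x)
        x = F.max_pool2d(x, 2)           # invisible to model.modules()
        x = self.expand(x)
        return x.view(x.size(0), -1)     # invisible to model.modules()

visible = [type(m).__name__ for m in ToyBottleneck().modules()]
print(visible)  # only the block itself and the two Conv2d layers
```

Iterating modules() here yields just ['ToyBottleneck', 'Conv2d', 'Conv2d'] — the pooling and the reshape leave no trace.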

oh ok
thanks for the reply

So for max pooling or view, will I have to calculate the sizes manually?
Or does it mean it will fail if I use operations that don't inherit from nn.Module (unlike nn.Linear, nn.Conv2d, nn.BatchNorm)?

I'm still confused if it still works for nn.Sequential and nn.ModuleList. Are they also applicable?

Or does it mean it will fail if I use operations that don't inherit from nn.Module (unlike nn.Linear, nn.Conv2d, nn.BatchNorm)?

Yes, you need to use classes that inherit from nn.Module and appear in model.modules().
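As a sketch of how that might look in practice (the class name and sizes are hypothetical), PyTorch ships module equivalents for most functional operations: nn.MaxPool2d can stand in for F.max_pool2d, and nn.Flatten for an x.view reshape, so every shape-changing step is registered and enumerable.

```python
import torch.nn as nn

# Same toy block rewritten so that pooling and flattening are
# registered submodules; all shape changes now appear in modules().
class ToyBottleneckModular(nn.Module):
    def __init__(self):
        super().__init__()
        self.reduce = nn.Conv2d(64, 16, kernel_size=1)
        self.pool = nn.MaxPool2d(2)        # now visible
        self.expand = nn.Conv2d(16, 64, kernel_size=1)
        self.flatten = nn.Flatten()        # now visible

    def forward(self, x):
        return self.flatten(self.expand(self.pool(self.reduce(x))))

names = [type(m).__name__ for m in ToyBottleneckModular().modules()]
print(names)  # includes MaxPool2d and Flatten alongside the convs
```

Because the pooling and flattening steps are now attributes, a tool walking model.modules() can account for their effect on tensor shape.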

I'm still confused if it still works for nn.Sequential and nn.ModuleList. Are they also applicable?

It should work fine for those containers.
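For a quick check of why containers are fine (a minimal sketch with arbitrary layer sizes): model.modules() recurses into nn.Sequential and nn.ModuleList, so wrapped layers are still enumerated individually.

```python
import torch.nn as nn

# modules() walks the tree recursively, so layers nested inside
# nn.Sequential containers are yielded one by one.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3),
    nn.Sequential(nn.ReLU(), nn.MaxPool2d(2)),  # nested container
    nn.Flatten(),
    nn.Linear(8, 10),
)
leaf_names = [type(m).__name__ for m in model.modules()
              if not isinstance(m, nn.Sequential)]
print(leaf_names)  # every leaf layer, including those inside the nested Sequential
```

The containers themselves also appear in the iteration; the filter above keeps only the leaf layers.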

Hope that helps!