Problem with hidden layers
lsteffenel opened this issue · 1 comment
Dear developers, I'm trying to use predrnn-pytorch, but I get the following error whenever I declare hidden layers that mimic a U-Net architecture ("64,32,64", for example). No error occurs when all the hidden layers have the same width ("64,64,64").
Do you have a hint on how to solve this?
Best regards.
Traceback (most recent call last):
File "run.py", line 223, in
train_wrapper(model)
File "run.py", line 191, in train_wrapper
trainer.train(model, ims, real_input_flag, args, itr)
File "/notebooks/predrnn-pytorch/core/trainer.py", line 15, in train
cost = model.train(ims, real_input_flag)
File "/notebooks/predrnn-pytorch/core/models/model_factory.py", line 42, in train
next_frames, loss = self.network(frames_tensor, mask_tensor)
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
return forward_call(*input, **kwargs)
File "/notebooks/predrnn-pytorch/core/models/predrnn_v2.py", line 92, in forward
h_t[i], c_t[i], memory, delta_c, delta_m = self.cell_list[i](h_t[i - 1], h_t[i], c_t[i], memory)
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
return forward_call(*input, **kwargs)
File "/notebooks/predrnn-pytorch/core/layers/SpatioTemporalLSTMCell_v2.py", line 49, in forward
m_concat = self.conv_m(m_t)
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
return forward_call(*input, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/container.py", line 204, in forward
input = module(input)
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
return forward_call(*input, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/conv.py", line 463, in forward
return self._conv_forward(input, self.weight, self.bias)
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/conv.py", line 459, in _conv_forward
return F.conv2d(input, weight, bias, self.stride,
RuntimeError: Given groups=1, weight of size [96, 32, 3, 3], expected input[8, 64, 32, 32] to have 32 channels, but got 64 channels instead
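For reference, the shape mismatch in the last line can be reproduced in isolation. With hidden widths "64,32,64", the second cell's conv_m is built for 32 input channels (hence the weight of size [96, 32, 3, 3]), but it receives the 64-channel spatiotemporal memory produced by the first layer. A minimal standalone sketch, assuming conv_m is a plain 3x3 Conv2d sized to that layer's hidden width as in SpatioTemporalLSTMCell_v2:

import torch
import torch.nn as nn

# Layer 1 was declared with 32 hidden channels, so its conv_m expects
# 32-channel input and produces 3 * 32 = 96 channels -> weight [96, 32, 3, 3].
conv_m_layer1 = nn.Conv2d(32, 3 * 32, kernel_size=3, padding=1)

# The memory M handed down from layer 0 has 64 channels
# (batch 8, 32x32 feature maps, matching the traceback).
memory_from_layer0 = torch.randn(8, 64, 32, 32)

conv_m_layer1(memory_from_layer0)  # RuntimeError: expected 32 channels, got 64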
Hi, sorry for this bug. The repo only supports configurations in which every layer has the same number of hidden channels: the spatiotemporal memory M is passed from one layer to the next, so with "64,32,64" the 32-channel cell's conv_m receives the 64-channel memory from the layer below it, which is the mismatch shown in the traceback. To build a U-Net-style model you would need to modify the layer module (and/or the stacking code) to reconcile the channel sizes between layers.
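One possible direction, sketched below under assumptions rather than as a tested fix: keep the existing cells and, in the stacking code (roughly where predrnn_v2.py loops over self.cell_list), project the shared memory M with a 1x1 convolution to the channel count each layer expects. The memory_adapters module and the loop below are illustrative and do not exist in the repository:

import torch
import torch.nn as nn

num_hidden = [64, 32, 64]          # the U-Net-style widths from the issue
batch, height, width = 8, 32, 32   # tensor sizes taken from the error message

# One 1x1 conv per layer transition, mapping the previous layer's memory
# channels to the channel count the next layer's conv_m expects.
memory_adapters = nn.ModuleList([
    nn.Conv2d(num_hidden[i - 1], num_hidden[i], kernel_size=1)
    for i in range(1, len(num_hidden))
])

# Inside the per-timestep loop, project M before handing it to cell i:
memory = torch.randn(batch, num_hidden[0], height, width)   # M coming out of layer 0
for i in range(1, len(num_hidden)):
    memory = memory_adapters[i - 1](memory)                 # now num_hidden[i] channels
    # h_t[i], c_t[i], memory, delta_c, delta_m = \
    #     self.cell_list[i](h_t[i - 1], h_t[i], c_t[i], memory)
    print(memory.shape)   # [8, 32, 32, 32] on the first pass, [8, 64, 32, 32] on the second

In the real model the adapters would have to be registered in __init__ so their parameters are trained. The hidden state h_t[i - 1] should not need the same treatment if each cell's input convolution is already built with the previous layer's width, as the traceback suggests (only conv_m fails). Other parts of the model, such as how the per-layer delta_c/delta_m terms are aggregated for the decoupling loss, may also assume equal widths, so this is only a starting point.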