the input dim problem?
Closed this issue · 1 comment
LiangXu123 commented
def forward(self, x):
    b, c, t, d1, d2 = x.size()
    x = x.view(b*t, c, d1, d2)
    x = self.conv2d(x)
    dr1, dr2 = x.size(2), x.size(3)
    x = x.view(b*dr1*dr2, x.size(1), t)
    x = self.conv1d(x)
    x = x.view(b, self.out_channels, -1, dr1, dr2)
    return x
Shouldn't we permute the dimensions first here?
b, c, t, d1, d2 = x.size()
+++ x = x.permute(0, 2, 1, 3, 4)
x = x.view(b*t, c, d1, d2)
Otherwise x is not in the layout conv2d expects, i.e. [batch_size, channels, height, width].
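A minimal sketch of why the permute matters, using NumPy as a stand-in so it runs without a GPU stack (`np.transpose` plays the role of torch's `permute`, `reshape` the role of `view`; the shapes are made-up toy values):

```python
import numpy as np

# Toy shapes for illustration only: batch=2, channels=3, time=4, 5x5 frames.
b, c, t, d1, d2 = 2, 3, 4, 5, 5
x = np.arange(b * c * t * d1 * d2).reshape(b, c, t, d1, d2)

# Buggy: reshape alone reinterprets memory order, so the first "frame"
# contains time slices of channel 0 instead of all channels at time 0.
wrong = x.reshape(b * t, c, d1, d2)

# Fixed: move the time axis next to batch first (torch: x.permute(0, 2, 1, 3, 4)),
# then collapse batch and time together.
right = x.transpose(0, 2, 1, 3, 4).reshape(b * t, c, d1, d2)

print(np.array_equal(right[0], x[0, :, 0]))  # True: all channels at t=0
print(np.array_equal(wrong[0], x[0, :, 0]))  # False: wrong[0] is really x[0, 0, :3]
```

So without the permute, conv2d would convolve over slices that mix the channel and time axes rather than over per-timestep feature maps.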
proceduralia commented
I updated the code.
Thank you! 😄