genforce/sefa

Some questions about your code

BurnWan opened this issue · 3 comments

sefa/utils.py

Line 185 in c60c536

weight = weight.flip(2, 3).permute(1, 0, 2, 3).flatten(1)

In this line, I wonder why you used torch.flip to flip the weight. I think there is no need to flip the weights before performing convolution in a CNN.

That is because we use convolution to replace the fully-connected layer in the official PGGAN.
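
For context, here is a minimal sketch (my own, not code from the repo) of how a convolution can stand in for that fully-connected layer: feed the latent code in as a 1x1 feature map and convolve it with a 4x4 kernel under padding 3, and you get a 4x4 output, but the single input pixel sweeps across the kernel in reverse order, so the equivalent dense weight is the spatially flipped kernel. The shapes and the padding value below are assumptions for illustration; the repo's actual hyperparameters may differ.

```python
# Minimal sketch (assumed shapes, not the repo's code): a conv over a 1x1
# latent map acts like a fully-connected layer with a flipped kernel.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
n, c_in, c_out = 2, 8, 16
z = torch.randn(n, c_in, 1, 1)            # latent code as a 1x1 feature map
weight = torch.randn(c_out, c_in, 4, 4)   # conv kernel with 4x4 spatial size

# Conv path: padding=3 lets the lone input pixel reach every kernel
# position, so the output is a full 4x4 map.
y_conv = F.conv2d(z, weight, padding=3)   # shape (n, c_out, 4, 4)

# FC path: the equivalent dense weight is the spatially *flipped* kernel,
# because the input pixel meets the kernel entries in reverse order.
w_fc = weight.flip(2, 3).permute(0, 2, 3, 1).reshape(c_out * 16, c_in)
y_fc = z.flatten(1) @ w_fc.t()            # shape (n, c_out * 16)

print(torch.allclose(y_conv.flatten(1), y_fc, atol=1e-5))  # True
```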

I see that. What I mean is: you use torch.conv2d(x, weight), which computes cross-correlation, so the result is y = x[0]w[0] + x[1]w[1] + x[2]w[2] + x[3]w[3], not y = x[0]w[3] + x[1]w[2] + x[2]w[1] + x[3]w[0]. In that case, the flip function is unnecessary.

x = F.conv2d(x,
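
A quick numerical check of that point (my own sketch, not from the repo): PyTorch's conv2d indeed computes cross-correlation, so no implicit kernel flip happens.

```python
# Minimal sketch: F.conv2d computes cross-correlation, i.e. the kernel is
# applied without flipping, exactly as described above.
import torch
import torch.nn.functional as F

x = torch.tensor([[[[1., 2., 3., 4.]]]])      # shape (1, 1, 1, 4)
w = torch.tensor([[[[10., 20., 30., 40.]]]])  # shape (1, 1, 1, 4)

y = F.conv2d(x, w)              # single output value
manual = (x * w).sum()          # x[0]w[0] + x[1]w[1] + x[2]w[2] + x[3]w[3]
print(y.item(), manual.item())  # both 300.0
```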

You are correct. We just want to make sure we get the same factorized results as with the official model.
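
For completeness, a small check (my own sketch, assuming a SeFa-style factorization that eigen-decomposes the Gram matrix of the flattened weight) of why the flip cannot change the factorized directions: flipping the spatial dimensions only permutes columns of the flattened matrix, and W Wᵀ is invariant under any column permutation.

```python
# Minimal sketch (my own check, not the repo's code): the flip is a column
# permutation of the flattened weight, and W @ W.T is unchanged by column
# permutations, so the eigenvectors (semantic directions) match either way.
import torch

torch.manual_seed(0)
c_out, c_in = 16, 8
conv_w = torch.randn(c_out, c_in, 4, 4)

flipped = conv_w.flip(2, 3).permute(1, 0, 2, 3).flatten(1)  # the quoted line
plain = conv_w.permute(1, 0, 2, 3).flatten(1)               # without the flip

# Both give the same Gram matrix, hence the same eigen-decomposition.
print(torch.allclose(flipped @ flipped.t(), plain @ plain.t(), atol=1e-5))  # True
```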