sunshineatnoon/PytorchWCT

Content covariance matrix formula different from the author?

NCTUyoung opened this issue · 4 comments

I have found something different about the way you calculate the content covariance matrix.

In the Torch implementation:
contentCov = torch.mm(contentFeature1, contentFeature1:t()):div(sg[2]*sg[3]-1)
However, in the PyTorch and TensorFlow versions:
contentConv = torch.mm(cF,cF.t()).div(cFSize[1]-1) + torch.eye(cFSize[0]).double()

We add one to each diagonal term; why should we do that?
It seems better to do this, because it produces a better image compared to the version without the addition.
It would be great if you could help me figure out this question.
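
For concreteness, here is a minimal runnable sketch of the two variants, with a random tensor standing in for a real VGG feature map of shape (channels, H*W):

import torch

# cF stands in for a content feature flattened to (C, H*W); the real
# code uses VGG activations, but a random tensor is enough to compare.
cF = torch.randn(64, 32 * 32).double()
C, N = cF.size(0), cF.size(1)

# Torch (Lua) version: the plain sample covariance.
contentCov = torch.mm(cF, cF.t()).div(N - 1)

# PyTorch/TensorFlow version: the same covariance with the identity
# added, i.e. 1 added to every diagonal entry.
contentConv = torch.mm(cF, cF.t()).div(N - 1) + torch.eye(C).double()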

Wow, you caught this. This is actually for WCT with multi-masks; I haven't included that code yet, and you don't need it for the WCT in this repository. The main reason is that there seem to be some bugs in torch.diag: if I use masks, sometimes k_s becomes small and torch.diag gives an invalid argument error. I haven't figured out why, but for now this identity matrix prevents the error.
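
Roughly, the failure mode looks like this (a sketch; the 1e-5 cutoff and the k_s counting mirror the thresholding in this repository's whitening step, details may differ):

import torch

# With a mask keeping only a few pixels, the covariance is rank-deficient,
# so almost all eigenvalues are numerically zero and k_s collapses.
cF = torch.randn(64, 3).double()                 # only 3 masked pixels
conv = torch.mm(cF, cF.t()).div(3 - 1)           # rank <= 3

_, e, _ = torch.svd(conv)
k_s = int((e > 1e-5).sum())                      # about 3 out of 64

# Adding the identity shifts every eigenvalue up by 1, so the threshold
# keeps all 64 directions and torch.diag never sees a degenerate input.
_, e_reg, _ = torch.svd(conv + torch.eye(64).double())
k_s_reg = int((e_reg > 1e-5).sum())              # == 64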

I don't know why the TensorFlow code has this; you may ask the author.

Closing this now; feel free to reopen.

Hi! I'm the author of the TensorFlow WCT.

I'll second the good catch @NCTUyoung. That's there because I ran into a maddening bug with tf.svd() that NaN-ed the singular values for about 1% of style images I tried. Adding the ones fixes this without seeming to affect the results. The NumPy WCT in the same file does not have this instability and so omits the hack.
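
For reference, the NumPy whitening step looks roughly like this (an illustrative sketch, not the exact code from that file):

import numpy as np

def whiten(cF, eps=1e-5):
    # cF: mean-centered content feature of shape (C, H*W)
    conv = cF @ cF.T / (cF.shape[1] - 1)
    U, S, _ = np.linalg.svd(conv)        # symmetric PSD: S are eigenvalues
    k = int((S > eps).sum())             # drop near-zero directions
    D = np.diag(S[:k] ** -0.5)
    return U[:, :k] @ D @ U[:, :k].T @ cF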

And congrats @sunshineatnoon for the nice PyTorch implementation.

Thank you all! @eridgd @sunshineatnoon
Actually, I am trying single-level stylization just for fun, since I don't have good GPU support.
When it becomes multi-layer, things get complicated and some numerical problems may occur.
Fortunately, I avoided the bugs by using NumPy for the matrix operations.

By the way, computing the covariance matrix first and then taking its SVD increases the numerical error. Directly computing the SVD of X might be a better approach.
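
The reason: if X = U S V^T, then X X^T / (N-1) = U diag(S^2 / (N-1)) U^T, so forming the covariance squares the singular values and with them the condition number. A sketch of the direct route (illustrative, in NumPy):

import numpy as np

def whiten_direct(X, eps=1e-5):
    # X: mean-centered feature of shape (C, N); its singular values are
    # the square roots of the covariance eigenvalues, up to the 1/(N-1).
    U, S, _ = np.linalg.svd(X, full_matrices=False)
    sigma = S / np.sqrt(X.shape[1] - 1)
    k = int((sigma > eps).sum())
    D = np.diag(sigma[:k] ** -1.0)
    return U[:, :k] @ D @ U[:, :k].T @ X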