menyifang/ADGAN

About the perceptual loss

Closed this issue · 1 comment

In your paper, the perceptual loss was:

# vggsubmod refers to a certain layer of VGG.
Lper = L1(gram(vggsubmod(x)), gram(vggsubmod(y)))
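(For context, I assume `gram` here is the usual Gram matrix over channel correlations; a minimal sketch of that helper, assuming VGG features of shape (N, C, H, W):)

import torch

def gram(feat):
    # feat: VGG feature map of shape (N, C, H, W)
    n, c, h, w = feat.size()
    f = feat.view(n, c, h * w)
    # (N, C, C) channel correlation matrix, normalized by the number of elements
    return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)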

However, in your implementation:

fake_p2_norm = self.vgg_submodel(fake_p2_norm)
input_p2_norm = self.vgg_submodel(input_p2_norm)
input_p2_norm_no_grad = input_p2_norm.detach()
if self.percep_is_l1 == 1:
    # use l1 for perceptual loss
    loss_perceptual = F.l1_loss(fake_p2_norm, input_p2_norm_no_grad) * self.lambda_perceptual
else:
    # use l2 for perceptual loss
    loss_perceptual = F.mse_loss(fake_p2_norm, input_p2_norm_no_grad) * self.lambda_perceptual

Could you give some explanation for this? Thanks.


Hi @mazzzystar, thank you for the correction! We directly used the perceptual-loss implementation from PATN and mistakenly described it in the paper as the Gram-based style loss. We will correct this in the arXiv version.
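For reference, a loss matching the formula written in the paper would look roughly like the sketch below. This is illustrative only, reusing the hypothetical `gram` helper from above together with the variables from the quoted snippet; it is not the code the model was trained with.

# Illustrative sketch of the Gram-based loss as written in the paper,
# not the feature-matching loss actually used in training.
fake_feat = self.vgg_submodel(fake_p2_norm)
real_feat = self.vgg_submodel(input_p2_norm).detach()
loss_style = F.l1_loss(gram(fake_feat), gram(real_feat)) * self.lambda_perceptual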