podgorskiy/ALAE

Why does the input require gradients?

vmelement opened this issue · 3 comments

In https://github.com/podgorskiy/ALAE/blob/master/train_alae.py#L293 we are setting the input tensor to require gradients. Why is this? Shouldn't the input data tensor be untouched by optimization?

I'm also confused by this line. Can anyone explain it?

I think it's because of the R1 gradient penalty.
torch.autograd.grad requires the tensors you differentiate with respect to be set to require gradients (requires_grad=True).

In

real_grads = torch.autograd.grad(real_loss, reals, create_graph=True, retain_graph=True)[0]

"reals" are the inputs images which have to have gradients in order for torch.autograd.grad to comoute the gradient of the loss with respect to the real images

@ariel415el is absolutely correct, thank you!

Closing the issue