amirgholami/PyHessian

RuntimeError: derivative for grid_sampler_2d_backward is not implemented

krishnateja95 opened this issue · 1 comment

Thanks for making PyHessian public. I am trying to compute Hessian eigenvalues for a neural network that I'm implementing. I set requires_grad = True on the weight tensors whose eigenvalues I want to compute, but I am getting the following error:

RuntimeError: derivative for grid_sampler_2d_backward is not implemented

I was able to compute first-order gradients without any problem, but I cannot compute the Hessian-vector product Hv, which is done here:

hv = torch.autograd.grad(gradsH,
                         params,
                         grad_outputs=v,
                         only_inputs=True,
                         retain_graph=True)
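
For context, the surrounding computation looks roughly like the sketch below (the tiny nn.Linear model and random batch are placeholders for my actual setup, not PyHessian code). The key point is that gradsH is produced with create_graph=True, so the second torch.autograd.grad call can differentiate through the first backward pass:

import torch
import torch.nn as nn

# Toy stand-ins (illustrative only): a small model and a random batch.
model = nn.Linear(4, 2)
inputs, targets = torch.randn(8, 4), torch.randn(8, 2)

params = [p for p in model.parameters() if p.requires_grad]
loss = nn.functional.mse_loss(model(inputs), targets)

# First backward: create_graph=True keeps the graph so the gradients
# themselves can be differentiated a second time.
gradsH = torch.autograd.grad(loss, params, create_graph=True)

# A random direction v, one tensor per parameter, fed to the hv call above.
v = [torch.randn_like(p) for p in params]
hv = torch.autograd.grad(gradsH, params,
                         grad_outputs=v,
                         only_inputs=True,
                         retain_graph=True)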

Could you let me know what the problem could be?

It seems one of the ops in your model, grid_sampler_2d (the op behind grid_sample), does not support second-order backpropagation: PyTorch has no derivative implemented for grid_sampler_2d_backward, so differentiating through the first backward pass, which is exactly what the Hv computation does, fails with this error.
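
For example, on PyTorch builds where the double backward of grid_sample is not implemented, a second-order call through it reproduces the same error (a minimal, illustrative sketch; the shapes are arbitrary):

import torch
import torch.nn.functional as F

x = torch.randn(1, 1, 4, 4, requires_grad=True)
grid = torch.randn(1, 4, 4, 2, requires_grad=True)

loss = F.grid_sample(x, grid, align_corners=True).sum()

# First backward works: it invokes grid_sampler_2d_backward.
grads = torch.autograd.grad(loss, [x, grid], create_graph=True)

# Second backward needs the derivative of grid_sampler_2d_backward and,
# where it is not implemented, raises:
# RuntimeError: derivative for grid_sampler_2d_backward is not implemented
v = [torch.randn_like(g) for g in grads]
hv = torch.autograd.grad(grads, [x, grid], grad_outputs=v)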