Give a better error message when num_derivatives is too low

At the moment, if you set num_derivatives to something lower than the number of derivatives you actually compute, the backward pass fails with an opaque error like:

----------------------------------------------------------------------
Traceback (most recent call last):
  File "test/test_jit.py", line 539, in test_mini_wlm
    z.sum().backward()
  File "/data/users/ezyang/onnx-pytorch/pytorch/torch/autograd/variable.py", line 158, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/data/users/ezyang/onnx-pytorch/pytorch/torch/autograd/__init__.py", line 98, in backward
    variables, grad_variables, retain_graph)
RuntimeError: vector::_M_range_check
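The vector::_M_range_check message comes straight out of libstdc++'s std::vector::at, so it says nothing about which function or which derivative index was at fault. A minimal sketch of the kind of explicit bounds check that could replace it is below; the function and variable names here are hypothetical illustrations, not the actual PyTorch internals:

```cpp
// Hypothetical sketch only; none of these names come from the PyTorch source.
#include <cstddef>
#include <sstream>
#include <stdexcept>
#include <vector>

double get_derivative(const std::vector<double>& derivatives, std::size_t i,
                      const char* fn_name) {
  // Instead of derivatives.at(i), which throws the opaque
  // "vector::_M_range_check", raise an error that names the function and
  // points at the num_derivatives mismatch.
  if (i >= derivatives.size()) {
    std::ostringstream msg;
    msg << fn_name << ": requested derivative " << i
        << " but only " << derivatives.size() << " were allocated; "
        << "num_derivatives is probably set too low";
    throw std::runtime_error(msg.str());
  }
  return derivatives[i];
}
```

An error along these lines would let the user jump straight to the offending function definition instead of having to guess what a range check deep in the autograd engine means.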