jcjohnson/pytorch-examples

Gradient at node is with respect to what?

abhigenie92 opened this issue · 2 comments

The backward function receives the gradient of the output Tensors with respect to some scalar value.
What is this scalar value? What does it represent in the computational graph?

Each variable has its own grad after calling backward(). So it's the gradient with respect to each variable.

The call loss.backward() computes the gradient of the loss with respect to each node in the graph.
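
To make this concrete, here is a minimal sketch (not from the thread; the tensor names x and w are illustrative): the "scalar value" is the tensor backward() is called on, typically the loss, and after the call each leaf tensor's .grad holds the derivative of that scalar with respect to it.

```python
import torch

# Two leaf tensors that require gradients.
x = torch.randn(3, requires_grad=True)
w = torch.randn(3, requires_grad=True)

# loss is a scalar: the value the gradients are taken with respect to.
loss = (w * x).sum()

# Populates x.grad with d(loss)/dx and w.grad with d(loss)/dw.
loss.backward()

print(x.grad)  # equals w, since d/dx of sum(w * x) is w
print(w.grad)  # equals x
```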