1ytic/warp-rnnt

how to apply backward to the output of rnnt_loss

huangnengCSU opened this issue · 0 comments

Hi:
I want to use your warp-rnnt as the loss function to train my model, but I ran into a problem: I don't know how to run the backward pass. The output of rnnt_loss() is a cost and a grad, both of which are tensors. Can you give an example showing how to do backward? Thanks!
Neng
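
To make the question concrete, here is a minimal sketch of the kind of autograd wrapper I imagine is needed. The import path, the rnnt_loss signature, and the tensor shapes below are my assumptions, not something confirmed by the library; the only thing taken from above is that the call returns a (costs, grads) pair.

```python
import torch
from warp_rnnt import rnnt_loss  # assumed import path for the binding in question


class RNNTLossFunction(torch.autograd.Function):
    """Wrap a kernel that returns (costs, grads) so that .backward() works."""

    @staticmethod
    def forward(ctx, log_probs, labels, frames_lengths, labels_lengths):
        # Assumption: rnnt_loss returns per-utterance costs of shape (N,)
        # and grads = d(costs)/d(log_probs) of shape (N, T, U, V).
        costs, grads = rnnt_loss(log_probs, labels, frames_lengths, labels_lengths)
        ctx.save_for_backward(grads)
        return costs

    @staticmethod
    def backward(ctx, grad_output):
        (grads,) = ctx.saved_tensors
        # Chain rule: scale the precomputed gradients by the incoming
        # gradient of each utterance's cost (shape (N,) -> (N, 1, 1, 1)).
        grad_log_probs = grads * grad_output.view(-1, 1, 1, 1)
        # One return value per forward input; non-differentiable inputs get None.
        return grad_log_probs, None, None, None


# Usage: reduce the per-utterance costs to a scalar and call backward as usual.
# costs = RNNTLossFunction.apply(log_probs, labels, frames_lengths, labels_lengths)
# loss = costs.mean()
# loss.backward()
```

If the returned grad tensor is already the gradient of a reduced (mean or summed) cost rather than of the per-utterance costs, the scaling in backward would have to change accordingly.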