RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
hityzy1122 opened this issue · 3 comments
hityzy1122 commented
Hi, thanks for your code!
I got the error in the title when calling loss.backward(). I fixed it by changing self.save_for_backward(tenIn, tenFlow) to self.save_for_backward(tenIn.clone(), tenFlow.clone()), but I don't know if that is the right fix?
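For context, here is a minimal, self-contained sketch of why cloning in save_for_backward can make this error go away. This is not the actual softmax-splatting kernel; the Scale function and the tenLeaf/tenMid names are purely illustrative. Autograd checks the version counter of every saved tensor when backward unpacks it, so if some later layer modifies the saved tensor in place, backward raises the RuntimeError above. Saving a clone gives the saved copy its own version counter:

```python
import torch

# Illustrative custom autograd Function (not the softmax-splatting code):
# it saves its input for the backward pass, as softmax-splatting does.
class Scale(torch.autograd.Function):
    @staticmethod
    def forward(ctx, tenIn):
        # Saving tenIn.clone() instead of tenIn decouples the saved tensor
        # from any in-place edits of tenIn made later in the network.
        ctx.save_for_backward(tenIn.clone())
        return tenIn * 2.0

    @staticmethod
    def backward(ctx, tenGrad):
        (tenIn,) = ctx.saved_tensors  # version-counter check happens here
        return tenGrad * 2.0

tenLeaf = torch.ones(3, requires_grad=True)
tenMid = tenLeaf * 1.0        # non-leaf intermediate tensor
tenOut = Scale.apply(tenMid)
tenMid.add_(1.0)              # in-place edit after the forward pass
tenOut.sum().backward()       # succeeds; without clone() this would raise
print(tenLeaf.grad)           # tensor([2., 2., 2.])
```

The trade-off is that clone() costs an extra copy per forward pass, so it is usually better to find and remove the in-place operation itself rather than clone defensively.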
sniklaus commented
What is the layer before and the layer after the softmax splatting in your network?
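The surrounding layers matter because the usual culprit is an in-place operation (for example an inplace=True activation) applied to a tensor that an adjacent layer saved for its backward pass. A minimal illustration, with names of my own choosing: sigmoid saves its output for backward, and an in-place ReLU applied afterwards overwrites exactly that saved tensor:

```python
import torch

tenFeat = torch.randn(3, requires_grad=True)
tenMid = torch.sigmoid(tenFeat)  # sigmoid's backward needs its own output
torch.relu_(tenMid)              # in-place ReLU overwrites that saved output

try:
    tenMid.sum().backward()      # unpacking the saved tensor detects the edit
    raised = False
except RuntimeError:
    raised = True
print(raised)  # True: the in-place op corrupted a tensor needed for backward
```

In a case like this, replacing the in-place op with its out-of-place variant (torch.relu instead of torch.relu_) fixes the error without any cloning.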
sniklaus commented
Closing due to inactivity, please feel free to reopen if this issue still persists. Thanks!
hityzy1122 commented
Auto-reply: your email has been received.