Function FusedLeakyReLUFunctionBackward returned an invalid gradient at index 1 - got [6] but expected shape compatible with [512]
visa980911 opened this issue · 0 comments
I really like your work. I've been using your network in a different field, but during training I encountered the following error:
```
Traceback (most recent call last):
  File "scripts/train.py", line 63, in <module>
    main(opts)
  File "scripts/train.py", line 52, in main
    loss.backward()
  File "/root/autodl-tmp/conda/envs/Del/lib/python3.8/site-packages/torch/tensor.py", line 221, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/root/autodl-tmp/conda/envs/Del/lib/python3.8/site-packages/torch/autograd/__init__.py", line 130, in backward
    Variable._execution_engine.run_backward(
RuntimeError: Function FusedLeakyReLUFunctionBackward returned an invalid gradient at index 1 - got [6] but expected shape compatible with [512]
```
I use the same network structure; the forward pass runs fine, but the backward pass fails with this error.
Is there any solution to this problem? Thanks to anyone who can help.
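For context, this class of error comes from autograd's shape check: a custom `Function`'s `backward` must return, for each input, a gradient whose shape matches that input. Here index 1 is the fused op's bias (shape `[512]`), but a gradient of shape `[6]` (likely the batch dimension) was returned instead, which usually means the bias gradient was reduced over the wrong axes. The sketch below is a hypothetical minimal reproduction (not the actual FusedLeakyReLU code) that triggers the same message pattern by summing over the wrong dimension:

```python
import torch

class BadBias(torch.autograd.Function):
    """Adds a per-channel bias, but computes its gradient incorrectly."""

    @staticmethod
    def forward(ctx, input, bias):
        return input + bias.view(1, -1)

    @staticmethod
    def backward(ctx, grad_output):
        # Bug: the bias gradient is summed over dim 1 (channels) instead of
        # dim 0 (batch), so it has shape [6] (the batch size) instead of
        # [512] (the bias size) -- the same mismatch reported in the issue.
        return grad_output, grad_output.sum(dim=1)

x = torch.randn(6, 512, requires_grad=True)   # batch of 6, 512 channels
b = torch.randn(512, requires_grad=True)      # per-channel bias

out = BadBias.apply(x, b)
try:
    out.sum().backward()
except RuntimeError as e:
    # e.g. "... returned an invalid gradient at index 1 - got [6]
    #       but expected shape compatible with [512]"
    print(e)
```

Replacing `grad_output.sum(dim=1)` with `grad_output.sum(dim=0)` makes the shapes consistent, so if you have modified the network's channel widths or the fused op's inputs, checking that the bias tensor handed to `fused_leaky_relu` really has one element per channel of its input is a good first step.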