Vanishing gradients occur during training
Xinghetayue opened this issue · 4 comments
Hi, could you please check whether you were running with the torch version in 'requirement.txt'?
Thanks for the reply. My torch version is 1.7.0, and the version in "environment.yml" is 1.7.1. Could this be the reason for the error?
I am not sure, but you can give it a try.
The fundamental problem is that 'torch.logdet' is not a very numerically stable function.
You may also check whether the other hyperparameters are set to the same values as in the paper.
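In case it helps, here is a minimal sketch (not from this repo) of one common workaround for unstable `torch.logdet` calls: add a small diagonal jitter and compute the log-determinant through a Cholesky factorization. The function name `stable_logdet` and the jitter value are just illustrative assumptions, and this only applies if the matrix is (near) symmetric positive-definite.

```python
import torch

def stable_logdet(mat, jitter=1e-6):
    """Log-determinant of a symmetric positive-definite matrix.

    Adds a small diagonal jitter (value chosen arbitrarily here) and uses
    a Cholesky factorization, since logdet(A) = 2 * sum(log(diag(L))).
    This tends to behave better numerically than calling torch.logdet
    on a near-singular matrix.
    """
    eye = torch.eye(mat.shape[-1], dtype=mat.dtype, device=mat.device)
    chol = torch.cholesky(mat + jitter * eye)  # torch.linalg.cholesky in newer releases
    return 2.0 * torch.log(torch.diagonal(chol, dim1=-2, dim2=-1)).sum(-1)

# Quick sanity check against torch.logdet on a well-conditioned SPD matrix
a = torch.randn(4, 4)
spd = a @ a.t() + 4 * torch.eye(4)
print(stable_logdet(spd), torch.logdet(spd))
```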
OK, I will try.