declare-lab/Multimodal-Infomax

Gradient disappearance occurs during training

Xinghetayue opened this issue · 4 comments

An error occurs when running 'python main.py --dataset mosei --contrast'. It seems that something during training causes NaN values to appear in the gradient computation. I would like to know how to solve this.
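In case it helps narrow this down, here is a minimal, generic sketch (not code from this repository; `report_nonfinite_gradients` is a hypothetical helper name) of how NaN gradients can be located with PyTorch's anomaly detection:

```python
import torch

# Turn on autograd anomaly detection so the backward pass raises an error
# with a traceback pointing at the operation that first produced a NaN.
torch.autograd.set_detect_anomaly(True)

def report_nonfinite_gradients(model):
    # Call this after loss.backward() and before optimizer.step()
    # to see which parameters received NaN/Inf gradients.
    for name, param in model.named_parameters():
        if param.grad is not None and not torch.isfinite(param.grad).all():
            print(f"Non-finite gradient in: {name}")
```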

Hi, could you please check if you were running with the torch version in 'requirement.txt'?


Thanks for the reply. My torch version is 1.7.0, and the version in "environment.yml" is 1.7.1. Could this be the reason for the error?
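For reference, a quick way to confirm which torch version the interpreter is actually importing (generic snippet, not specific to this repo):

```python
import torch

# Should print 1.7.1 if the environment matches environment.yml.
print(torch.__version__)
```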

I am not sure but you can have a try.
The fundamental problem is that 'torch.logdet' is not a very stable function.
You may also check that you set the other hyperparameters to the same values as in the paper.
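For anyone hitting this later: a common workaround (my own sketch, assuming the NaNs come from 'torch.logdet' on a near-singular matrix; not necessarily what this repository does) is to add a small diagonal jitter before taking the log-determinant, e.g. via 'torch.slogdet':

```python
import torch

def stable_logdet(mat, eps=1e-6):
    # Add eps * I so the matrix stays safely positive definite, which
    # helps avoid NaN values/gradients from logdet on near-singular inputs.
    jitter = eps * torch.eye(mat.size(-1), device=mat.device, dtype=mat.dtype)
    sign, logabsdet = torch.slogdet(mat + jitter)
    return logabsdet
```

If the NaNs persist, gradient clipping with 'torch.nn.utils.clip_grad_norm_' before the optimizer step is another common mitigation.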


OK, I will try.