Need help! Loss is NaN
Closed this issue · 3 comments
allen4747 commented
Regarding this line: torch.log(input)
The 'input' is the softmax score (in the range 0-1).
If the k-th class does not appear in an input, the accumulated softmax score over all time steps for the k-th class is very likely to be 0. This then results in torch.log(input) = NaN.
How do you make sure that 'input' is never 0 when computing 'torch.log(input)'?
summerlvsong commented
NaN problem fixed. Please refer to line 34 of 'source/models/seq_module.py'.
allen4747 commented
> NaN problem fixed. Please refer to line 34 of 'source/models/seq_module.py'.

Thanks!
TangDL commented
make sure input > 0
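
For anyone landing here later: the actual fix lives at line 34 of 'source/models/seq_module.py', but the usual remedy for this class of NaN is to clamp the value away from zero before taking the log. A minimal framework-agnostic sketch (the names `safe_log` and `EPS` are illustrative, not from the repo, and the epsilon value is an assumption):

```python
import math

EPS = 1e-8  # illustrative epsilon; the repo may use a different value


def safe_log(x, eps=EPS):
    # Clamp x to at least eps so log never receives 0.
    # log(0) is -inf, which turns into NaN during backprop.
    return math.log(max(x, eps))


# In PyTorch the same idea is typically written as:
#   torch.log(torch.clamp(input, min=eps))
# or equivalently:
#   torch.log(input + eps)

print(safe_log(0.0))  # finite instead of -inf
print(safe_log(0.5))
```

Either variant guarantees the argument of the log is strictly positive, which is exactly the "make sure input > 0" condition above.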