summerlvsong/Aggregation-Cross-Entropy

Need Help! Loss nan

Closed this issue · 3 comments

For this line: torch.log(input)

The 'input' is the softmax score (a value in [0, 1]).
If the k-th class does not appear in an input, the aggregated softmax score over all time steps for that class is very likely to be 0. torch.log(0) is -inf, which then produces NaN in the loss.

How do you make sure that 'input' is never 0 when calling torch.log(input)?

The NaN problem is fixed. Please refer to line 34 of 'source/models/seq_module.py'.
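The fixed line itself is not quoted in the thread, so below is a minimal sketch of the usual guard, assuming the aggregated scores are clamped away from zero (or given a small epsilon) before the log. The function name ace_loss, the epsilon value, and the tensor shapes are illustrative assumptions, not the repository's exact code:

    import torch

    EPS = 1e-10  # illustrative epsilon; the actual value on line 34 may differ

    def ace_loss(probs, counts):
        """Sketch of an Aggregation Cross-Entropy style loss with a log(0) guard.

        probs:  (T, B, C) per-time-step softmax scores
        counts: (B, C) per-class character counts N_k for each sample
        """
        T = probs.size(0)
        # Aggregate the softmax scores over all T time steps for each class.
        # This sum can be exactly 0 for classes absent from the input.
        agg = probs.sum(dim=0) / T            # (B, C)
        targets = counts / T                  # normalize the counts the same way
        # Guard: keep the aggregated scores strictly positive so log() stays finite.
        agg = torch.clamp(agg, min=EPS)
        return (-targets * torch.log(agg)).sum(dim=1).mean()

A one-line alternative with the same effect is torch.log(input + EPS).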

Thanks!

Make sure input > 0, e.g. by clamping it or adding a small epsilon before taking the log.