CuthbertCai/pytorch_DANN

Evaluation does not suppress dropout, which will affect performance

Opened this issue · 0 comments

Hi there, thanks for the great repo! I'd like to point out that dropout should be initialized as a module in `__init__` of `Class_classifier`, so that it is disabled when the model is put into eval mode. The current implementation calls the functional dropout inline, and `F.dropout` defaults to `training=True` regardless of the model's mode:

logits = self.fc2(F.dropout(logits))

By adding `self.dropout = nn.Dropout()` in the initializer and replacing that line with

logits = self.dropout(logits)
logits = self.fc2(logits)

I was able to obtain a 2~3% performance gain at evaluation time using the same model checkpoint and inputs, since dropout is no longer zeroing activations during inference.
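The difference can be seen in a minimal standalone sketch (not taken from this repo): `F.dropout` keeps dropping units unless `training=False` is passed explicitly, while an `nn.Dropout` module becomes a no-op once `.eval()` is called.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.ones(1000)

# Functional form: training defaults to True, so units are zeroed
# (and the rest rescaled) even if the caller intended evaluation.
out_functional = F.dropout(x)          # contains zeros

# Module form: respects train/eval mode of the module.
drop = nn.Dropout()
drop.eval()
out_module = drop(x)                   # identity in eval mode

print((out_functional == 0).any().item())
print(torch.equal(out_module, x))
```

Passing `training=self.training` to `F.dropout` would also fix this, but the module form picks up the mode automatically from `model.eval()` / `model.train()`.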
Could you help me verify if my understanding is correct?