Dropout on output?
Borororo opened this issue · 3 comments
Thanks for your nice work!
I have a question about dropout on the output layer. In the paper, you said you used dropout of 0.4 on both the embedding layer and the output layer.
However, after reading your code carefully, I find that:
In train.py, you define `dropout_rate = 0` and `dropout_output = dropout_rnn_output = 0.4`.
However, in layers.py, you only use `dropout_output` as a condition and pass `dropout_rate = 0` as the actual dropout probability.
For example:
```python
if self.dropout_output and self.dropout_rate > 0:
    output = F.dropout(output, p=self.dropout_rate, training=self.training)
```
I think this code will not apply dropout to the output, since `dropout_rate` is zero, right?
Did I miss anything?
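To make the effect concrete, here is a minimal sketch of the behavior described above (the variable names mirror the snippet; the values `0.4` and `0` are taken from the description of train.py). With `p=0`, `F.dropout` leaves the tensor untouched, while the presumably intended behavior would use the 0.4 rate:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.ones(4, 8)

# Guard as described in the issue: dropout_rate is 0, so even though the
# branch condition mentions dropout_output, no dropout is ever applied.
dropout_output, dropout_rate = 0.4, 0.0
if dropout_output and dropout_rate > 0:
    x = F.dropout(x, p=dropout_rate, training=True)
print(torch.equal(x, torch.ones(4, 8)))  # True: output passes through unchanged

# Presumably intended behavior: drop units with p=0.4. Surviving units are
# rescaled by 1/(1-p) during training, so values are either 0 or 1/0.6.
y = F.dropout(torch.ones(4, 8), p=dropout_output, training=True)
print(sorted(y.unique().tolist()))
```

This is only an illustration of the gating bug, not the repository's actual fix.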
Oops, you are correct! This is an unintended bug; it is fixed in 371b651.
I am rerunning the experiments. Hopefully, the results will become a little bit better.
Thank you!
I want to let you know that the accuracy on the development set stays at ~83% after rerunning the experiments.
Thank you again.
Thanks for the reply.