YerevaNN/R-NET-in-Keras

sigmoid is missing in QuestionAttnGRU.py

keithhans opened this issue · 5 comments

According to equation 6 in the paper, there should be a sigmoid on K.dot(GRU_inputs, W_g1).
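For context, equation 6 of the R-NET paper defines an input gate g_t = sigmoid(W_g [u_t, c_t]) that is multiplied elementwise into the concatenated GRU input before it enters the recurrent cell. A minimal NumPy sketch of what the fixed line should compute (shapes and the random data are illustrative assumptions, not taken from the repo):

```python
import numpy as np

def sigmoid(x):
    # numerically plain logistic function
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Hypothetical shapes: batch of 2, concatenated input [u_t, c_t] of size 8
GRU_inputs = rng.standard_normal((2, 8))
W_g1 = rng.standard_normal((8, 8))

# Buggy version: gate is a bare linear map, K.dot(GRU_inputs, W_g1)
linear_gate = GRU_inputs @ W_g1

# Fixed version (eq. 6): g_t = sigmoid(W_g [u_t, c_t])
g = sigmoid(GRU_inputs @ W_g1)

# The gate scales each input dimension into (0, 1) before the GRU sees it
gated_inputs = g * GRU_inputs
```

Without the sigmoid, the "gate" is an unbounded linear transform rather than a soft mask in (0, 1), which changes the scale of the recurrent input.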

Completely agree with you. Will be fixed soon.

Just noticed it is also missing in SelfAttnGRU.py

We also noticed that.
Sigmoid disappeared after we switched from using Dense layers (with sigmoid activation) to SharedWeight layers.
Now I'm trying to repeat the training process. We hope to get better scores after fixing this issue.
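The regression described above is easy to reproduce in miniature: Keras's Dense layer applies its activation for you, while a shared weight matrix used via a backend dot product is just the pre-activation, so the sigmoid has to be reapplied by hand. A NumPy sketch of the two cases (shapes and data are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
x = rng.standard_normal((4, 6))
W = rng.standard_normal((6, 6))
b = np.zeros(6)

# What Dense(6, activation='sigmoid') computes: activation is built in
dense_out = sigmoid(x @ W + b)

# What a SharedWeight layer provides: only W, so K.dot(x, W)
# yields the pre-activation and the sigmoid must be added explicitly
shared_pre_activation = x @ W
shared_out = sigmoid(shared_pre_activation)
```

With the same weights, the two paths agree only once the sigmoid is restored on the shared-weight branch.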

Pull request is merged.
Not closing the issue until we compare new scores.
@MartinXPN please rerun the instruction steps from README.md

After repeating all the steps in the README, the performance didn't change much. Accuracy reached about 60% (a bit less).