Possible bug - parameter updating
dumitrescustefan opened this issue · 1 comment
dumitrescustefan commented
It's possible that, due to changes in DyNet, the default for the `update` argument of `.expr()` is now `False`, which freezes the affected parameters so no learning is performed on them. The correction would be to set `.expr(update=True)` explicitly.
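A minimal sketch of the proposed fix, assuming a plain DyNet `ParameterCollection` (the `model` and `W` names are hypothetical, for illustration only):

```python
import dynet as dy

model = dy.ParameterCollection()
W = model.add_parameters((8, 8))

# Before: relying on the default may silently freeze the parameter
# if the default has become update=False.
# w = W.expr()

# After: request gradient updates explicitly so the trainer learns W.
w = W.expr(update=True)
```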
tiberiu44 commented
This is confirmed. I just checked on a small training set; accuracy increased by 4%.