adobe/NLP-Cube

Possible bug - parameter updating

dumitrescustefan opened this issue · 1 comment

It's possible that, due to changes in DyNet, the default for the `update` parameter of `.expr()` is now `update=False`, which freezes the parameters so no learning is performed on them. The fix would be to call `.expr(update=True)` explicitly.

This is confirmed. I just checked on a small training set: accuracy increased by 4%.