About Attention
Opened this issue · 3 comments
qinya commented
What should I do if I want to add an attention layer before the CRF layer?
ningshixian commented
What do you specifically mean?
qinya commented
I want to connect a CRF layer after LSTM+Attention for NER. I tried but failed; I added the CRF when computing the loss.
ningshixian commented
You can do it like this:
output = Bidirectional(LSTM(units, return_sequences=True))(inputs)
output = AttentionDecoder()(output)
output = TimeDistributed(Dense(num_class))(output)
output = CRF(num_class)(output)
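For reference, here is a minimal runnable sketch of that BiLSTM → attention → per-token classifier wiring, with some substitutions: `AttentionDecoder` is a third-party layer, so the built-in self-attention layer `tf.keras.layers.Attention` stands in for it, and a per-token softmax stands in for the CRF output (keras-contrib's `CRF` layer would replace it in the real model). All sizes (`vocab_size`, `maxlen`, `num_class`, unit counts) are illustrative assumptions, not values from this thread.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

vocab_size, maxlen, num_class = 1000, 20, 5  # hypothetical sizes

inputs = layers.Input(shape=(maxlen,))
x = layers.Embedding(vocab_size, 64, mask_zero=True)(inputs)
# return_sequences=True keeps one vector per token for tagging.
x = layers.Bidirectional(layers.LSTM(32, return_sequences=True))(x)
# Self-attention over the BiLSTM outputs (query = value = x);
# a third-party AttentionDecoder layer would go here instead.
x = layers.Attention()([x, x])
# Per-token class scores; swap the softmax for a CRF layer
# (e.g. keras-contrib's CRF) to get the model from this thread.
x = layers.TimeDistributed(layers.Dense(num_class, activation="softmax"))(x)
model = Model(inputs, x)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

dummy = np.random.randint(1, vocab_size, size=(2, maxlen))
print(model.predict(dummy, verbose=0).shape)  # (2, 20, 5)
```

The key point is that every layer up to the output must return a sequence (one vector per token), so the CRF (or softmax) can score each position.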