seq2seq(attention) has a wrong comment
nomorecoke opened this issue · 0 comments
nomorecoke commented
```python
context = tf.matmul(attn_weights, enc_outputs)
dec_output = tf.squeeze(dec_output, 0)  # [1, n_step]
context = tf.squeeze(context, 1)  # [1, n_hidden]
```
I think the shape of dec_output here is [1, n_hidden], not [1, n_step], so the comment on the tf.squeeze(dec_output, 0) line should be corrected.
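A minimal NumPy sketch of the shapes involved (toy sizes are assumptions: n_step = 5, n_hidden = 8, batch = 1; the squeeze axes match the quoted code). It shows that squeezing axis 0 of the decoder RNN output yields [1, n_hidden], supporting the correction above:

```python
import numpy as np

n_step = 5    # toy number of encoder time steps (assumption)
n_hidden = 8  # toy hidden size (assumption)

# dec_output from the decoder RNN for one step: [seq_len=1, batch=1, n_hidden]
dec_output = np.zeros((1, 1, n_hidden))
# attn_weights: [1, 1, n_step]; enc_outputs: [batch=1, n_step, n_hidden]
attn_weights = np.ones((1, 1, n_step)) / n_step
enc_outputs = np.random.rand(1, n_step, n_hidden)

context = attn_weights @ enc_outputs    # [1, 1, n_hidden]
dec_output = np.squeeze(dec_output, 0)  # [1, n_hidden], not [1, n_step]
context = np.squeeze(context, 1)        # [1, n_hidden]

print(dec_output.shape)  # (1, 8)
print(context.shape)     # (1, 8)
```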