graykode/nlp-tutorial

TextCNN_Torch has a wrong comment

jnakor opened this issue · 3 comments

def forward(self, X):
    embedded_chars = self.W[X] # [batch_size, sequence_length, sequence_length]

I think the shape is [batch_size, sequence_length, embedding_size]
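The reported shape is right: `self.W[X]` is a fancy-indexing embedding lookup, so each token index is replaced by its embedding row. A minimal NumPy sketch (with made-up illustrative sizes) shows the resulting shape:

```python
import numpy as np

vocab_size, embedding_size = 10, 4     # illustrative sizes, not the repo's
batch_size, sequence_length = 2, 3

W = np.random.randn(vocab_size, embedding_size)   # embedding matrix, like self.W
X = np.array([[1, 5, 7],
              [0, 2, 9]])                         # token indices, [batch_size, sequence_length]

embedded_chars = W[X]   # each index is replaced by its embedding_size-dim row
print(embedded_chars.shape)  # (2, 3, 4) i.e. [batch_size, sequence_length, embedding_size]
```

So the trailing dimension is `embedding_size`, not `sequence_length` as the comment claims.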

Yes, I think so.

Filed PR #49

Can somebody tell me why we need three conv layers to convolve the word embedding matrix? I don't understand.
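In a TextCNN, each convolution branch with window width n acts as an n-gram feature detector over the embedded sentence; running several branches in parallel (not stacked) lets the model capture patterns at multiple scales, and their max-pooled outputs are concatenated into one sentence vector for the classifier. A hedged NumPy sketch of this idea, using illustrative window sizes 2/3/4 and one filter per branch (the actual repo code may use other sizes and more filters):

```python
import numpy as np

embedding_size, sequence_length = 4, 7
embedded = np.random.randn(sequence_length, embedding_size)  # one embedded sentence

def conv_and_pool(x, window, rng):
    # one "filter": weighted sum over each span of `window` consecutive words
    w = rng.standard_normal((window, x.shape[1]))            # filter weights
    feats = np.array([np.sum(x[i:i + window] * w)            # slide over the sentence
                      for i in range(x.shape[0] - window + 1)])
    return feats.max()                                       # max-over-time pooling

rng = np.random.default_rng(0)
# three parallel branches -> bigram / trigram / 4-gram detectors
pooled = [conv_and_pool(embedded, n, rng) for n in (2, 3, 4)]
sentence_vector = np.array(pooled)   # concatenated features fed to the classifier
print(sentence_vector.shape)  # (3,)
```

With three branches and one filter each, the sentence collapses to a 3-dimensional feature vector; in practice each branch has many filters, so the concatenated vector is `num_branches * num_filters` wide.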