TextCNN_Torch has a wrong comment
jnakor opened this issue · 3 comments
jnakor commented
def forward(self, X):
    embedded_chars = self.W[X]  # [batch_size, sequence_length, sequence_length]
I think the shape is [batch_size, sequence_length, embedding_size].
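A quick standalone check (with made-up sizes, not the repo's actual hyperparameters) confirms that indexing the embedding table this way yields [batch_size, sequence_length, embedding_size]:

import torch
import torch.nn as nn

# Hypothetical sizes, for illustration only.
vocab_size, embedding_size = 100, 8
batch_size, sequence_length = 4, 10

W = nn.Parameter(torch.randn(vocab_size, embedding_size))
X = torch.randint(0, vocab_size, (batch_size, sequence_length))

embedded_chars = W[X]  # advanced indexing: one embedding row per token id
print(embedded_chars.shape)  # torch.Size([4, 10, 8]) -> [batch_size, sequence_length, embedding_size]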
endeavor11 commented
Yes, I think so.
Yuhuishishishi commented
Filed PR #49
AgaigetS commented
Can somebody tell me why we need three conv layers to convolve the word embedding matrix? I don't understand.
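(For context: in the standard TextCNN architecture from Kim 2014, the three conv layers are not stacked but run in parallel, each with a different kernel height, so each branch detects n-grams of a different width; the max-pooled outputs are then concatenated. A rough sketch below illustrates the idea; the filter sizes and counts here are made up and are not taken from this repo's code.)

import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative hyperparameters (assumptions, not the repo's values).
embedding_size, num_filters, filter_sizes = 8, 3, [2, 3, 4]

# One Conv2d per filter size: each branch slides a window of a
# different n-gram width over the embedded sentence.
convs = nn.ModuleList(
    nn.Conv2d(1, num_filters, (fs, embedding_size)) for fs in filter_sizes
)

x = torch.randn(4, 10, embedding_size)      # [batch, seq_len, emb]
x = x.unsqueeze(1)                          # [batch, 1, seq_len, emb]

pooled = []
for conv in convs:
    h = F.relu(conv(x))                     # [batch, num_filters, seq_len - fs + 1, 1]
    h = F.max_pool2d(h, (h.size(2), 1))     # max over time -> [batch, num_filters, 1, 1]
    pooled.append(h.squeeze(3).squeeze(2))  # [batch, num_filters]

features = torch.cat(pooled, dim=1)         # [batch, num_filters * len(filter_sizes)]
print(features.shape)                       # torch.Size([4, 9])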