A question about seq2seq-torch.py at line 43
MowanHu opened this issue · 4 comments
Hi, I'm an NLP rookie and I'd like to ask you a question. Your code extracts the input (context) words within a fixed window around line 43, but "word sequence" is a list of sentences, so some words may pick up neighbour words from a different sentence. Does this hurt the result?
Also, my training results don't look very good, even though I didn't change the code.
If you see this issue, please answer in your free time.
Although my English is poor, I still want to express my gratitude to you.
Hello. It would be more helpful for me if you added a code line link such as https://github.com/graykode/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.py#L10
Sorry, I wrote the wrong title. The question is about https://github.com/graykode/nlp-tutorial/blob/master/1-2.Word2Vec/Word2Vec-Skipgram-Torch(Softmax).py#L44
"word sequence" is a list of sentences, so some words may extract their neighbour words from a different sentence. Does this hurt the result?
In my view, the skip-gram window should operate within a single sentence, not across sentences.
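To illustrate the difference, here is a minimal sketch (not the repo's code; the function names and toy sentences are made up): one version flattens all sentences into a single word sequence before windowing, as the tutorial does, while the other extracts (center, context) pairs per sentence so no pair crosses a sentence boundary.

```python
def skipgram_pairs_flat(sentences, window=1):
    """Flatten sentences into one word sequence first,
    so windows can cross sentence boundaries."""
    words = [w for s in sentences for w in s.split()]
    pairs = []
    for i, center in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if j != i:
                pairs.append((center, words[j]))
    return pairs

def skipgram_pairs_per_sentence(sentences, window=1):
    """Extract pairs within each sentence only,
    so no pair crosses a sentence boundary."""
    pairs = []
    for s in sentences:
        words = s.split()
        for i, center in enumerate(words):
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if j != i:
                    pairs.append((center, words[j]))
    return pairs

sentences = ["i like dog", "i hate coffee"]
flat = skipgram_pairs_flat(sentences)
per_sent = skipgram_pairs_per_sentence(sentences)
# The flat version contains cross-sentence pairs such as ("dog", "i"),
# which the per-sentence version excludes.
print(set(flat) - set(per_sent))
```

On a toy corpus like this the cross-sentence pairs are a small fraction of the data, which may be why the tutorial ignores the issue; on real corpora with many short sentences, filtering them out per sentence is the safer choice.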
Yes, you are right, but I don't care about that in this example.
Thank you, I understand what you mean.