how does pytorch pad sentences?
StephennFernandes opened this issue · 3 comments
hey, do you happen to know how pytorch pads sentences?
I used your implementation and tried to decode the processed input using vocab.itos, and found that padding tokens appear to be placed randomly between words in a given sentence.
When I used Keras for padding, there was an argument to specify where the padding should go, at the start or at the end.
I don't actually know what the impact of padding randomly between words in a sentence would be, but I feel it should certainly go at the end.
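For context, the pre/post distinction described above can be sketched in plain Python (this is an illustration of the concept, not code from either library; the `pad` helper below is hypothetical):

```python
def pad(seq, length, value=0, padding="post"):
    """Pad a single sequence to `length` with `value`.

    padding="post" appends the fill values (padding at the end);
    padding="pre" prepends them (padding at the start).
    """
    fill = [value] * (length - len(seq))
    return seq + fill if padding == "post" else fill + seq

# → pad([4, 5], 4) gives [4, 5, 0, 0]
# → pad([4, 5], 4, padding="pre") gives [0, 0, 4, 5]
```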
Do let me know more about this.
Can I keep torch==1.7.1 and only upgrade torchtext to 0.9?
Yes, you can; I have been using torch 1.7 with torchtext 0.9 and everything works fine for me.
How are you currently padding your sequences?
The best method I have found is to keep your sequences as a list of tensors and then pad them with torch.nn.utils.rnn.pad_sequence. This always pads at the end of each sequence, using the padding_value argument (which defaults to 0).
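A minimal sketch of that approach (the example sequences are made up; `batch_first=True` makes the output shape `(batch, max_len)`):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# A batch of variable-length sequences as a list of 1-D tensors
seqs = [torch.tensor([1, 2, 3]),
        torch.tensor([4, 5]),
        torch.tensor([6])]

# pad_sequence always pads at the END of each sequence
padded = pad_sequence(seqs, batch_first=True, padding_value=0)
# tensor([[1, 2, 3],
#         [4, 5, 0],
#         [6, 0, 0]])
```

Note that padding goes after the real tokens, never between them, so if you see pad tokens scattered inside a decoded sentence, the cause is likely elsewhere (e.g. how the vocabulary maps indices back to tokens).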