graykode/nlp-tutorial

A question about transformer

goodnessSZW opened this issue · 4 comments

A question about transformer

Hey buddy, I've read the code of the Transformer. That's cool. Here's something I can't understand about the input of the decoder. It is acceptable that we use 'S i want a beer' as the decoder_input during training. However, at test time, the decoder input should start with only 'S'; the prediction produced for 'S' after passing through the decoder is then fed back as the next decoder input, instead of feeding the whole translated sentence at once. The translated sentence cannot be used in any part of the model at test/prediction time, except for the final comparison against the prediction.
That's what I understand, but I have no idea whether I'm right or wrong, since I've seen that the parameters of the forward function of class Transformer include 'dec_inputs'. If I'm right, it would be better to create a separate function for predicting the translated sentences. What do you think?
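
To make concrete what I mean by feeding predictions back in at test time, here is a minimal sketch (the `greedy_decode` helper is hypothetical, not the repo's exact function, and I'm assuming the model's forward returns logits of shape `[batch, tgt_len, vocab]`):

```python
import torch

def greedy_decode(model, enc_inputs, start_symbol, max_len=10):
    # Hypothetical test-time helper: start the decoder input with only 'S'
    # (start_symbol) and append the argmax prediction of each step as the
    # next decoder input, instead of feeding the whole target sentence.
    dec_inputs = torch.full((1, 1), start_symbol, dtype=torch.long)
    for _ in range(max_len):
        logits = model(enc_inputs, dec_inputs)  # assumed [1, tgt_len, vocab]
        next_token = logits[:, -1, :].argmax(dim=-1, keepdim=True)
        dec_inputs = torch.cat([dec_inputs, next_token], dim=1)
    return dec_inputs
```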

Did you mean the role of a Best-First-Search decoder?

Yep, just like what you coded in Transformer(Greedy_decoder). I ignored it... haha... embarrassed. So, another question: in a real project, is the greedy decoder used for both training and test, or just for test?

Please search for the difference between teacher forcing and non-teacher forcing. It'll help you.

Whether you decode greedily or not does not matter during training, but using teacher forcing makes the model converge faster.
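
Roughly, as a minimal sketch of the two training regimes (the helper names and the `model`/`criterion` objects are placeholders, assuming a model whose forward returns logits of shape `[batch, tgt_len, vocab]`):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

def step_teacher_forcing(model, enc_inputs, dec_inputs, targets):
    # Teacher forcing: feed the ground-truth shifted targets
    # ('S i want a beer') into the decoder in one parallel pass.
    logits = model(enc_inputs, dec_inputs)  # [batch, tgt_len, vocab]
    return criterion(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))

def step_free_running(model, enc_inputs, start_symbol, targets):
    # Non-teacher forcing: feed the model's own greedy predictions back in,
    # one token at a time; this also trains, but converges more slowly.
    dec_inputs = torch.full((targets.size(0), 1), start_symbol, dtype=torch.long)
    loss = 0.0
    for t in range(targets.size(1)):
        logits = model(enc_inputs, dec_inputs)
        step_logits = logits[:, -1, :]  # prediction for step t
        loss = loss + criterion(step_logits, targets[:, t])
        next_token = step_logits.argmax(dim=-1, keepdim=True)
        dec_inputs = torch.cat([dec_inputs, next_token], dim=1)
    return loss / targets.size(1)
```

Either way, at test time you have to decode autoregressively as in the greedy sketch above, since the reference sentence is not available.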