Some modules appear to be missing
zhhao1 opened this issue · 3 comments
zhhao1 commented
Another question: the order of the modules you save is different from the order in the checkpoint loaded with torch.load. l1 is the first layer, context stands for the masked multi-head attention, and ln stands for layer norm. Is that what you mean?
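A side note on the ordering question above: a PyTorch state dict is an ordered mapping whose key order reflects the order in which submodules were registered, not any semantic layer order, so two checkpoints can hold identical parameters under differently ordered keys. A minimal sketch (the key names below are hypothetical, mirroring the l1/context/ln prefixes mentioned above):

```python
from collections import OrderedDict

# A state dict is an ordered mapping: key order follows the order in which
# submodules/parameters were registered when the model was built.
# Hypothetical keys mirroring the Marian-style prefixes discussed above.
sd_a = OrderedDict([("l1.weight", 1), ("context.weight", 2), ("ln.weight", 3)])
sd_b = OrderedDict([("ln.weight", 3), ("l1.weight", 1), ("context.weight", 2)])

# Same contents, different key order: the mappings still compare equal,
# so a differing key order alone does not mean modules are missing.
print(list(sd_a) == list(sd_b))  # False: ordering differs
print(dict(sd_a) == dict(sd_b))  # True: contents match
```

So when comparing a converted checkpoint against the original, compare key sets and tensor values rather than relying on key order.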
jorgtied commented
Those models are created by MarianNMT, so you may need to ask for support through their support channels, or maybe I am misunderstanding your question. This is not a native torch package, if that is what is causing problems for you.
zhhao1 commented
> Those models are created by MarianNMT, so you may need to ask for support through their support channels, or maybe I am misunderstanding your question. This is not a native torch package, if that is what is causing problems for you.
Thanks a lot.