New pre-trained model
henrycjh opened this issue · 3 comments
Hi, thanks for updating. Is the new pre-trained model named `osx_l_wo_decoder.pth.tar` fine-tuned on UBody? Why is it smaller than the original one?
Hi, it's the pretrained model for the setting `--encoder_setting osx_l --decoder_setting --wo_decoder`. We added it because it's more efficient, and we'll add more pretrained models in the future for a better trade-off between efficiency and accuracy.
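For readers unfamiliar with the flags, the setting above can be mirrored with a minimal argparse stub. This is a hypothetical sketch, not the repo's actual parser; the exact option names and value spellings in demo.py may differ (the thread writes the value as `--wo_decoder`, while a plain `wo_decoder` value is assumed here so argparse can parse it):

```python
import argparse

# Hypothetical stand-in for the relevant demo.py options.
def build_parser():
    parser = argparse.ArgumentParser()
    parser.add_argument("--encoder_setting", default="osx_l")
    parser.add_argument("--decoder_setting", default="normal")
    return parser

# Parse the flag combination mentioned in the reply above.
args = build_parser().parse_args(
    ["--encoder_setting", "osx_l", "--decoder_setting", "wo_decoder"])
print(args.encoder_setting, args.decoder_setting)
```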
Thanks for your quick reply!
Hi, is this model much worse than the original `osx_l.pth.tar`? Or is it not compatible with demo.py? I tried to change the setting in demo.py to use this `osx_l_wo_decoder.pth.tar`: I only changed the decoder setting from `normal` to `--decoder_setting --wo_decoder`, but the results are completely wrong with the wo_decoder model. Just like below:
The front one uses `osx_l_wo_decoder.pth.tar` and the back one uses the original `osx_l.pth.tar`. Am I doing something wrong, or is this expected? It is not a one-off: the results for the whole sequence all look like this, completely failed.
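One way to narrow this down (a sketch, not the repo's actual code): compare the keys in the checkpoint against the keys the model built by demo.py expects. If the wo_decoder checkpoint lacks the decoder weights while demo.py still constructs the decoder, the decoder stays randomly initialized, which could explain completely wrong outputs. Plain sets stand in for the real `torch` state dicts here, and the key names are hypothetical:

```python
# Hypothetical key names, purely to illustrate the check.
full_model_keys = {
    "encoder.blocks.0.attn.qkv.weight",
    "decoder.blocks.0.attn.qkv.weight",
}
wo_decoder_ckpt_keys = {
    "encoder.blocks.0.attn.qkv.weight",
}

def missing_keys(model_keys, ckpt_keys):
    """Keys the model expects but the checkpoint lacks -- roughly what
    torch's load_state_dict(strict=True) would report as missing."""
    return sorted(set(model_keys) - set(ckpt_keys))

print(missing_keys(full_model_keys, wo_decoder_ckpt_keys))
# -> ['decoder.blocks.0.attn.qkv.weight']
# If decoder.* keys show up as missing, demo.py is building the full
# model but the wo_decoder checkpoint cannot populate the decoder.
```

If that is the cause, the fix would be making demo.py build the model with the matching wo_decoder architecture, not just swapping the checkpoint file.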