load state dict error
valencebond opened this issue · 3 comments
When running the stage 1 training experiments, the state dict of the self.G model is inconsistent with the pretrained model at 'http://vllab.ucmerced.edu/ytsai/CVPR18/DeepLab_resnet_pretrained_init-f81d91e8.pth'. For example, self.G has the keys 'layer5.conv2d_list.0.0.weight' and 'layer6.bottleneck.0.se.0.weight', whereas the pretrained model has the key 'layer5.conv2d_list.0.weight' and contains no module matching 'layer6.bottleneck.0.se.0.weight'. Should I set strict=False in load_state_dict()?
Besides, since the keys of the pretrained weights carry a "Scale" prefix, the code needs to be changed to strip "Scale" from each key. I am curious: has no one else run into this problem?
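In case it helps, this is roughly the workaround I am trying; a minimal sketch, assuming the checkpoint keys only need the "Scale." prefix stripped and that modules absent from the checkpoint (e.g. the layer6 SE blocks) can stay randomly initialized. The load_pretrained helper is just my own wrapper, not part of this repo:

```python
import torch

def load_pretrained(model, ckpt_path):
    """Load a checkpoint whose keys may carry a 'Scale.' prefix,
    keeping only parameters the current model actually has."""
    saved_state_dict = torch.load(ckpt_path, map_location='cpu')
    model_state = model.state_dict()
    filtered = {}
    for k, v in saved_state_dict.items():
        # Strip the 'Scale.' prefix used by the UC Merced checkpoint.
        name = k[len('Scale.'):] if k.startswith('Scale.') else k
        # Skip keys the model lacks or whose shapes do not match.
        if name in model_state and model_state[name].shape == v.shape:
            filtered[name] = v
    # strict=False tolerates the model parameters that were filtered out,
    # e.g. the layer6 SE modules, which remain at their init values.
    model.load_state_dict(filtered, strict=False)
```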
I have not met such a problem before.
It seems the model from UC Merced has changed. @valencebond
Could you try https://github.com/wasidennis/AdaptSegNet and check the model weights?
Actually, I modified this repo from AdaptSegNet (published by UC Merced), and the weights also come from the UC Merced vllab server.
Sorry, it was my fault. Due to network issues, I downloaded the file over HTTP manually and saved it locally, so the if-else logic in the loading code takes a different branch.
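For anyone hitting the same thing: the restore logic follows the usual AdaptSegNet pattern, roughly like the sketch below (restore_from is an assumed parameter name). Passing a locally saved copy means the path no longer starts with 'http', so the else branch runs instead:

```python
import torch
from torch.utils import model_zoo

def restore_weights(restore_from):
    # A URL goes through model_zoo, which downloads and caches the file.
    if restore_from[:4] == 'http':
        saved_state_dict = model_zoo.load_url(restore_from)
    # A local path is loaded directly, so a manually downloaded copy
    # ends up in this branch.
    else:
        saved_state_dict = torch.load(restore_from, map_location='cpu')
    return saved_state_dict
```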