Pretrained weights
guozhi72 commented
I tried to compare swin_base_patch4_window7_224.pth with the synapse_pretrain.model mentioned on your website, but I could not find the correspondence between their parameters.
For example,
model_down.layers.0.blocks.0.attn.proj.weight with shape torch.Size([192, 192]) exists in synapse_pretrain.model, but I could not find a matching parameter in any Swin Transformer checkpoint.
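For reference, this is roughly how I compared the two files. It is a minimal sketch: the file names are the ones above, and whether each file stores a raw state_dict or wraps it under a key such as "model" or "state_dict" is an assumption, so the unwrapping step may need adjusting.

```python
import torch

# Load both checkpoints on CPU (paths are the files mentioned above).
swin_ckpt = torch.load("swin_base_patch4_window7_224.pth", map_location="cpu")
syn_ckpt = torch.load("synapse_pretrain.model", map_location="cpu")

# Official Swin checkpoints usually wrap the weights under a "model" key;
# the synapse file may be a raw state_dict or use "state_dict" -- unwrap if needed.
swin_state = swin_ckpt.get("model", swin_ckpt)
syn_state = syn_ckpt.get("state_dict", syn_ckpt)

# Parameter names that appear in one checkpoint but not the other.
only_in_syn = sorted(set(syn_state) - set(swin_state))
only_in_swin = sorted(set(swin_state) - set(syn_state))

print(f"{len(only_in_syn)} keys only in synapse_pretrain.model, e.g.:")
for k in only_in_syn[:10]:
    shape = tuple(syn_state[k].shape) if hasattr(syn_state[k], "shape") else None
    print(" ", k, shape)

print(f"{len(only_in_swin)} keys only in swin_base_patch4_window7_224.pth")
```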
Can you tell me where you got the pretrained Swin Transformer? A more detailed explanation would be highly appreciated. (I understand that you take the attn and fcn weights from the Swin Transformer and assign them to your pretrained model.)