Problem with transfer learning on CIFAR-100
MihaoYoung opened this issue
==> Preparing data..
Files already downloaded and verified
Files already downloaded and verified
learning rate:0.05, weight decay: 0.0005
==> Building model..
adopt performer encoder for tokens-to-token
transfer learning, load t2t-vit pretrained model
Traceback (most recent call last):
File "transfer_learning.py", line 134, in
load_for_transfer_learning(net, args.transfer_model, use_ema=True, strict=False, num_classes=args.num_classes)
File "/home/Yanghm/T2T/utils.py", line 83, in load_for_transfer_learning
state_dict = load_state_dict(checkpoint_path, use_ema, num_classes)
File "/home/Yanghm/T2T/utils.py", line 70, in load_state_dict
old_posemb = state_dict['pos_embed']
KeyError: 'pos_embed'
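A quick way to see why 'pos_embed' is missing is to inspect the top-level keys of the checkpoint file directly; many checkpoints wrap the actual weights in an outer dict. This is a minimal diagnostic sketch, not code from the repo; the checkpoint path and the candidate nesting keys ('state_dict_ema', 'state_dict', 'model') are assumptions:

```python
import torch

# Hypothetical path: adjust to whichever checkpoint you downloaded.
checkpoint_path = "81.5_T2T_ViT_14.pth.tar"

# Load on CPU so no GPU is needed just to inspect the file.
ckpt = torch.load(checkpoint_path, map_location="cpu")

print(type(ckpt))
if isinstance(ckpt, dict):
    # If 'pos_embed' is not here, the weights are probably nested one level down.
    print("top-level keys:", list(ckpt.keys())[:20])
    for key in ("state_dict_ema", "state_dict", "model"):
        if key in ckpt and isinstance(ckpt[key], dict):
            print(f"keys under '{key}':", list(ckpt[key].keys())[:10])
```

If 'pos_embed' only shows up under one of the nested dicts, that would explain the KeyError above.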
==> Preparing data..
Files already downloaded and verified
Files already downloaded and verified
learning rate:0.05, weight decay: 0.0005
==> Building model..
adopt performer encoder for tokens-to-token
transfer learning, load t2t-vit pretrained model
Traceback (most recent call last):
File "transfer_learning.py", line 134, in
load_for_transfer_learning(net, args.transfer_model, use_ema=True, strict=False, num_classes=args.num_classes)
File "/home/Yanghm/T2T/utils.py", line 83, in load_for_transfer_learning
state_dict = load_state_dict(checkpoint_path, use_ema, num_classes)
File "/home/Yanghm/T2T/utils.py", line 71, in load_state_dict
if model.pos_embed.shape != old_posemb.shape: # need resize the position embedding by interpolate
AttributeError: 'bool' object has no attribute 'pos_embed'
When I use the pretrained model cirfar100_t2t-vit-14_88.4.pth or 81.5_T2T_ViT_14.pth.tar, I get a KeyError because state_dict['pos_embed'] is missing, as shown in the tracebacks above.
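If the checkpoint does turn out to be nested, one possible workaround is to unwrap it and strip any 'module.' prefix before loading, instead of relying on the loader finding 'pos_embed' at the top level. This is only a sketch under those assumptions; unwrap_state_dict is a hypothetical helper, not part of the repo's utils.py, and the nesting keys are guesses:

```python
import torch

def unwrap_state_dict(checkpoint_path, use_ema=True):
    """Best-effort unwrapping of a checkpoint into a flat state dict.

    Assumption (not confirmed from utils.py): the weights may be nested under
    'state_dict_ema', 'state_dict' or 'model', and parameter names may carry a
    'module.' prefix from DataParallel training.
    """
    ckpt = torch.load(checkpoint_path, map_location="cpu")
    state_dict = ckpt
    if isinstance(ckpt, dict) and "pos_embed" not in ckpt:
        candidates = (["state_dict_ema"] if use_ema else []) + ["state_dict", "model"]
        for key in candidates:
            if key in ckpt and isinstance(ckpt[key], dict):
                state_dict = ckpt[key]
                break
    # Strip a possible 'module.' prefix so the keys match the bare model.
    return {k[len("module."):] if k.startswith("module.") else k: v
            for k, v in state_dict.items()}

# Usage sketch:
# state_dict = unwrap_state_dict("81.5_T2T_ViT_14.pth.tar", use_ema=True)
# assert "pos_embed" in state_dict
# net.load_state_dict(state_dict, strict=False)
```

Loading with strict=False keeps the new classifier head (different num_classes) from blocking the rest of the weights; the position-embedding resize that utils.py attempts would still need the unwrapped dict to contain 'pos_embed'.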