mrqa/MRQA-Shared-Task-2019

How can I load a pretrained model?

dlwlgus53 opened this issue · 1 comment

After I downloaded the pretrained model, I found config.json, weights.th, and a vocabulary folder. However,

import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
device = torch.device(f'cuda:{args.gpu_number}' if torch.cuda.is_available() else 'cpu')
torch.cuda.set_device(device)  # change allocation of current GPU

print("\n\nuse trained model")
model.load_state_dict(torch.load('weights.th', map_location='cuda:0'))
print("no error!")

this doesn't work. How can I load the pre-trained model?
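A likely cause (an assumption, not confirmed by the repo here): the config.json + weights.th + vocabulary layout is AllenNLP's serialization format, so the parameter names in weights.th generally do not match a bare HuggingFace `BertModel`, and `load_state_dict` raises on the key mismatch. A minimal sketch of how to diagnose such a mismatch, using toy stand-in models rather than the actual MRQA checkpoint:

```python
import torch
import torch.nn as nn

# Stand-ins for the real models: 'saved' mimics a checkpoint whose
# parameter names carry a framework-specific prefix, 'target' is the
# model we try to load it into.
saved = nn.Sequential(nn.Linear(4, 2))   # keys like '0.weight', '0.bias'
target = nn.Linear(4, 2)                 # keys like 'weight', 'bias'

state = saved.state_dict()
print(sorted(state.keys()))                  # prefixed names
print(sorted(target.state_dict().keys()))    # bare names

# strict=False reports mismatched keys instead of raising,
# which is handy for diagnosis.
result = target.load_state_dict(state, strict=False)
print('missing:', result.missing_keys)
print('unexpected:', result.unexpected_keys)

# One common fix: strip the prefix so the names line up.
remapped = {k.split('.', 1)[1]: v for k, v in state.items()}
target.load_state_dict(remapped)  # now succeeds
```

For a real AllenNLP checkpoint the prefixes are longer (e.g. names nested under the model's submodules), so inspecting `torch.load('weights.th', map_location='cpu').keys()` is the first step before deciding how to remap them.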

Hi, I rewrote the training script for every MRQA dataset so that it is compatible with any type of transformer model on any hardware: https://github.com/lucadiliello/mrqa-lightning