Missing keys and unexpected keys
Closed this issue · 3 comments
Traceback (most recent call last):
File "run_classifier_TABSA.py", line 456, in <module>
main()
File "run_classifier_TABSA.py", line 346, in main
model.bert.load_state_dict(torch.load(args.init_checkpoint, map_location='cpu'))
File "/home/tmuser/anaconda2/envs/virtualEnvAbhra/lib/python3.6/site-packages/torch/nn/modules/module.py", line 777, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for BertModel:
Missing key(s) in state_dict: "embeddings.word_embeddings.weight", "embeddings.position_embeddings.weight", "embeddings.token_type_embeddings.weight", "embeddings.LayerNorm.gamma", "embeddings.LayerNorm.beta", "encoder.layer.0.attention.self.query.weight", "encoder.layer.0.attention.self.query.bias"....
Unexpected key(s) in state_dict: "bert.embeddings.word_embeddings.weight", "bert.embeddings.position_embeddings.weight", "bert.embeddings.token_type_embeddings.weight", "bert.embeddings.LayerNorm.weight", "bert.embeddings.LayerNorm.bias", "bert.encoder.layer.0.attention.self.query.weight", "bert.encoder.layer.0.attention.self.query.bias", "bert.encoder.layer.0.attention.self.key.weight", "bert.encoder.layer.0.attention.self.key.bias", "bert.encoder.layer.0.attention.self.value.weight", "bert.encoder.layer.0.attention.self.value.bias", "bert.encoder.layer.0.attention.output.dense.weight"....
What should I do to fix this?
Make sure that you have used the convert_tf_checkpoint_to_pytorch.py
script we provide to convert the model.
I did use the provided Python script to convert the model, but I am still getting this error.
Can you tell me which files I should use for --tf_checkpoint_path and --bert_config_file?
Use the command given in README.md. The file names are the same as those in the command; you just need to change the paths to yours.
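For readers who hit the same error even after conversion: the message shows that every key in the checkpoint carries a `bert.` prefix (and `LayerNorm.weight`/`LayerNorm.bias` names) while the bare `BertModel` expects unprefixed keys with `LayerNorm.gamma`/`LayerNorm.beta`. A minimal sketch of a load-time workaround, assuming those are the only mismatches (the helper name `adapt_state_dict` is illustrative, not part of this repository):

```python
from collections import OrderedDict

def adapt_state_dict(state_dict, prefix="bert."):
    """Rename checkpoint keys to match a bare BertModel:
    drop the leading 'bert.' prefix and map the LayerNorm
    parameter names (weight/bias -> gamma/beta)."""
    adapted = OrderedDict()
    for key, value in state_dict.items():
        if key.startswith(prefix):
            key = key[len(prefix):]
        key = key.replace("LayerNorm.weight", "LayerNorm.gamma")
        key = key.replace("LayerNorm.bias", "LayerNorm.beta")
        adapted[key] = value
    return adapted
```

It would then be called where the traceback points, e.g. `model.bert.load_state_dict(adapt_state_dict(torch.load(args.init_checkpoint, map_location='cpu')))`. This is a workaround only; re-running the provided conversion script, as suggested above, is the intended fix.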