Tiiiger/bert_score

Error while running tune_layers.py

nandinib1999 opened this issue · 4 comments

Hi, I was trying to run tune_layers.py, but I am getting the error below from the package.

0% 0/6 [00:00<?, ?it/s]Traceback (most recent call last):
  File "/content/bert_score/tune_layers/tune_layers.py", line 99, in <module>
    main()
  File "/content/bert_score/tune_layers/tune_layers.py", line 74, in main
    scores, gold_scores = get_wmt16_seg_to_bert_score(lang_pair, network, 100, idf=args.idf, cache=False)
  File "/content/bert_score/tune_layers/tune_layers.py", line 49, in get_wmt16_seg_to_bert_score
    cands, refs, model_type=model_type, num_layers=num_layers, verbose=False, idf=idf, all_layers=True
  File "/usr/local/lib/python3.6/dist-packages/bert_score/score.py", line 138, in score
    all_layers=all_layers,
  File "/usr/local/lib/python3.6/dist-packages/bert_score/utils.py", line 416, in bert_cos_score_idf
    sen_batch, model, tokenizer, idf_dict, device=device, all_layers=all_layers
  File "/usr/local/lib/python3.6/dist-packages/bert_score/utils.py", line 299, in get_bert_embedding
    model, padded_sens[i : i + batch_size], attention_mask=mask[i : i + batch_size], all_layers=all_layers
  File "/usr/local/lib/python3.6/dist-packages/bert_score/utils.py", line 211, in bert_encode
    emb = torch.stack(out[-1], dim=2)
TypeError: stack(): argument 'tensors' (position 1) must be tuple of Tensors, not Tensor
  0% 0/6 [00:26<?, ?it/s]
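For context, the failing call is torch.stack(out[-1], dim=2) in bert_encode. torch.stack expects a tuple or list of tensors, so out[-1] appears to be a single tensor here rather than a tuple of per-layer states. A minimal, standalone snippet (not the actual model code) that reproduces the same TypeError:

import torch

hidden = torch.randn(2, 5, 768)   # a single tensor instead of a tuple of tensors
emb = torch.stack(hidden, dim=2)  # raises: stack(): argument 'tensors' (position 1)
                                  #         must be tuple of Tensors, not Tensor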

Can you tell me how I can solve this issue?
I am trying to run this script with a custom model I trained. The command I am using to run it is:
!python "/content/bert_score/tune_layers/tune_layers.py" -m "path_to_model_weights"

hi @nandinib1999, I think this code might be out of date and not compatible with the newest huggingface API.

Looking at the error, it seems that the output type returned by the hf transformer has changed.
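For reference, here is a rough sketch of the kind of change that is probably needed; the model name, shapes, and call below are only illustrative, not the exact bert_score internals. In recent transformers versions, the per-layer states have to be requested with output_hidden_states=True and, with return_dict=True, are exposed as out.hidden_states, which is a tuple that torch.stack accepts.

import torch
from transformers import AutoModel, AutoTokenizer

# "bert-base-uncased" is just a placeholder model for illustration
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

batch = tokenizer(["a toy sentence"], return_tensors="pt")
with torch.no_grad():
    out = model(**batch, output_hidden_states=True, return_dict=True)

# out.hidden_states is a tuple of (num_layers + 1) tensors,
# each of shape (batch, seq_len, hidden_size), so torch.stack works on it.
emb = torch.stack(out.hidden_states, dim=2)  # (batch, seq_len, num_layers + 1, hidden_size)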

I will try to find time to fix this today.

Hi @Tiiiger, any updates on this?

hi @nandinib1999, should be fixed by a5d8e9d

Can you check?

Yes, it is working now. Thanks a lot!