universal-ie/UIE

transformers raises an error

JieShenAI opened this issue · 1 comment


```
/usr/local/lib/python3.10/dist-packages/transformers/optimization.py:306: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning
  warnings.warn(
***** Running training *****
  Num examples = 1350
  Num Epochs = 10
  Instantaneous batch size per device = 16
  Total train batch size (w. parallel, distributed & accumulation) = 16
  Gradient Accumulation steps = 1
  Total optimization steps = 850
  0% 0/850 [00:00<?, ?it/s]/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils_base.py:2318: UserWarning: `max_length` is ignored when `padding`=`True` and there is no truncation strategy. To pad to max length, use `padding='max_length'`.
  warnings.warn(
  0% 1/850 [00:13<3:12:40, 13.62s/it]Traceback (most recent call last):
  File "/content/UIE/run_uie_finetune.py", line 520, in <module>
    main()
  File "/content/UIE/run_uie_finetune.py", line 436, in main
    train_result = trainer.train(resume_from_checkpoint=checkpoint)
  File "/content/UIE/uie/seq2seq/constrained_seq2seq.py", line 102, in train
    return super().train(
  File "/usr/local/lib/python3.10/dist-packages/transformers/trainer.py", line 1500, in train
    return inner_training_loop(
  File "/usr/local/lib/python3.10/dist-packages/transformers/trainer.py", line 1819, in _inner_training_loop
    self._maybe_log_save_evaluate(tr_loss, model, trial, epoch, ignore_keys_for_eval)
TypeError: ConstraintSeq2SeqTrainer._maybe_log_save_evaluate() takes 5 positional arguments but 6 were given
```
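The `TypeError` is a version mismatch, not a bug in the training data: in the installed transformers version, `Trainer._inner_training_loop` passes an extra `ignore_keys_for_eval` argument to `_maybe_log_save_evaluate`, while the repo's override was written against the older, shorter signature. A minimal self-contained reproduction with toy stand-in classes (not the real transformers code) looks like this:

```python
# Toy reproduction (NOT the real transformers code): an override written
# against an older base-class signature breaks once the base class starts
# passing an extra argument.

class Trainer:
    """Stands in for the installed transformers.Trainer."""

    def _inner_training_loop(self):
        # Newer transformers versions pass ignore_keys_for_eval as a
        # fifth argument to the hook:
        self._maybe_log_save_evaluate(0.0, None, None, 0, None)


class ConstraintSeq2SeqTrainer(Trainer):
    """Stands in for UIE's trainer, written against transformers 4.6.0."""

    def _maybe_log_save_evaluate(self, tr_loss, model, trial, epoch):
        pass  # old 4-argument signature


try:
    ConstraintSeq2SeqTrainer()._inner_training_loop()
except TypeError as err:
    print(err)  # ... takes 5 positional arguments but 6 were given
```

Counting `self`, the override accepts 5 positional arguments but the newer base class supplies 6, which is exactly the message in the traceback above.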

You have to use Python 3.8; transformers==4.6.0 cannot be installed on Python 3.10.
If you insist on using Python 3.10, you can refer to my workaround:
https://blog.csdn.net/sjxgghg/article/details/131565853

I used transformers >= 4.24.0 and deleted the `predict_step` and `_maybe_log_save_evaluate` methods from `ConstraintSeq2SeqTrainer`, and found this had no noticeable effect. These methods are overrides of the original transformers implementations, so as long as removing them does not significantly change behavior, they can safely be deleted.
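Before deleting such an override, one way to confirm it is stale is to compare its signature against the current base-class method with `inspect.signature`. A sketch with the same toy stand-in classes (illustrative only, not the real trainer):

```python
import inspect


class Trainer:
    # Stands in for the hook in the installed transformers version.
    def _maybe_log_save_evaluate(self, tr_loss, model, trial, epoch,
                                 ignore_keys_for_eval):
        pass


class ConstraintSeq2SeqTrainer(Trainer):
    # Stale override, written against the old 4-argument signature.
    def _maybe_log_save_evaluate(self, tr_loss, model, trial, epoch):
        pass


base_sig = inspect.signature(Trainer._maybe_log_save_evaluate)
override_sig = inspect.signature(
    ConstraintSeq2SeqTrainer._maybe_log_save_evaluate)

if base_sig != override_sig:
    print(f"stale override: {override_sig} vs base {base_sig}")
```

If the signatures differ, the override predates the installed transformers version; either update it to match the new signature or delete it so the subclass falls back to the parent's up-to-date implementation.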