QwenLM/Qwen-VL

[BUG] "Could not find a config file in xx" during LoRA training

BigworldNebula opened this issue · 2 comments

是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?

  • 我已经搜索过已有的issues和讨论 | I have searched the existing issues / discussions

该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?

  • 我已经搜索过FAQ | I have searched FAQ

当前行为 | Current Behavior

After running the LoRA script, the following warning appears in every training epoch:
site-packages/peft/utils/save_and_load.py:195: UserWarning: Could not find a config file in /home/xx/huggingface/Qwen-VL - will assume that the vocabulary was not modified.

期望行为 | Expected Behavior

No response

复现方法 | Steps To Reproduce

Inspecting the config files downloaded from HF, I found no config entries related to the vocabulary.

运行环境 | Environment

- OS: Ubuntu 18.04
- Python: 3.8
- Transformers: 4.32.0
- PyTorch: 1.13.1+cu117
- peft: 0.11.0
- CUDA 11.7

备注 | Anything else?

No response

I ran into this problem too. It occurs when using a model manually downloaded from ModelScope, but not when letting the code download the model itself.

Does this warning (`Could not find a config file in /home/xx/huggingface/Qwen-VL - will assume that the vocabulary was not modified.`) affect the results? After all, it is only a warning.
Also, is the vocabulary in fact not modified, as the warning assumes?
Thanks! @1180300419 @BigworldNebula