Upgrade vLLM dependency to support transformers 4.54.0+ (AIMv2 registration conflict)
Closed this issue · 3 comments
ExpressGradient commented
Problem
ART currently pins vLLM 0.9.1, which conflicts with transformers 4.54.0+ because both try to register a model config under the name "aimv2". This prevents using ART with the latest transformers releases.
Error
Full traceback:
INFO 07-27 06:35:51 [__init__.py:244] Automatically detected platform cuda.
Traceback (most recent call last):
File "/home/ubuntu/atreya/main.py", line 37, in <module>
asyncio.run(train())
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/nest_asyncio.py", line 30, in run
return loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/nest_asyncio.py", line 98, in run_until_complete
return f.result()
^^^^^^^^^^
File "/home/ubuntu/.local/share/uv/python/cpython-3.12.11-linux-x86_64-gnu/lib/python3.12/asyncio/futures.py", line 202, in result
raise self._exception.with_traceback(self._exception_tb)
File "/home/ubuntu/.local/share/uv/python/cpython-3.12.11-linux-x86_64-gnu/lib/python3.12/asyncio/tasks.py", line 314, in step_run_and_handle_result
result = coro.send(None)
^^^^^^^^^^^^^^^
File "/home/ubuntu/atreya/main.py", line 14, in train
await model.register(backend)
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/art/model.py", line 307, in register
base_url, api_key = await backend._prepare_backend_for_training(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/art/local/backend.py", line 250, in preparebackend_for_training
service = await self._get_service(model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/art/local/backend.py", line 113, in getservice
from ..torchtune.service import TorchtuneService
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/art/torchtune/service.py", line 14, in <module>
from vllm import AsyncEngineArgs
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/vllm/init.py", line 13, in <module>
from vllm.engine.arg_utils import AsyncEngineArgs, EngineArgs
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/vllm/engine/arg_utils.py", line 22, in <module>
from vllm.config import (BlockSize, CacheConfig, CacheDType, CompilationConfig,
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/vllm/config.py", line 43, in <module>
from vllm.transformers_utils.config import (
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/vllm/transformers_utils/config.py", line 33, in <module>
from vllm.transformers_utils.configs import (ChatGLMConfig, Cohere2Config,
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/vllm/transformers_utils/configs/init__.py", line 28, in <module>
from vllm.transformers_utils.configs.ovis import OvisConfig
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/vllm/transformers_utils/configs/ovis.py", line 76, in <module>
AutoConfig.register("aimv2", AIMv2Config)
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 1306, in register
CONFIG_MAPPING.register(model_type, config, exist_ok=exist_ok)
File "/home/ubuntu/atreya/.venv/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 993, in register
raise ValueError(f"'{key}' is already used by a Transformers config, pick another name.")
ValueError: 'aimv2' is already used by a Transformers config, pick another name.
Root Cause
- Transformers 4.54.0 officially added AIMv2 model support, so "aimv2" is now a built-in config name
- vLLM 0.9.1 still registers "aimv2" itself at import time (in vllm/transformers_utils/configs/ovis.py), which transformers rejects as a duplicate; see the sketch below
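A minimal standalone reproduction of the clash, using a stand-in config class rather than vLLM's actual AIMv2Config:

```python
# Minimal repro sketch (stand-in class, not vLLM's AIMv2Config).
# On transformers >= 4.54.0, "aimv2" already exists in the AutoConfig mapping,
# so registering it again without exist_ok=True raises the ValueError above.
from transformers import AutoConfig, PretrainedConfig

class StandInAIMv2Config(PretrainedConfig):
    model_type = "aimv2"  # the name transformers 4.54.0+ already owns

# Succeeds on transformers < 4.54.0; on 4.54.0+ raises:
# ValueError: 'aimv2' is already used by a Transformers config, pick another name.
AutoConfig.register("aimv2", StandInAIMv2Config)
```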
Current Workaround
Pinning transformers below 4.54.0 (e.g. `pip install "transformers<4.54.0"`) avoids the conflict, at the cost of losing the newer model support; see the version check sketched below.
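For projects that prefer to fail fast rather than hit the import-time traceback, a sketch of a runtime guard (assumes the 4.54.0 cutoff described above; `packaging` is a dependency of transformers, so it should already be installed):

```python
# Sketch: refuse to start if transformers >= 4.54.0 sits next to vLLM 0.9.1.
import importlib.metadata as md
from packaging.version import Version

if Version(md.version("transformers")) >= Version("4.54.0"):
    raise RuntimeError(
        "transformers >= 4.54.0 duplicates vLLM 0.9.1's 'aimv2' registration; "
        "pin transformers<4.54.0 until the vLLM dependency is upgraded."
    )
```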
Proposed Solution
Upgrade the vLLM dependency to a release that is compatible with transformers 4.54.0+, i.e. one that no longer re-registers config names transformers now ships.
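Conceptually, the fix on vLLM's side is to register only names transformers does not already own. A hedged sketch of such a guard (not vLLM's actual patch; the CONFIG_MAPPING import path is taken from the traceback above):

```python
# Defensive registration sketch; illustrative only, not vLLM's real fix.
from transformers import AutoConfig
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

def register_if_missing(model_type: str, config_cls) -> None:
    """Register config_cls unless transformers already ships that model_type."""
    if model_type not in CONFIG_MAPPING:
        AutoConfig.register(model_type, config_cls)
    # Otherwise reuse the built-in transformers config for this model_type.
```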
Environment
- OS: Ubuntu
- Python: 3.12.11
- ART version: 0.4.4
- Current vLLM: 0.9.1
- Transformers: 4.53.0 (after applying the downgrade workaround; the error appears with 4.54.0+)
Additional Context
This affects anyone trying to use ART with the latest transformers features and new model architectures introduced in recent releases.
zieen commented
Any progress?
corbt commented
In the latest version (0.4.5) of this library we're now up to vLLM 0.10.0. Does that solve your problem?
ExpressGradient commented
Closing the issue, as it's solved by upgrading ART to the latest version.