OpenMOSS/CoLLiE

Can tensor parallelism / pipeline parallelism be used together with LoRA? Error: ValueError: Target module ColumnParallelLinearWithoutBias() is not supported. Currently, only `torch.nn.Linear` and `Conv1D` are supported.

Closed this issue · 3 comments


The main code is as follows:

    # 1. Set paths
    # 1.1 Pretrained model path
    pretrained_model = './Llama-2-70b-chat-hf'
    # 1.2 Save path for the eval decode results
    save_path = './result'

    # 2. Set up the configuration
    # 2.1 Load the config
    config = CollieConfig.from_pretrained(pretrained_model)
    # 2.2 Add settings
    config.tp_size = 4
    config.dp_size = 1
    config.pp_size = 1
    config.peft_config = LoraConfig(
        task_type="CAUSAL_LM",
        r=8,
        lora_alpha=32,
        target_modules=["q_proj", "v_proj"],
        lora_dropout=0.05,
    )
    model = get_peft_model(model, config.peft_config)

The error is as follows:
Traceback (most recent call last):
  File "finetune_llama_lora.py", line 138, in <module>
    model = get_peft_model(model, config.peft_config)
  File "/usr/local/lib/python3.8/dist-packages/peft/mapping.py", line 106, in get_peft_model
    return MODEL_TYPE_TO_PEFT_MODEL_MAPPING[peft_config.task_type](model, peft_config, adapter_name=adapter_name)
  File "/usr/local/lib/python3.8/dist-packages/peft/peft_model.py", line 889, in __init__
    super().__init__(model, peft_config, adapter_name)
  File "/usr/local/lib/python3.8/dist-packages/peft/peft_model.py", line 111, in __init__
    self.base_model = PEFT_TYPE_TO_MODEL_MAPPING[peft_config.peft_type](
  File "/usr/local/lib/python3.8/dist-packages/peft/tuners/lora.py", line 274, in __init__
    super().__init__(model, config, adapter_name)
  File "/usr/local/lib/python3.8/dist-packages/peft/tuners/tuners_utils.py", line 88, in __init__
    self.inject_adapter(self.model, adapter_name)
  File "/usr/local/lib/python3.8/dist-packages/peft/tuners/tuners_utils.py", line 219, in inject_adapter
    self._create_and_replace(peft_config, adapter_name, target, target_name, parent, **optionnal_kwargs)
  File "/usr/local/lib/python3.8/dist-packages/peft/tuners/lora.py", line 372, in _create_and_replace
    new_module = self._create_new_module(lora_config, adapter_name, target, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/peft/tuners/lora.py", line 481, in _create_new_module
    raise ValueError(
ValueError: Target module ColumnParallelLinearWithoutBias() is not supported. Currently, only `torch.nn.Linear` and `Conv1D` are supported.
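For context, the error comes from PEFT's module-type dispatch: when it replaces a target module with a LoRA layer, it only recognizes `torch.nn.Linear` (and transformers' `Conv1D`), and CoLLiE's tensor-parallel `ColumnParallelLinearWithoutBias` is neither. A minimal, dependency-free sketch of that check (the class and function bodies here are simplified stand-ins, not the real PEFT or CoLLiE code):

```python
# Simplified stand-in for the isinstance dispatch inside
# peft.tuners.lora._create_new_module. The real code checks
# torch.nn.Linear and transformers' Conv1D; these dummy classes
# only illustrate why the type check fails.

class Linear:                           # stand-in for torch.nn.Linear
    pass

class Conv1D:                           # stand-in for transformers' Conv1D
    pass

class ColumnParallelLinearWithoutBias:  # CoLLiE's TP layer: not a Linear subclass
    pass

def create_new_module(target):
    if isinstance(target, Linear):
        return "lora.Linear"            # PEFT wraps it with a LoRA linear
    if isinstance(target, Conv1D):
        return "lora.Conv1D"
    raise ValueError(
        f"Target module {type(target).__name__}() is not supported. "
        "Currently, only `torch.nn.Linear` and `Conv1D` are supported."
    )

print(create_new_module(Linear()))      # lora.Linear
try:
    create_new_module(ColumnParallelLinearWithoutBias())
except ValueError as e:
    print(type(e).__name__)             # ValueError
```

Since `q_proj` and `v_proj` are replaced by column-parallel layers whenever `tp_size > 1`, PEFT's check fails before any LoRA weights are injected.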

LoRA is not yet supported together with tensor parallelism. Same issue: #116
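Until that lands, one possible fallback (an assumption based on this thread, not an official CoLLiE recipe) is to avoid the tensor-parallel layers entirely so the `q_proj`/`v_proj` targets remain plain `torch.nn.Linear`, and recover multi-GPU throughput from data parallelism instead:

```python
# Workaround sketch (assumption, not an official recipe): disable
# tensor/pipeline parallelism so LoRA's target modules stay plain
# torch.nn.Linear, then scale out with data parallelism. Worth
# verifying that CoLLiE builds ordinary Linear layers at tp_size=1,
# and that a 70B model still fits in memory under this layout.
config.tp_size = 1   # was 4; ColumnParallelLinearWithoutBias appears when tp_size > 1
config.pp_size = 1
config.dp_size = 4   # e.g. spread 4 GPUs as pure data parallel instead
```

If memory becomes the bottleneck without tensor parallelism, ZeRO-style sharding via the DeepSpeed config is the usual complement to data parallelism.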


Thank you!