VinAIResearch/PhoGPT

(Google Colab) ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom'

phatjkk opened this issue · 7 comments

Hi!
Thank you for your excellent work!

I got this error when trying to run the model on Google Colab, using the example code from the README file:

import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_path = "vinai/PhoGPT-7B5-Instruct"

config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
config.init_device = "cuda"
# config.attn_config['attn_impl'] = 'triton' # Enable if "triton" is installed!

model = AutoModelForCausalLM.from_pretrained(
    model_path, config=config, torch_dtype=torch.bfloat16, trust_remote_code=True
)
# If your GPU does not support bfloat16:
# model = AutoModelForCausalLM.from_pretrained(model_path, config=config, torch_dtype=torch.float16, trust_remote_code=True)
model.eval()

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# "### Câu hỏi:" = "### Question:", "### Trả lời:" = "### Answer:"
PROMPT = "### Câu hỏi:\n{instruction}\n\n### Trả lời:"

# Instruction: "How can I improve my time-management skills?"
input_prompt = PROMPT.format_map(
    {"instruction": "Làm thế nào để cải thiện kỹ năng quản lý thời gian?"}
)

input_ids = tokenizer(input_prompt, return_tensors="pt")

outputs = model.generate(
    inputs=input_ids["input_ids"].to("cuda"),
    attention_mask=input_ids["attention_mask"].to("cuda"),
    do_sample=True,
    temperature=1.0,
    top_k=50,
    top_p=0.9,
    max_new_tokens=1024,
    eos_token_id=tokenizer.eos_token_id,
    pad_token_id=tokenizer.pad_token_id,
)

response = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
response = response.split("### Trả lời:")[1]

The error:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-8-2cf92868f260> in <cell line: 13>()
     11 # )
     12 # If your GPU does not support bfloat16:
---> 13 model = AutoModelForCausalLM.from_pretrained(model_path, config=config, torch_dtype=torch.float16, trust_remote_code=True)
     14 model.eval()
     15 

11 frames
~/.cache/huggingface/modules/transformers_modules/vinai/PhoGPT-7B5-Instruct/8083375bebd52681090be6ebaf8bae7aee491f73/hf_prefixlm_converter.py in <module>
     13 import torch
     14 from transformers.models.bloom.modeling_bloom import BaseModelOutputWithPastAndCrossAttentions, BloomForCausalLM, BloomModel, CausalLMOutputWithCrossAttentions, CrossEntropyLoss
---> 15 from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
     16 from transformers.models.bloom.modeling_bloom import _make_causal_mask as _make_causal_mask_bloom
     17 from transformers.models.bloom.modeling_bloom import logging

ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/usr/local/lib/python3.10/dist-packages/transformers/models/bloom/modeling_bloom.py)

I don't know how to fix it, but I found the same issue at https://huggingface.co/mosaicml/mpt-7b/discussions/83.

Can you tell me how to run it?
Thank you!

It's a mismatch with the latest version of transformers. You might want to use an older version (e.g., 4.33).

I will make a fix to work with the latest version of transformers soon.
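For reference, a minimal Colab sketch of that workaround (assumption: any 4.33.x release still ships the private _expand_mask helper that the cached PhoGPT code imports):

!pip install -q transformers==4.33.0

# Restart the Colab runtime after downgrading, then confirm the version:
import transformers
print(transformers.__version__)  # expect 4.33.0

With the pinned version installed, the README snippet above should load without the ImportError.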

@phatjkk code updated, please help check whether it works fine on your side with the latest transformers.

check

Hi, I met this problem with 4.36.0/1/2. I updated my transformers to use Mixtral-8x7B.

How can I solve this issue? Thank you.

> @phatjkk code updated, please help check whether it works fine on your side with the latest transformers.

My older version was 4.31.0 and it had no problem.

I also have this issue with 4.36.2.

Please clear your cached copy of PhoGPT and re-download it.
https://huggingface.co/vinai/PhoGPT-7B5-Instruct/blob/main/hf_prefixlm_converter.py was updated and no longer imports from the BLOOM modeling code.
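In case it helps, here is a minimal sketch of clearing the cache on Colab (assumption: default Hugging Face cache locations; the module path is the one shown in the traceback above):

import os
import shutil

# Remove the cached remote-code module that still imports from modeling_bloom.
shutil.rmtree(
    os.path.expanduser("~/.cache/huggingface/modules/transformers_modules/vinai"),
    ignore_errors=True,
)
# Also drop the cached model snapshot so the updated hf_prefixlm_converter.py
# is fetched again on the next from_pretrained call (assumption: default
# hub cache layout of huggingface_hub).
shutil.rmtree(
    os.path.expanduser("~/.cache/huggingface/hub/models--vinai--PhoGPT-7B5-Instruct"),
    ignore_errors=True,
)

After that, re-running the README snippet re-downloads the updated files.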

> Please clear your cached copy of PhoGPT and re-download it. https://huggingface.co/vinai/PhoGPT-7B5-Instruct/blob/main/hf_prefixlm_converter.py was updated and no longer imports from the BLOOM modeling code.

Thank you, I will try this.