declare-lab/flan-alpaca

unable to use new flan-alpaca-gpt4-xl in pipeline

andreynikk opened this issue · 2 comments

Hi,
I've tried to use the new model, but get the following error:
ValueError: Could not load model declare-lab/flan-alpaca-gpt4-xl with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForSeq2SeqLM'>, <class 'transformers.models.t5.modeling_t5.T5ForConditionalGeneration'>).

code to reproduce:
```python
from transformers import pipeline

model = pipeline(model="declare-lab/flan-alpaca-gpt4-xl")
```

thank you!

Thanks for raising this issue. The error may be due to the safetensors checkpoint, which requires `pip install safetensors`.
To fix this for most users, I have also uploaded the conventional pytorch_model.bin files; could you please try again?
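As a quick sanity check before loading the model, you can confirm the safetensors package is actually importable in your environment. This is a minimal sketch; `has_safetensors` is a hypothetical helper, not part of transformers:

```python
from importlib.util import find_spec

def has_safetensors() -> bool:
    """Return True if the safetensors package is importable in this environment."""
    return find_spec("safetensors") is not None

# If this prints False, run: pip install safetensors
# Then retry the original pipeline call:
#
#     from transformers import pipeline
#     model = pipeline(model="declare-lab/flan-alpaca-gpt4-xl")
#
print("safetensors installed:", has_safetensors())
```

If the package is present and the error persists, pulling the newly uploaded pytorch_model.bin checkpoint (e.g. by clearing the local Hugging Face cache for this model) should also resolve it.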

It works now, thanks!
I did install safetensors, so not sure what fixed it.