salesforce/CodeT5

Error when loading T5ForConditionalGeneration from HuggingFace Transformers


I encountered an ImportError when trying to load the pretrained Salesforce/codet5p-220m checkpoint with the HuggingFace Transformers library. Here's the exact error message I received:
ImportError: /opt/conda/lib/python3.8/site-packages/fused_layer_norm_cuda.cpython-38-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops19empty_memory_format4callEN3c108ArrayRefIlEENS2_8optionalINS2_10ScalarTypeEEENS5_INS2_6LayoutEEENS5_INS2_6DeviceEEENS5_IbEENS5_INS2_12MemoryFormatEEE
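
The failing module, fused_layer_norm_cuda, is not part of Transformers itself; it is the compiled CUDA extension installed by NVIDIA Apex, which Transformers picks up for fused layer norm when Apex is available. My working assumption (not confirmed anywhere in this issue) is that the Apex build no longer matches the installed PyTorch, in which case the same undefined-symbol failure should be reproducible by importing the extension on its own, independent of CodeT5:

import importlib

# Attempt to import Apex's compiled extension directly. If it was built against
# a different PyTorch ABI, this fails with the same "undefined symbol" message,
# which would point at the Apex installation rather than Transformers or CodeT5.
try:
    importlib.import_module("fused_layer_norm_cuda")
    print("fused_layer_norm_cuda imported cleanly")
except ImportError as exc:
    print(f"Apex extension failed to import: {exc}")
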
Reproducible Code:

from transformers import T5ForConditionalGeneration, AutoTokenizer

checkpoint = "Salesforce/codet5p-220m"
device = "cuda"

# Load the CodeT5+ 220M tokenizer and model, then move the model to the GPU.
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint).to(device)

Environment Details:

Python version: 3.8
Transformers version: 4.31.0
PyTorch version: 1.13.0+cu117
CUDA version: 11.7
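
For reference, the following small snippet (mine, not part of the original report) prints the same version information at runtime, which can help confirm an environment matches the one listed above:

import sys

import torch
import transformers

# Report the interpreter, Transformers, PyTorch, and CUDA versions in use.
print("Python:", sys.version.split()[0])
print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("CUDA (reported by PyTorch):", torch.version.cuda)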