Failed to evaluate codet5p-2b on multiple GPU cards
Cxm211 opened this issue · 1 comment
Cxm211 commented
Hi, I tried to evaluate codet5p-2b. I loaded the model from Hugging Face and got a CUDA out of memory error, so I tried to shard the model across multiple GPU cards by passing `device_map='auto'` when loading it. But that raised a different error: `CodeT5pEncoderDecoderModel does not support device_map='auto' yet`.
The same error occurs when I load my own fine-tuned codet5p-2b models.
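A minimal reproduction of the failing call (assuming the standard loading pattern from the model card, via `AutoModelForSeq2SeqLM` with `trust_remote_code=True`; my actual evaluation script only adds a harness around this):

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "Salesforce/codet5p-2b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)

# device_map='auto' asks accelerate to shard the weights across all
# visible GPUs instead of placing everything on a single card.
model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.float16,
    trust_remote_code=True,
    device_map="auto",  # raises: CodeT5pEncoderDecoderModel does not support device_map='auto' yet
)
```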
yuewang-cuhk commented
Hi there, we've updated the model class and this issue should be fixed now.
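After re-pulling the updated remote code, multi-GPU loading and generation should work along these lines (a sketch following the generation example in the codet5p-2b model card; adapt the prompt and decoding settings to your evaluation setup):

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "Salesforce/codet5p-2b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.float16,
    trust_remote_code=True,
    device_map="auto",  # now shards the checkpoint across available GPUs
)

# With device_map='auto', model.device points at the first shard;
# accelerate's hooks move activations between GPUs during the forward pass.
inputs = tokenizer("def print_hello_world():", return_tensors="pt").to(model.device)

# The 2B checkpoint is an encoder-decoder model; the model card primes the
# decoder with the encoder input ids before generating.
inputs["decoder_input_ids"] = inputs["input_ids"].clone()
outputs = model.generate(**inputs, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```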