baaivision/EVA

CUDA out of memory during inference

Opened this issue · 0 comments

I am trying to evaluate the EVA model https://huggingface.co/BAAI/EVA/blob/main/eva_coco_seg.pth on COCO instance segmentation. I only have a GTX 2080, and it always runs out of memory, even after setting image_size to 128 in /path/to/EVA/det/projects/ViTDet/configs/common/coco_loader_lsj_1536.py. Unless I have misread the config, there is nothing more I can do.
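As a general memory-saving step (not specific to EVA's config, whose exact layout I may be misreading), make sure evaluation runs under `torch.no_grad()` so activations are not kept for backpropagation; below is a minimal sketch with a hypothetical stand-in module in place of the actual detector:

```python
import torch

# Hypothetical stand-in for the EVA detector; any nn.Module behaves the same way here.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU()).eval()

x = torch.randn(1, 3, 128, 128)  # a downsized input, as in the config change above

# no_grad() skips autograd bookkeeping, so intermediate activations are freed
# immediately instead of being retained for a backward pass, cutting peak memory.
with torch.no_grad():
    out = model(x)

print(tuple(out.shape))
```

If the OOM persists even at small input sizes, the model weights themselves may simply exceed the 8 GB on that card, in which case no dataloader setting will help.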