eric-yyjau/pytorch-superpoint

CUDA out of memory on 3090

FeiXie8 opened this issue · 4 comments

When I export labels on COCO 2014 with the resolution set to 480x640, I get CUDA out of memory.
My GPU is an RTX 3090 with 24 GB of memory.

Hi @FeiXie8,
Maybe the batch size is too large when you train the model. A batch size of 8 would probably work better.
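The advice above can be sketched as follows; this is a generic illustration, not the repo's actual data pipeline (the dummy tensor dataset and shapes are stand-ins for COCO 2014 grayscale images at 480x640):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy stand-in for COCO 2014 images at 480x640 (1-channel grayscale).
# Names and shapes here are illustrative, not from the repo.
images = torch.zeros(32, 1, 480, 640)
dataset = TensorDataset(images)

# Lowering batch_size (e.g. to 8) is the usual first fix for CUDA OOM:
# peak activation memory scales roughly linearly with the batch size.
loader = DataLoader(dataset, batch_size=8, shuffle=True)

batch, = next(iter(loader))
print(batch.shape)  # torch.Size([8, 1, 480, 640])
```

In this repo the batch size for training is normally set in the yaml config, not hard-coded in Python.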


Hello, did you solve this problem? I'm running into the same issue.


Where can I change the batch size?

The problem in export.py is not the batch size (it already uses batch_size=1). You may need to set num_workers = 0 and pin_memory = False:

workers_test = training_params.get('workers_test', 0)  # was 16
test_loader = torch.utils.data.DataLoader(
    test_set, batch_size=1, shuffle=False,
    pin_memory=False,  # don't pin host memory
    num_workers=workers_test,  # 0 disables worker processes
    # worker_init_fn=worker_init_fn,
)

This is in loader.py, line 138.