lose4578/SAM-DiffSR

How much CUDA memory for inference?

Closed this issue · 2 comments

Dear developer,

Thanks for your nice project.

I use a 3090 (24GB) for inference and get the error: 'CUDA out of memory'.

How much CUDA memory does this project require for inference?

When you run inference on the DIV2K dataset (510x340) with x4 super-resolution, you need about 12GB of GPU memory. Inference on higher-resolution images requires more GPU memory.

I use 1024x1024 inputs. 24GB of memory is not enough.
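A common workaround when a full-resolution forward pass does not fit in GPU memory is tiled inference: split the low-resolution input into patches, super-resolve each patch independently, and stitch the x4 outputs back together, so peak memory is bounded by the tile size rather than the image size. The sketch below is a hypothetical illustration, not part of SAM-DiffSR; `upscale_x4` is a stand-in for the model's forward pass (here just a nearest-neighbor upsample in numpy).

```python
# Hypothetical tiled-inference sketch: bound peak memory by processing
# fixed-size tiles instead of the full 1024x1024 image at once.
# upscale_x4 is a placeholder for the actual SR model call.
import numpy as np

def upscale_x4(tile: np.ndarray) -> np.ndarray:
    # Stand-in for the model: nearest-neighbor x4 upsampling.
    return tile.repeat(4, axis=0).repeat(4, axis=1)

def tiled_sr(img: np.ndarray, sr_fn, tile: int = 256, scale: int = 4) -> np.ndarray:
    h, w = img.shape[:2]
    out = np.zeros((h * scale, w * scale) + img.shape[2:], dtype=img.dtype)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            patch = img[y:y + tile, x:x + tile]
            # Place the upscaled patch at the scaled coordinates.
            out[y * scale:(y + patch.shape[0]) * scale,
                x * scale:(x + patch.shape[1]) * scale] = sr_fn(patch)
    return out

lr = np.zeros((1024, 1024, 3), dtype=np.uint8)
sr = tiled_sr(lr, upscale_x4)
print(sr.shape)  # (4096, 4096, 3)
```

Note that with a real SR model, naive tiling can leave visible seams at tile borders; overlapping the tiles and blending the overlap region is the usual remedy. Running under `torch.inference_mode()` (or `torch.no_grad()`) and in fp16 also reduces memory substantially.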