GPU memory for inference
lxc86739795 opened this issue · 1 comment
lxc86739795 commented
Hey guys,
I encountered an OOM error during inference on a V100 with 16 GB of memory. It seems the memory requirement is very large. Can you suggest any ways to reduce memory usage?
canqin001 commented
Thank you for your interest in our project. Setting --num_samples 1
should solve this problem. Inference needs about 9000 MB with a batch size of 1.
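For reference, a minimal sketch of what the invocation might look like; the script name is a placeholder, and only the `--num_samples` flag comes from the reply above.

```bash
# Hypothetical script name; only --num_samples 1 is the setting recommended above.
# With a batch size of 1, inference should fit in roughly 9000 MB of GPU memory.
python sample.py --num_samples 1
```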