chenchao15/2D_projection_matching

memory usage

Closed this issue · 3 comments

Hi, thank you for your nice work!
I want to check with you about the program's memory usage. When I run on my 11 GB GPU, I can only set the output point cloud to 2000 points with batch size 1; anything larger raises an OOM error.
The CD loss function takes a lot of memory: even with batch size 1 the intermediate tensor is 4x2000x5000x2, so I cannot train a model to generate 8000 points, let alone 16000.
How do you deal with this memory issue?
Thank you very much.
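The OOM described above comes from materializing the full (N, M, 2) difference tensor inside the Chamfer distance. A common workaround is to compute nearest-neighbour distances over chunks of the predicted points so that only a (chunk, M) slice exists at any time. A minimal NumPy sketch of this idea (function name and chunk size are my own, not from the repository; the same pattern applies in PyTorch with chunked broadcasting or `torch.cdist`):

```python
import numpy as np

def chamfer_2d_chunked(pred, target, chunk=1024):
    """One-directional 2D Chamfer term computed in chunks of `pred`,
    so the full (N, M, 2) difference tensor is never materialized.

    pred:   (N, 2) projected/predicted 2D points
    target: (M, 2) reference 2D points
    """
    mins = []
    for start in range(0, pred.shape[0], chunk):
        block = pred[start:start + chunk]                     # (c, 2)
        # (c, M) squared distances for this chunk only
        d2 = ((block[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        mins.append(d2.min(axis=1))                           # (c,)
    return np.concatenate(mins).mean()
```

Peak memory drops from O(N * M) to O(chunk * M) at the cost of a Python-level loop; in practice the loop overhead is negligible compared to the distance computation.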

I'm running into the same problem.

You can try reducing the 'sample_scale' value to 6000 or less in default_config.yaml.
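For reference, the relevant entry in default_config.yaml would look something like this (the key name comes from this thread; the surrounding file structure is assumed):

```yaml
# Number of 2D points sampled per projection. Lower values shrink the
# N x M distance tensor built by the CD loss and reduce peak GPU memory.
sample_scale: 6000
```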

> You can try to reduce the 'sample_scale' value to 6000 or less in the default_config.yaml

Thanks. It's OK now.