ewrfcas/MVSFormer

About GPU memory usage

xingchen2022 opened this issue · 1 comment

Thanks for your excellent work!
I tried to retrain your model on two RTX 8000 GPUs without changing any hyper-parameters (batch size = 8), and found that it used about 80 GB of GPU memory. Since you trained on two V100 GPUs with 64 GB of memory, I wonder how much memory the training costs on your devices and why there is a difference of more than 10 GB.

It would be appreciated if you could give me some explanation. Thank you so much!

We have tested this code on 48 GB GPUs (A6000). For 32 GB GPUs, the sub-batch size has to be reduced slightly.
Besides, the displayed memory cost can differ across GPU types (e.g., RTX 8000 vs. V100).
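For reference, the number reported by nvidia-smi includes the CUDA context plus everything held by PyTorch's caching allocator, so it usually sits noticeably above what the model's tensors actually use and can vary between GPU types and driver versions. Below is a minimal sketch of the general sub-batch idea and of how to log allocated vs. reserved memory; the function and parameter names are illustrative, not the repository's actual config keys or code:

```python
import torch

def train_step(model, loss_fn, optimizer, images, targets, sub_batch_size=2):
    """Accumulate gradients over sub-batches so the effective batch size is
    unchanged while the peak memory of each forward/backward pass drops.
    `sub_batch_size` is an illustrative argument, not the repo's config key."""
    optimizer.zero_grad()
    n = images.size(0)
    for start in range(0, n, sub_batch_size):
        end = min(start + sub_batch_size, n)
        loss = loss_fn(model(images[start:end]), targets[start:end])
        # Scale so the accumulated gradient matches a single full-batch step.
        (loss * (end - start) / n).backward()
    optimizer.step()

    # Allocated = memory actually used by tensors; reserved = memory held by
    # the caching allocator, which is closer to what nvidia-smi displays.
    print(f"peak allocated: {torch.cuda.max_memory_allocated() / 1024**3:.1f} GiB, "
          f"peak reserved: {torch.cuda.max_memory_reserved() / 1024**3:.1f} GiB")
```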