GPU resources required for inference
Opened this issue · 1 comments
jmwang0117 commented
Hello, thank you for your excellent work.
Does text-to-multi-view inference (demo.py) require 4x A6000 GPUs to complete?
I'm using a single 3090 GPU, and inference fails with a CUDA out-of-memory error.
Tangshitao commented
Can you try fp16? You can set fp16 here
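For context on why fp16 helps: casting model weights from fp32 to fp16 halves the memory each parameter occupies, which is often enough to fit inference on a 24 GB card like the 3090. The sketch below demonstrates the effect on a stand-in PyTorch model; the actual model and config option used by demo.py are not shown in this thread, so applying `.half()` directly is only an illustration of the idea, not the repo's exact mechanism.

```python
import torch

# Stand-in model; in practice the same idea applies to the
# diffusion model that demo.py loads (hypothetical usage).
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 1024),
)

def param_bytes(m: torch.nn.Module) -> int:
    # Total bytes occupied by the model's parameters.
    return sum(p.numel() * p.element_size() for p in m.parameters())

fp32_bytes = param_bytes(model)
model = model.half()  # cast all weights to fp16
fp16_bytes = param_bytes(model)

print(fp32_bytes // fp16_bytes)  # → 2 (fp16 halves per-parameter memory)
```

Note that fp16 can occasionally cause numerical overflow in some layers; if outputs degrade, mixed precision via `torch.autocast` is a common middle ground.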