GPU Memory Requirements
YBYBZhang opened this issue · 6 comments
YBYBZhang commented
Thanks for your excellent work! How much GPU memory is required for text-to-video generation during inference?
Yuanshi9815 commented
I have tried it on different GPUs:
- 24 GB A5000 ❌
- 48 GB A6000 ✅
The peak memory usage is approximately 25 GB.
I guess it could work well on a 24 GB GPU if you optimize the code.
YBYBZhang commented
Thanks for your detailed reply!
Yuanshi9815 commented
Update:
Moving the `cond_stage_model` to the CPU after `get_learned_conditioning()` makes it work well on a 24 GB card.
The memory usage during inference drops from 23 GB to 7 GB. 🤯
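A minimal sketch of the offloading trick described above. The `cond_stage_model` / `get_learned_conditioning` names follow the LatentDiffusion-style model in this repo; the `DummyLDM` class here is a hypothetical stand-in so the snippet is self-contained, and in real use you would pass the repo's actual model object:

```python
import torch


class _TextEncoder(torch.nn.Module):
    """Stand-in for the real text encoder (cond_stage_model)."""

    def __init__(self):
        super().__init__()
        self.proj = torch.nn.Linear(8, 8)

    def forward(self, n):
        return self.proj(torch.zeros(n, 8))


class DummyLDM:
    """Hypothetical minimal stand-in for the LatentDiffusion model."""

    def __init__(self):
        self.cond_stage_model = _TextEncoder()

    def get_learned_conditioning(self, prompts):
        return self.cond_stage_model(len(prompts))


def encode_then_offload(model, prompts, device="cpu"):
    # 1) Encode the prompts with the text encoder resident on `device`
    #    (would be "cuda" in real use).
    model.cond_stage_model.to(device)
    cond = model.get_learned_conditioning(prompts)
    # 2) Offload the text encoder to CPU so its weights no longer occupy
    #    VRAM during the memory-heavy diffusion sampling loop.
    model.cond_stage_model.to("cpu")
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
    return cond
```

The conditioning tensor stays on the sampling device; only the text encoder's weights are evicted, which is why peak memory drops so sharply.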
bruinxiong commented
@Yuanshi9815 Could you give more detailed instructions for reducing the GPU memory cost?
bruinxiong commented
@Yuanshi9815 It's done. It costs 13 GB of memory during inference at 384 resolution.
bruinxiong commented