pBFSLab/FastCSR

About hardware and memory consumption

Closed this issue · 1 comments

Hello, first of all, thank you for this excellent work. I am currently trying to reproduce it, but my GPU has only 4GB of memory, and I'm not sure whether that meets the requirements. Could you tell me how much GPU memory the model consumed in your experiments?

Inference with 4GB of GPU memory may be possible, but it is definitely not enough for training. In our experiments, training was done on an RTX 3090, which has 24GB of GPU memory, and testing was done on a Tesla P4, which has 8GB of GPU memory.
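For anyone sizing hardware from this answer, a rough back-of-the-envelope check can hint at whether a model might fit in a given GPU. This is a generic sketch, not a FastCSR measurement: the parameter count and the activation/workspace overhead factor below are illustrative assumptions.

```python
def fits_in_gpu(num_params: int, gpu_mem_gb: float,
                bytes_per_param: int = 4, overhead: float = 2.0) -> bool:
    """Rough feasibility check for inference.

    Assumes float32 weights (4 bytes/param) plus a guessed multiplicative
    overhead for activations and framework workspace. The overhead factor
    is an assumption, not measured on FastCSR.
    """
    needed_gb = num_params * bytes_per_param * overhead / 1024**3
    return needed_gb <= gpu_mem_gb

# Hypothetical 100M-parameter model in float32 with 2x overhead:
# ~0.75 GB, so it would fit in a 4 GB GPU under these assumptions.
print(fits_in_gpu(100_000_000, 4.0))
```

Actual usage depends heavily on input volume size, batch size, and mixed precision, so treating this as a lower bound and verifying with a test run is safer.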