Recommended GPUs
usuyama opened this issue · 2 comments
What are recommended/required GPUs for running the experiments?
Especially, curious about the GPU memory requirements.
Thank you for this awesome work! @Richarizardd
Hi @usuyama - thank you for the comment!
For DINO pretraining - the GPU requirements are similar to those in the DINO paper. Practically, training ViT-B with the default hyper-parameters (batch size = 1024, mixed precision) needs 8xA100 80GB GPUs at 128 images per GPU (which takes up ~50GB of GPU memory per card). For ViT-S I have not measured the requirements, but its memory footprint is much smaller, so one could likely get away with 4xA100 80GB GPUs at a batch size of 256 images per GPU. Per Figure 9 in the DINO paper, you could potentially get away with even smaller batch sizes and it would still perform pretty well.
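The batch-size arithmetic above can be sketched in a few lines. The GPU counts and batch sizes come from this thread; the helper function itself is just an illustration:

```python
# Back-of-the-envelope check of the DINO pretraining setup described
# above. The numbers (global batch size, GPU count) come from the
# comment; the helper name is an illustrative assumption.

def per_gpu_batch(global_batch_size: int, num_gpus: int) -> int:
    """Images each GPU processes per step when the global batch is
    split evenly across GPUs (as in data-parallel training)."""
    assert global_batch_size % num_gpus == 0, "batch must split evenly"
    return global_batch_size // num_gpus

# ViT-B default: 1024 images over 8x A100 80GB -> 128 images per GPU,
# which reportedly uses ~50GB of the 80GB on each card.
print(per_gpu_batch(1024, 8))  # 128

# A hypothetical ViT-S run on 4 GPUs at 256 images/GPU keeps the same
# global batch of 1024, matching the default schedule.
print(4 * 256)  # 1024
```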
For weakly-supervised learning, one should be able to run experiments on a conventional workstation with NVIDIA 2080 Ti / 3090 GPUs. Preparing pre-extracted features with CLAM (using the default batch size of 256 for reading images via OpenSlide) needs only 6-10 GB of GPU memory. Since all the features are pre-extracted and the models are relatively lightweight, running the MIL baselines needs just 3-6 GB of GPU memory (with a batch size of 1; the exact requirement depends on the architecture).
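As a rough illustration of why MIL over pre-extracted features is so cheap, the per-slide feature matrix is tiny compared to the raw pixels. The patch count and feature dimension below are assumed for the example, not taken from the repo:

```python
# Rough estimate of per-slide feature storage. Assumes float32
# features; the 10,000 patches and 1024-dim features below are
# illustrative assumptions, not measurements from CLAM.

def features_mb(num_patches: int, feat_dim: int, bytes_per_val: int = 4) -> float:
    """Size in MB of a [num_patches, feat_dim] float feature matrix."""
    return num_patches * feat_dim * bytes_per_val / 1024**2

# e.g. a WSI tiled into 10,000 patches with 1024-dim features:
print(round(features_mb(10_000, 1024), 1))  # 39.1 (MB)
```

At a few tens of MB per slide, a whole cohort of feature bags fits comfortably in the memory budget quoted above, which is why a batch size of 1 slide is enough for the MIL baselines.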
Thank you! @Richarizardd
This info is really useful.