gaozhihan/PreDiff

How much time will it take to train the model on a single A100? (The dataset size is 800*800, but we will shrink it to 200*200)

Crestina2001 opened this issue · 1 comment


Thank you for your interest in our research and for asking this question. For details on our reported training times, please refer to response #12. We have not tested training on an A100 GPU specifically, but you may infer the time cost based on our results using Tesla A10G GPUs.

The training time also depends on the degree of downsampling applied in the latent space. In our experiments on SEVIR-LR data, we downsampled the inputs by a factor of $8\times8$ (from the original $128\times128$ inputs to $16\times16$ latents). A larger downsampling factor results in faster training but poorer performance.

```yaml
latent_shape: [6, 16, 16, 64]
```
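
If it helps to reason about your own setup, here is a minimal sketch (the helper is hypothetical, not from the PreDiff codebase) of how the latent spatial size follows from the input size and the downsampling factor; the $200\times200$ input size is the one from your question:

```python
# Minimal sketch (hypothetical helper, not from the PreDiff codebase) showing
# how the latent spatial size follows from the spatial downsampling factor.

def latent_spatial_shape(input_hw, factor=8):
    """Return the (H, W) of the latent tensor after spatial downsampling."""
    h, w = input_hw
    # Encoder stages halve the resolution, so side lengths should divide evenly.
    assert h % factor == 0 and w % factor == 0, "pad or crop the inputs first"
    return h // factor, w // factor

# SEVIR-LR setup quoted above: 128x128 inputs -> 16x16 latents.
print(latent_spatial_shape((128, 128)))  # (16, 16)

# The 200x200 inputs from the question would give 25x25 latents with the
# same 8x8 factor, since 200 is divisible by 8.
print(latent_spatial_shape((200, 200)))  # (25, 25)
```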