Could you provide more training details on Waymo dataset?
0w0h0y opened this issue · 4 comments
Since the Waymo dataset is huge, it seems impossible to preload all training samples into memory and save them as one file. So I tried to save each tracklet's info into a separate file. However, data loading still seems too slow during training with batch size 64. I wonder how many epochs and how much training time are needed to reproduce the results in the paper.
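The per-tracklet approach described above could be sketched roughly as follows (a minimal illustration, not the repo's actual data pipeline; `LazyTrackletStore` and the file layout are hypothetical):

```python
# Hypothetical sketch: save each tracklet to its own file, then load lazily
# during training instead of preloading the whole Waymo set into memory.
import os
import pickle
import tempfile
from functools import lru_cache

class LazyTrackletStore:
    """Maps tracklet index -> on-disk pickle file, loaded on demand."""

    def __init__(self, root):
        self.root = root
        self.files = sorted(os.listdir(root))

    def __len__(self):
        return len(self.files)

    @lru_cache(maxsize=128)  # cache recently used tracklets to cut disk reads
    def __getitem__(self, idx):
        with open(os.path.join(self.root, self.files[idx]), "rb") as f:
            return pickle.load(f)

# Demo: write two dummy tracklets, then read them back lazily.
root = tempfile.mkdtemp()
for i, info in enumerate([{"id": "trk0", "frames": 12}, {"id": "trk1", "frames": 7}]):
    with open(os.path.join(root, f"tracklet_{i:04d}.pkl"), "wb") as f:
        pickle.dump(info, f)

store = LazyTrackletStore(root)
print(len(store), store[0]["id"])  # → 2 trk0
```

With a store like this, a dataset class can index tracklets without ever holding the full set in memory; the `lru_cache` keeps hot tracklets resident to reduce repeated disk reads.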
Thanks for your reply!
We trained M^2-Track on Waymo using 2 A100-80G GPUs without data preloading, and the batch size for each GPU is set to 512. It took around 5 epochs to converge.
Thanks. And what is preload_offset set to for Waymo? It affects the data loading speed.
I just noticed that the preload_offset in the provided config file in this repo is mistakenly set to 60. It is actually set to 10 in all our experiments (our default setting).
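For reference, the corrected setting in a Waymo training config might look like this (a sketch; the exact key names and file layout follow the repo's own config files, which are not reproduced here):

```yaml
# Waymo training config (sketch) -- preload_offset should be 10, not 60
preload_offset: 10
batch_size: 512   # per GPU, as used on 2x A100-80G above
```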
Thanks.