brjathu/LART

configuration problem

zoeyhaiyan1 opened this issue · 2 comments

Hi, author. I want to train with the current configuration. Could you tell me how much memory it requires and how long training takes with this configuration?

The current configuration requires about 18GB of GPU memory and takes approximately a day to train on V100s. If you want to train it on a single node, it might take around 4 days. If time is a bottleneck, you can choose to train only with ava_train, without kinetics_train. That approach should reach about 42 mAP in a few hours. I hope this helps! Please feel free to ask if you have any further questions.
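For readers looking for where to make that change: a dataset selection like this typically lives in the training config. The fragment below is only an illustrative sketch — the key names (`datasets`, `train`) are hypothetical and not taken from the LART repo's actual config schema, so check the repo's own config files for the real keys.

```yaml
# Hypothetical config sketch — key names are illustrative, not LART's actual schema.
datasets:
  # Keep only ava_train; dropping kinetics_train trades accuracy (~42 mAP)
  # for a much shorter training run (a few hours instead of ~a day on V100s).
  train: [ava_train]
```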

Closing this issue since it has been inactive for a while; please feel free to reopen it if you have further questions.