Hello, do you have a smaller version of the InterHand dataset?
zy1996829 opened this issue · 4 comments
Hello, do you have a smaller version of the InterHand dataset? It takes me 7 hours to train each epoch. I am training on 4 GPUs with a batch size of 16.
Hi, could you check this?
https://github.com/facebookresearch/InterWild#test
You can use smaller set of InterHand
I have checked and downloaded it. I found it is a txt file containing 121573 indices, and I want to know how to use it. Do these serial numbers correspond to the indices of the images in the dataset? Thanks!
You can follow this https://github.com/facebookresearch/InterWild/blob/29f506b8bb982385c3332e22c2982d61557e3ef3/data/InterHand26M/InterHand26M.py#L52
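The linked loader can be adapted to keep only the samples listed in the subset file. A minimal sketch of the idea, assuming the txt file holds one integer index per line (the function names and the `datalist` variable here are hypothetical, not from the repo):

```python
# Sketch: filter a dataset's sample list down to the indices named
# in a subset txt file (one integer index per line, as assumed).
def load_subset_indices(path):
    """Read the subset file into a set of integer indices."""
    with open(path) as f:
        return {int(line.strip()) for line in f if line.strip()}

def filter_datalist(datalist, subset_path):
    """Keep only the datalist entries whose position is in the subset."""
    keep = load_subset_indices(subset_path)
    return [item for i, item in enumerate(datalist) if i in keep]
```

In practice you would apply such a filter where the dataset class builds its `datalist` (around the linked line in `InterHand26M.py`), so the DataLoader only ever sees the reduced set and epochs shrink accordingly.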
Thanks to your advice, I successfully made the modifications, and the training time is now greatly reduced. Thank you very much.