zczcwh/PoseFormer

Cost of Training

Opened this issue · 3 comments

Excuse me. I am training with two 3090s and following your experiment settings. I notice that each GPU uses about 4 GB of memory and each epoch takes about 60 minutes. I don't know whether this is normal or not.

Hello, have you solved this problem? Each epoch takes a lot of time; do you know why?

Seems just like mine. With the provided commands, my 2080S uses about 8 GB of memory and takes about 60 minutes per epoch. I think changing settings such as the learning rate might make it go faster.

Same for me. I used 8 2080Ti GPUs to train with f=81 and batch size=1024, as the original code sets, and it takes about 50 minutes per epoch.
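For anyone who wants to check whether those 50-60 minutes per epoch are spent on the GPU or on data loading, here is a minimal PyTorch timing sketch. It is only an illustration under assumed names: `model`, `optimizer`, and `train_loader` are placeholders, and the MSE loss is a stand-in for the actual pose loss; this is not PoseFormer's training loop.

```python
# Minimal sketch for measuring where an epoch's time goes.
# `model`, `optimizer`, and `train_loader` are placeholders for your own
# training objects -- they are not PoseFormer's actual variable names.
import time
import torch

def profile_one_epoch(model, optimizer, train_loader, device="cuda"):
    data_time, gpu_time = 0.0, 0.0
    end = time.time()
    for inputs_2d, targets_3d in train_loader:
        data_time += time.time() - end          # time spent waiting on the DataLoader

        inputs_2d = inputs_2d.to(device, non_blocking=True)
        targets_3d = targets_3d.to(device, non_blocking=True)

        start = time.time()
        optimizer.zero_grad()
        # MSE here is only a stand-in for the real pose loss
        loss = torch.nn.functional.mse_loss(model(inputs_2d), targets_3d)
        loss.backward()
        optimizer.step()
        torch.cuda.synchronize()                # wait for the GPU before timing
        gpu_time += time.time() - start

        end = time.time()

    peak_mem = torch.cuda.max_memory_allocated(device) / 1024 ** 3
    print(f"data loading: {data_time:.1f}s, forward/backward: {gpu_time:.1f}s, "
          f"peak GPU memory: {peak_mem:.1f} GB")
```

If data loading turns out to dominate, increasing the number of DataLoader workers may help more than changing the learning rate or batch size.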