DeNA/Chainer_Realtime_Multi-Person_Pose_Estimation

Slow training speed

Opened this issue · 3 comments

Hi @leetenki, thanks for the implementation

I am trying to train the network from scratch on the COCO dataset with a Tesla K40m and the default parameters. However, the training speed seems rather slow, around 0.17 iters/sec. At this rate it would take several weeks to reach the 440,000 iterations.
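A quick back-of-envelope check of that estimate, assuming the reported 0.17 iters/sec holds for the full run:

```python
# Rough ETA at the reported throughput (just arithmetic, not a benchmark).
total_iters = 440000      # default iteration count mentioned above
iters_per_sec = 0.17      # observed rate on the Tesla K40m
eta_days = total_iters / iters_per_sec / 86400
print(f"~{eta_days:.0f} days")  # roughly 30 days, i.e. several weeks
```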

Is this training speed considered normal? Thank you!

Hi, we have changed the code. I think you can now train the model faster.

Same problem here! Is it possible to train on multiple GPUs?
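For what it's worth, Chainer itself supports data-parallel training across GPUs through `ParallelUpdater` (and `MultiprocessParallelUpdater`); whether the repo's training script exposes this is a separate question. A minimal sketch with a dummy model and dataset (not the actual pose network or COCO loader) could look like:

```python
# Minimal sketch of data-parallel training in Chainer with ParallelUpdater.
# The Linear classifier and random TupleDataset below are placeholders;
# swap in the repo's pose network and COCO data loader.
import numpy as np
import chainer
from chainer import training
from chainer.training import extensions

model = chainer.links.Classifier(chainer.links.Linear(32, 10))  # placeholder model
dataset = chainer.datasets.TupleDataset(
    np.random.rand(100, 32).astype(np.float32),
    np.random.randint(0, 10, size=100).astype(np.int32))

optimizer = chainer.optimizers.MomentumSGD(lr=0.01)
optimizer.setup(model)

train_iter = chainer.iterators.SerialIterator(dataset, batch_size=10)

# Each minibatch is split across the listed devices; 'main' is required,
# and additional entries add more GPUs.
updater = training.ParallelUpdater(
    train_iter, optimizer, devices={'main': 0, 'second': 1})

trainer = training.Trainer(updater, (1000, 'iteration'), out='result')
trainer.extend(extensions.LogReport(trigger=(100, 'iteration')))
trainer.run()
```

Note that throughput only scales if the batch size is large enough to keep all listed GPUs busy.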

It's showing 0.28 iters/sec on a GTX 1080 with an ETA of 12 days. What is the typical training setup?