nv-nguyen/gigapose

How many epochs did you train for each network?

Opened this issue · 1 comment

Hi,

Thanks for your great work!

I noticed that the maximum number of epochs set in the config is 1000, so I assume you stopped training manually. How many epochs are needed before performance saturates? If you trained is_net and ae_net separately, how many epochs did you train each network for?

Looking forward to your reply.

Hi,

I asked a similar question and received a response from the author via email, so I am passing it along here on their behalf.
His response was as follows:

About GigaPose, I trained on 4 GPUs for less than 10 hours (not even 1 full epoch).
The checkpoint that I used for evaluation and shared in the repo is at 10,000 iterations (approximately 2-4 hours).
It seems that training longer does not improve performance, but I have not tried to debug/understand this carefully yet.
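For anyone wanting to reproduce that stopping point, a minimal sketch follows. It assumes a PyTorch Lightning setup like the repo's and simply caps the number of optimizer steps rather than relying on the epoch limit; the exact config keys and class names in GigaPose may differ, and the model/datamodule below are placeholders.

```python
# Sketch only, not the repo's actual training script: with PyTorch Lightning,
# training stops at `max_steps` even though `max_epochs` is far from being reached.
import pytorch_lightning as pl

trainer = pl.Trainer(
    accelerator="gpu",
    devices=4,          # the author mentions training on 4 GPUs
    max_epochs=1000,    # upper bound from the config, never reached in practice
    max_steps=10_000,   # stop at ~10,000 iterations, matching the shared checkpoint
)
# trainer.fit(model, datamodule)  # placeholder model/datamodule, not GigaPose's classes
```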