prismformore/Multi-Task-Transformer

epoch 

Closed this issue · 1 comment

Hello, thank you for your contribution. I was wondering: do you really need to train for 999999 epochs?

@187185537 No. That "epochs" value does not control the training schedule. The training length is set by the max iteration step here:
https://github.com/prismformore/InvPT/blob/3b70fcc5a4f7053a7e32a9f85da5dda670c18fba/configs/pascal/pascal_vitLp16.yml#L17
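
For reference, here is a minimal sketch of how this pattern typically works. This is not the actual InvPT training code; names such as `max_iter`, `train_loader`, and `train_step` are assumptions for illustration only.

```python
# Minimal sketch (NOT the actual InvPT code): `max_epochs` is set to a very
# large number, and the real stopping criterion is `max_iter` (the max
# iteration step from the config).

def train(train_loader, train_step, max_iter, max_epochs=999999):
    step = 0
    for epoch in range(max_epochs):        # effectively loops "forever"
        for batch in train_loader:
            train_step(batch)              # one optimization step
            step += 1
            if step >= max_iter:           # training ends here, not at max_epochs
                return step
    return step


if __name__ == "__main__":
    # Toy usage: training stops after `max_iter` steps, not after 999999 epochs.
    dummy_loader = [0, 1, 2, 3]
    total_steps = train(dummy_loader, train_step=lambda b: None, max_iter=10)
    print(total_steps)  # -> 10
```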