Is there anything wrong with the d1 checkpoint?
Closed this issue · 3 comments
GMN23362 commented
sadransh commented
Sorry, I don't have access to our checkpoint backups; as soon as I get access, I will update you.
All our test results are reported in the paper. If a specific test is missing there, it means we did not run it, as our GPU access was limited.
sadransh commented
Thanks for mentioning this, the link to that checkpoint was incorrect. Just updated the link.
About your other question: theoretically, the precision for those (unperformed) experiments would fall somewhere between that of the smaller and larger models.
At the time we trained model D7, we only had access to 32GB GPUs for a limited time, so if the models were trained longer on 40GB cards, they would probably reach slightly better F1/mAP scores.