I trained the model myself, but I can't reproduce the results from the paper
empty2enrich opened this issue · 7 comments
Are there any tricks here?
My result:
MAE: 5.0
Pitch: 5.8
Yaw: 4.9
Roll: 4.4
Trained on the 300W-LP dataset, tested on AFLW2000 (the metric I used is sketched below).
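For reference, the numbers above are per-angle mean absolute errors over the AFLW2000 test set, plus their average (MAE). A minimal sketch of that metric, assuming predictions and ground truth are already Euler angles in degrees:

```python
# Minimal sketch of the evaluation metric assumed here:
# mean absolute error per Euler angle (degrees) and their average (MAE).
import numpy as np

def pose_mae(pred_deg: np.ndarray, gt_deg: np.ndarray) -> dict:
    """pred_deg, gt_deg: arrays of shape (N, 3) holding [yaw, pitch, roll] in degrees."""
    err = np.abs(pred_deg - gt_deg)        # per-sample, per-angle absolute error
    yaw, pitch, roll = err.mean(axis=0)    # average each angle over the test set
    return {"yaw": yaw, "pitch": pitch, "roll": roll, "mae": err.mean()}

# Example with dummy numbers (not real results):
print(pose_mae(np.array([[10.0, -5.0, 2.0]]), np.array([[12.0, -4.0, 1.0]])))
```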
An MAE of 5.0 is a long way off. Check your data, its preprocessing, and the training configuration; a quick label-pipeline sanity check is sketched below.
Rel: #14
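One simple sanity check on the label pipeline: load a ground-truth `.mat` file and confirm the angles look reasonable. This is only a sketch, assuming the common 300W-LP / AFLW2000 layout where `Pose_Para` stores `[pitch, yaw, roll, ...]` in radians; the file name below is hypothetical.

```python
# Sanity-check ground-truth pose labels (sketch, assuming the usual .mat layout).
import numpy as np
import scipy.io as sio

def read_pose_degrees(mat_path: str) -> np.ndarray:
    mat = sio.loadmat(mat_path)
    pitch, yaw, roll = mat["Pose_Para"][0][:3]   # stored in radians
    return np.degrees(np.array([yaw, pitch, roll]))

angles = read_pose_degrees("AFLW2000/image00002.mat")   # hypothetical file name
# The usual AFLW2000 protocol discards samples with any angle outside [-99, 99] degrees.
if np.any(np.abs(angles) > 99.0):
    print("sample would be excluded from evaluation:", angles)
else:
    print("yaw/pitch/roll (deg):", angles)
```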
I don't quite understand. I used the source code for training without any modification, so why do I need to check the data?
Thank you for your reply!
I have updated the default training settings. This should help you reach the reported results more easily (in my training runs they were even surpassed). Let me know if you still struggle after using them.
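The exact values live in the repository's training script; purely to illustrate the kind of settings being referred to, here is a sketch where every value is a placeholder, not the repository's actual updated default:

```python
# Illustrative placeholders only -- not the repository's actual updated defaults.
from dataclasses import dataclass

@dataclass
class TrainConfig:
    dataset: str = "300W-LP"      # training set used in the paper protocol
    test_set: str = "AFLW2000"    # evaluation set
    batch_size: int = 64          # placeholder value
    lr: float = 1e-4              # placeholder value
    num_epochs: int = 30          # placeholder value
    scheduler_step: int = 10      # placeholder: epochs between LR decays
    scheduler_gamma: float = 0.5  # placeholder: LR decay factor

print(TrainConfig())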
I used the adjusted training settings, added masked-face images, and re-trained the model, and I got about the same accuracy as in the paper. Thank you.
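One simple way to add masked-face robustness during training is to paint a synthetic mask over the lower part of the face crop with some probability. This is only an illustration of the idea, not the exact augmentation used above:

```python
# Sketch: synthetic "mask" occlusion over the mouth/nose region of an RGB face crop.
import random
from PIL import Image, ImageDraw

def random_mask_occlusion(img: Image.Image, p: float = 0.3) -> Image.Image:
    """With probability p, draw a light-gray rectangle roughly where a mask would sit."""
    if random.random() > p:
        return img
    img = img.copy()
    w, h = img.size
    draw = ImageDraw.Draw(img)
    draw.rectangle(
        [int(0.15 * w), int(0.55 * h), int(0.85 * w), int(0.95 * h)],
        fill=(200, 200, 200),
    )
    return img
```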
I also merged 6DRepNet and WHENet to generate a new model that allows 360° estimation. 👍
https://github.com/PINTO0309/DMHead
Yaw: 3.3193, Pitch: 4.9063, Roll: 3.3687, MAE: 3.8648
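I don't know the exact merge logic used there, but one plausible way to combine a frontal model (6DRepNet-style, accurate but limited yaw range) with a full-range model (WHENet-style, 360° yaw) is to run the full-range model first and fall back to the more precise frontal model whenever the face is roughly frontal. This is NOT the actual DMHead implementation, just a sketch of the idea:

```python
# Sketch: fuse a narrow-range and a full-range head-pose model by yaw range.
from typing import Callable, Tuple

Pose = Tuple[float, float, float]  # (yaw, pitch, roll) in degrees

def fused_pose(frontal_model: Callable[[object], Pose],
               fullrange_model: Callable[[object], Pose],
               image: object,
               frontal_yaw_limit: float = 90.0) -> Pose:
    yaw_full, pitch_full, roll_full = fullrange_model(image)
    if abs(yaw_full) < frontal_yaw_limit:
        # Face is roughly frontal: trust the more precise narrow-range model.
        return frontal_model(image)
    # Face is turned away: only the full-range model is valid here.
    return (yaw_full, pitch_full, roll_full)
```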
@PINTO0309 How did you train the light 6DRepNet model? Currently it's slow and big.
https://github.com/PINTO0309/DMHead/releases
I have not published the .pth files because I spent quite a bit of money on NVIDIA A100 time for a personal hobby project. The ONNX model is publicly available, so feel free to use it.
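A minimal sketch of loading the published ONNX model with onnxruntime follows. The file name, input size, normalization, and output layout are assumptions here; check the model metadata and the release notes in the DMHead repository.

```python
# Sketch: run the released ONNX model with onnxruntime (model-specific I/O is assumed).
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("dmhead.onnx", providers=["CPUExecutionProvider"])  # hypothetical file name
input_name = sess.get_inputs()[0].name
print("expected input shape:", sess.get_inputs()[0].shape)

# Assumed preprocessing: a face crop resized to the model's input size, NCHW, float32.
face = np.random.rand(1, 3, 224, 224).astype(np.float32)   # dummy input for illustration
outputs = sess.run(None, {input_name: face})
print([o.shape for o in outputs])                           # inspect the output layout
```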