FengZhenhua/Wing-Loss

How can I reproduce result on 300-W

jimmysue opened this issue · 7 comments

I implemented CNN6 in PyTorch with a similar PDB strategy, but the score on 300-W is around 5.8. Can you share the training code to reproduce the result?
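For anyone else trying to reproduce this: the Wing loss itself is defined in the paper, so here is a minimal plain-Python sketch of it for a single coordinate error (the defaults w=10, epsilon=2 follow the values reported in the paper; this is not the authors' code):

```python
import math

def wing_loss(error, w=10.0, epsilon=2.0):
    """Wing loss for one landmark-coordinate error.

    Behaves like a scaled log near zero (to emphasise small/medium errors)
    and like L1 for large errors. w and epsilon are the paper's reported
    settings; they may need tuning per dataset.
    """
    x = abs(error)
    # C is chosen so the two pieces join continuously at |x| == w.
    C = w - w * math.log(1.0 + w / epsilon)
    if x < w:
        return w * math.log(1.0 + x / epsilon)
    return x - C
```

In a PyTorch training loop the same formula would be applied element-wise to the predicted-minus-target landmark tensor and then averaged.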

@sjzcv Can you share your code? My TensorFlow implementation can't reach your level, and I don't think a different framework alone would lead to such a difference.

@MyYaYa What's your score? Could you share your WeChat so we can discuss? This is my email:
su [dot] jz [at] outlook [dot] com

@sjzcv @MyYaYa Dear both, one of my students just re-implemented my work in TensorFlow. The performance with ResNet50 and L1 loss on AFLW is around 1.58. So I do believe that you should be able to achieve results similar to those reported in the paper using other frameworks. Unfortunately, the training source code is still under an embargo period and I am not allowed to release it at the moment due to policy.

@FengZhenhua Can you share some implementation details for 300-W? The performance on AFLW is easier to achieve.

@FengZhenhua Can you share the details of CNN7's architecture? Are the output channel numbers for each conv layer [64, 64, 64, 64, 128, 256, 512]?
When I estimate the parameter count of the architecture above, it comes out to 14.73MB, which differs from the 46MB model size claimed in the paper.
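For comparison, here is a quick sketch of how such an estimate might be made. It counts only the conv weights and biases, assuming 3x3 kernels and an RGB input; the fully connected layers are not counted (their sizes are unknown), which by itself can explain a large gap between estimates:

```python
def conv_param_count(channels, in_channels=3, k=3):
    """Parameter count (weights + biases) of a plain conv stack.

    channels: output channels per conv layer, in order.
    k: assumed square kernel size (3x3 here, an assumption).
    Fully connected layers are NOT counted, so this is a lower bound.
    """
    total = 0
    c_in = in_channels
    for c_out in channels:
        total += k * k * c_in * c_out + c_out  # kernel weights + biases
        c_in = c_out
    return total

params = conv_param_count([64, 64, 64, 64, 128, 256, 512])
size_mb = params * 4 / 2**20  # float32, bytes -> MiB: roughly 6.3MB
```

With these assumptions the conv layers alone account for about 6.3MB, so most of any 14.73MB or 46MB figure would have to come from the fully connected head.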

@sjzcv I sent an email to you; happy to be talking with you.

@sjzcv In fact, we do not have any special settings for 300-W. If you can get good results on AFLW, you should be able to achieve similar performance on 300-W. The only difference might be the bounding boxes provided with the datasets: the AFLW boxes are loose and square, while the 300-W boxes are tight rectangles. What I did was extend the short side of each rectangle to produce a square, then enlarge it by a factor of 1.2, for all 300-W training and test images.
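The box preprocessing described above can be sketched as follows (the (x1, y1, x2, y2) coordinate convention and expanding around the box center are my assumptions, not stated in the comment):

```python
def square_and_expand(x1, y1, x2, y2, scale=1.2):
    """Extend the short side of a tight rectangle to a square around the
    same center, then enlarge that square by `scale` (1.2 per the comment)."""
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    side = max(x2 - x1, y2 - y1) * scale  # long side sets the square size
    half = side / 2.0
    return cx - half, cy - half, cx + half, cy + half
```

The resulting square would then be used to crop (and pad, where it falls outside the image) before resizing to the network input size.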

For CNN7 the numbers of kernels (output channels) per conv layer are [64, 128, 256, 512, 512, 512, 512].