cszhangzhen/HGP-SL

About the accuracy of some baselines

lilybud opened this issue · 9 comments

Hi~ your paper is really amazing work!
After reading it, I became even more convinced of this. But now I have a question about the accuracy of one baseline, SAGPool. The original paper for that model reports a noticeably lower accuracy than yours, not just slightly lower. Could you give me some tips about this? Thank you!

Hi,

Thanks for your interest in our work.

We followed the EigenPool paper and randomly split the datasets into 0.8 / 0.1 / 0.1 for training, validation and testing. The SAGPool paper used 10-fold cross-validation, which is in fact more reasonable. This is why the baseline results we report are higher than those in the original paper.
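For illustration, here is a minimal sketch of that random splitting, assuming a PyTorch Geometric `TUDataset`; the dataset name and batch size are just placeholders, not the exact code from this repository:

```python
from torch.utils.data import random_split
from torch_geometric.datasets import TUDataset
from torch_geometric.loader import DataLoader

# Load a TU benchmark dataset (the name "DD" is only an example).
dataset = TUDataset(root='data/DD', name='DD')

# Random 0.8 / 0.1 / 0.1 split into training, validation and test sets.
num_training = int(len(dataset) * 0.8)
num_val = int(len(dataset) * 0.1)
num_test = len(dataset) - num_training - num_val
training_set, validation_set, test_set = random_split(
    dataset, [num_training, num_val, num_test])

train_loader = DataLoader(training_set, batch_size=128, shuffle=True)
val_loader = DataLoader(validation_set, batch_size=128, shuffle=False)
test_loader = DataLoader(test_set, batch_size=128, shuffle=False)
```

Each run draws a fresh random split, so repeating this 10 times gives 10 independent random splits rather than 10 folds.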

OK, so you mean that you did not change the experimental setting of the SAGPool model, and only chose the 10-fold validation splitting. If convenient, could you post this part of the code? I am following the SAGPool model but do not get this higher accuracy.
Or could you send it to me by email? Please~
Thanks for your time.

All our experiments are conducted with the random splitting of datasets into 0.8 / 0.1 / 0.1 for training, validation and testing; NOT the 10-fold validation splitting.

Sorry~ maybe I do not understand. I remember that the experimental setting in the original SAGPool paper also splits the dataset into 0.8 / 0.1 / 0.1. What is the difference between your setting and the original one?

The key difference lies between 10 random splittings and 10-fold splitting. With random splittings, there is no guarantee that all samples are tested across the 10 runs. With 10-fold splitting, every sample is tested exactly once over the 10 runs.
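To make the distinction concrete, here is a small sketch using scikit-learn splitters; the splitter choices and sizes are only illustrative, not our experiment code:

```python
import numpy as np
from sklearn.model_selection import KFold, ShuffleSplit

indices = np.arange(1000)  # pretend we have 1000 graphs

# 10 random splittings: each run draws a fresh 10% test set, so some graphs
# may never appear in any test set across the 10 runs.
random_splitter = ShuffleSplit(n_splits=10, test_size=0.1, random_state=0)
tested_random = set()
for _, test_idx in random_splitter.split(indices):
    tested_random.update(test_idx)

# 10-fold splitting: the test folds are disjoint and cover the whole dataset,
# so every graph is tested exactly once over the 10 runs.
kfold_splitter = KFold(n_splits=10, shuffle=True, random_state=0)
tested_kfold = set()
for _, test_idx in kfold_splitter.split(indices):
    tested_kfold.update(test_idx)

print(len(tested_random))  # usually < 1000: coverage is not guaranteed
print(len(tested_kfold))   # always 1000: every sample tested once
```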

I think the difference is very clear.

Thank you very much~ I got it!
Lastly~ thanks for your time~

You are welcome. If you have any questions, please let me know.

Thank you.

Hi~ I am back again~ haha~
Can I add you on WeChat to ask for some tips on training the model~

Sure, you can leave your WeChat ID and I will add you.