jaywonchung/BERT4Rec-VAE-Pytorch

The BERT4Rec model seems to be underfit in the reported results.

asash opened this issue · 0 comments

asash commented

Hi,
I have been experimenting with the original implementation of BERT4Rec, using the same sampling strategy as you do.

When I trained the original implementation for 1 hour, I got the following results:
R@1: 0.341391
R@5: 0.661589
R@10: 0.765728
NDCG@5: 0.512567
NDCG@10: 0.546344

This is well aligned with what you reported in the table.
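For reference, here is how I compute these per-user metrics (a minimal sketch; `recall_and_ndcg_at_k` is my own helper name, and I am assuming the usual BERT4Rec evaluation setup where the true item is ranked against a set of sampled negatives):

```python
import math

def recall_and_ndcg_at_k(rank, k):
    """Metrics for one user, given the 0-based rank of the true item
    among the scored candidates. It is a hit iff rank < k; with a
    single relevant item, DCG reduces to 1 / log2(rank + 2)."""
    hit = rank < k
    recall = 1.0 if hit else 0.0
    ndcg = 1.0 / math.log2(rank + 2) if hit else 0.0
    return recall, ndcg

# Example: true item ranked 3rd (rank index 2) among the candidates --
# it counts toward R@5 / NDCG@5 but not toward R@1.
r1, _ = recall_and_ndcg_at_k(2, 1)
r5, n5 = recall_and_ndcg_at_k(2, 5)
print(r1, r5, n5)  # 0.0 1.0 0.5
```

The reported numbers are then just these values averaged over all test users.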

However, when I gave the model 16 hours to train, I got much better results:
R@1: 0.405960
R@5: 0.714570
R@10: 0.803974
NDCG@5: 0.571801
NDCG@10: 0.600875

I think it would be worth re-evaluating your model with more epochs on the ML-1M dataset.

For the ML-20M dataset, my results are well aligned with yours:
R@1: 0.613104
R@5: 0.887872
R@10: 0.945339
NDCG@5: 0.763871
NDCG@10: 0.782702