Reproduction Issue
wayer96 opened this issue · 4 comments
Hi, I'm quite interested in your work "Low-Dimensional Hyperbolic Knowledge Graph Embeddings" and tried to reproduce the experimental results strictly following the best hyperparameters table you posted in the paper. However, my results on YAGO3-10 are always about 5%-6% lower than the reported ones. I wonder if there are other hyper-parameters that need to be adjusted (like --drop_out or --reg)? Thanks a lot!
@ines-chami could you help answer this question related to the hyper-parameter settings? Thanks!
Hi,
The numbers reported in the paper for YAGO were obtained using the PyTorch implementation (https://github.com/HazyResearch/KGEmb). The best hyperparameters can be found in this folder: https://github.com/HazyResearch/KGEmb/tree/master/examples. We noticed that performance varied slightly from one implementation to another, and that may explain the 5% difference with TensorFlow.
Thanks,
Ines
Thanks for the response, Ines!
Closing this issue for now, but it would be good if you could help run a hyper-parameter sweep for the TensorFlow implementation, Ines (something along the lines of the sketch below).
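For anyone who wants to try such a sweep themselves, here is a minimal grid-search sketch in Python. `train_and_eval` is a hypothetical stand-in for whichever implementation's training entry point you use (PyTorch or TensorFlow), and the dropout/regularization grids are purely illustrative values, not the paper's settings.

```python
import itertools

def train_and_eval(dropout, reg):
    """Hypothetical stand-in: train a model with the given dropout and
    regularization weight, then return the validation MRR.
    Replace the body with a call into the actual training code."""
    # Placeholder value so the sketch runs end to end; swap in real training.
    return 0.0

# Illustrative grids only; not the paper's reported hyperparameters.
dropout_grid = [0.0, 0.1, 0.2]
reg_grid = [0.0, 0.05, 0.1]

best = None
for dropout, reg in itertools.product(dropout_grid, reg_grid):
    mrr = train_and_eval(dropout=dropout, reg=reg)
    if best is None or mrr > best[0]:
        best = (mrr, {"dropout": dropout, "reg": reg})

print("Best validation MRR:", best[0], "with settings", best[1])
```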
Thanks! I have used the PyTorch implementation and found the best hyperparameter settings in the examples folder. However, for YAGO3-10 there is only a hyperparameter setting for RotH at dimension 32. Looking forward to more complete parameter settings for YAGO3-10 in the future, thanks!