Experimental details
Demirrr opened this issue · 4 comments
Hello,
Thank you for the great work and for making the implementation publicly available [1]. I enjoyed reading the findings of your work. I was wondering whether you could share your opinion on the following matters:
1- Is there any plan to share pretrained models?
2- Could you elaborate on runtimes?
3- What are the ranges of the models' hyperparameters?
4- How many unique hyperparameter configurations were tested?
I could not find enough information to repeat the experiments, as Section A.3 reports only the number of epochs.
Cheers
[1] Low-Dimensional Hyperbolic Knowledge Graph Embeddings. https://arxiv.org/pdf/2005.00545.pdf
@chamii22 @ines-chami for more experiment-related details and information :)
Cheers @DualityGap :)
Hi @Demirrr
- 1) Not at the moment, but the experiments should be reproducible with the code.
- 2) I do not have exact runtimes, but training took a few hours; YAGO took around a day to train.
- 3) & 4) We mostly searched over the optimizer, negative sampling, and learning rate. The combinations were:
- Adam + negative sampling + lr in [5e-4, 1e-3, 5e-3]
- Adagrad + no negative samples + lr in [5e-2, 1e-1]
Batch size was searched over [100, 250, 500, 1000, 2000, 4000].
Note: we did not try all possible combinations; we only ran experiments for hyperparameter configurations that seemed promising on the validation set (the full grid is sketched below).
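For anyone scripting a reproduction, here is a minimal Python sketch of the search space described above. The dictionary keys (`optimizer`, `neg_sampling`, `learning_rate`, `batch_size`) are illustrative placeholders, not the repository's actual config names:

```python
from itertools import product

# Optimizer / negative-sampling / learning-rate combinations listed above.
optim_lr = (
    [("Adam", True, lr) for lr in [5e-4, 1e-3, 5e-3]]    # Adam + negative sampling
    + [("Adagrad", False, lr) for lr in [5e-2, 1e-1]]    # Adagrad, no negative samples
)
batch_sizes = [100, 250, 500, 1000, 2000, 4000]

# Full cross-product: 5 optimizer/lr settings x 6 batch sizes = 30 candidate
# configurations. Per the note above, not all of these were actually run;
# only the ones that looked promising on the validation set.
search_space = [
    {"optimizer": opt, "neg_sampling": neg, "learning_rate": lr, "batch_size": bs}
    for (opt, neg, lr), bs in product(optim_lr, batch_sizes)
]
print(len(search_space))  # 30
```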
All the best,
Ines Chami
Closing this issue following the detailed response above. Thanks Ines!