Training time for FB15k-237
Closed this issue · 1 comment
Jason-Mason commented
Hello, thank you for the work.
I am trying to train the model on the FB15k-237 dataset with the default hyperparameter settings on a 2080 Ti GPU.
However, training is estimated to take approximately 5 hours per epoch.
Could you share the training time from your experiments and confirm whether my situation is reasonable?
yzhangee commented
Yes, it generally takes several hours. FB15k-237 is denser than the other datasets, resulting in much larger subgraphs. Considering that the 2080 Ti is a somewhat older GPU, 5 hours per epoch makes sense.
If you have a larger GPU, such as an A100, you can increase the batch size and testing batch size for faster training.
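As a rough back-of-envelope sketch of why a larger batch size helps: FB15k-237 has about 272k training triples, so a bigger batch means fewer batches per epoch, and epoch time drops roughly proportionally as long as per-batch time stays about constant (i.e., the GPU is not yet saturated). The batch sizes and per-batch time below are illustrative assumptions, not measured values from this repository.

```python
def epoch_hours(num_triples, batch_size, secs_per_batch):
    """Estimated epoch time in hours, assuming per-batch time stays
    roughly constant while the GPU is not yet saturated."""
    num_batches = -(-num_triples // batch_size)  # ceiling division
    return num_batches * secs_per_batch / 3600

triples = 272_115  # FB15k-237 training triples

# Hypothetical: ~1.06 s per batch at batch size 16 gives ~5 h/epoch.
print(round(epoch_hours(triples, 16, 1.06), 2))  # → 5.01

# Quadrupling the batch size (e.g. on an A100) at the same per-batch
# time cuts the epoch to roughly a quarter.
print(round(epoch_hours(triples, 64, 1.06), 2))  # → 1.25
```

In practice the speedup is sublinear, since per-batch time grows once the larger batches saturate GPU compute and memory bandwidth.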