TUM-DAML/gemnet_pytorch

Question about pretrained weights

okin1234 opened this issue · 2 comments

Hi there,

Thank you for providing the codebase.
I am using the GemNet-T pretrained weights, and I get better performance on my downstream task. However, I can't find any information on how the GemNet-T weights were pretrained.

Could you tell me which dataset and target were used for pretraining?

That's great to hear!

The published weights were pretrained on COLL, with the usual force and energy loss mixture, i.e. with the energy as a target and forces predicted via backpropagation.
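For context, here is a minimal sketch of what such an energy/force loss mixture typically looks like, with forces obtained as the negative gradient of the predicted energy with respect to atom positions. The names (`model`, `rho`) and the loss weighting are illustrative assumptions, not the repo's actual training code:

```python
import torch
import torch.nn.functional as F

def energy_force_loss(model, positions, energy_target, force_target, rho=0.01):
    """Hypothetical energy/force loss mixture (not the repo's API).

    Assumes `model` maps atom positions to a per-molecule energy.
    """
    positions.requires_grad_(True)        # enable autograd w.r.t. positions
    energy_pred = model(positions)        # scalar energy per molecule
    # Forces = -dE/dr, predicted via backpropagation through the model;
    # create_graph=True so the force loss itself remains differentiable.
    forces_pred = -torch.autograd.grad(
        energy_pred.sum(), positions, create_graph=True
    )[0]
    energy_loss = F.mse_loss(energy_pred, energy_target)
    force_loss = F.mse_loss(forces_pred, force_target)
    # Weighted mixture of the two objectives
    return rho * energy_loss + (1 - rho) * force_loss
```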

@gowithdaflo Please correct me if any of this is wrong. :)

Actually, the weights we published are from the model(s) trained on the QM7-X dataset, with 3.2M samples in the training set. Since this dataset is significantly larger than COLL, that might be why you are getting better performance.