Skipping scheduler for EquiformerV2 with OC22
sedaoturak opened this issue · 6 comments
Hello!
There was an issue regarding skipping the scheduler of Equiformer when using OCPCalculator (issue #541). Is it possible to do the same for the EquiformerV2 model (checkpoint) trained with OC22?
Hmm - the issue you pointed to should address any instances of EqV2. Can you share a minimal example that I can use to reproduce the error you're running into?
Sure. I came across this error when I tried to create the OCPCalculator object:
(I work on Google Colab and skip the other installation code snippets in the notebook.)
!wget https://dl.fbaipublicfiles.com/opencatalystproject/models/2023_10/oc22/s2ef/eq2_121M_e4_f100_oc22_s2ef.pt
from ocpmodels.common.relaxation.ase_utils import OCPCalculator
# checkpoint_path = "gnoc_oc22_oc20_all_s2ef.pt"
checkpoint_path = '/content/ocp/eq2_121M_e4_f100_oc22_s2ef.pt'
ocp_calculator = OCPCalculator(checkpoint_path=checkpoint_path)
TypeError Traceback (most recent call last)
in <cell line: 5>()
3 # checkpoint_path = "gnoc_oc22_oc20_all_s2ef.pt"
4 checkpoint_path = '/content/ocp/eq2_121M_e4_f100_oc22_s2ef.pt'
----> 5 ocp_calculator = OCPCalculator(checkpoint_path=checkpoint_path)
5 frames
/content/ocp/ocpmodels/modules/scheduler.py in __init__(self, optimizer, config)
34 self.scheduler = getattr(lr_scheduler, self.scheduler_type)
35 scheduler_args = self.filter_kwargs(config)
---> 36 self.scheduler = self.scheduler(optimizer, **scheduler_args)
37
38 def step(self, metrics=None, epoch=None) -> None:
TypeError: LambdaLR.__init__() missing 1 required positional argument: 'lr_lambda'
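For context, the failure mode can be sketched without torch: the checkpoint's (old-format) config names a LambdaLR scheduler, but `filter_kwargs` finds no `lr_lambda` entry in it, so the constructor call in `scheduler.py` is missing a required positional argument. The class below is a minimal stand-in for `torch.optim.lr_scheduler.LambdaLR`, not the real implementation:

```python
# Stand-in mimicking the signature of torch.optim.lr_scheduler.LambdaLR:
# lr_lambda is a required positional argument after the optimizer.
class LambdaLR:
    def __init__(self, optimizer, lr_lambda):
        self.optimizer = optimizer
        self.lr_lambda = lr_lambda

# filter_kwargs found no lr_lambda key in the old checkpoint config,
# so the scheduler is constructed with an empty kwargs dict.
scheduler_args = {}

try:
    scheduler = LambdaLR(object(), **scheduler_args)
except TypeError as e:
    # The message names the missing argument: 'lr_lambda'
    print(e)
```

This is why re-uploading a checkpoint whose config either carries `lr_lambda` or skips the scheduler entirely (as in issue #541) resolves the error.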
Sorry for the delay on this, I indeed was able to reproduce this error.
I have uploaded the checkpoint at the same location, you should be able to redownload the file and try again.
Hi, I've tried again and got the same error. I've also converted the code to fairchem. I don't know if I'm doing something wrong:
ocp_calculator = OCPCalculator(
model_name="EquiformerV2-lE4-lF100-S2EFS-OC22",
local_cache="pretrained_models",
cpu=False,
)
WARNING:root:Detected old config, converting to new format. Consider updating to avoid potential incompatibilities.
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-12-1982fd4316ff> in <cell line: 8>()
6 # ocp_calculator = OCPCalculator(checkpoint_path=checkpoint_path)
7
----> 8 ocp_calculator = OCPCalculator(
9 model_name="EquiformerV2-lE4-lF100-S2EFS-OC22",
10 local_cache="pretrained_models",
5 frames
/usr/local/lib/python3.10/dist-packages/fairchem/core/modules/scheduler.py in __init__(self, optimizer, config)
39 self.scheduler = getattr(lr_scheduler, self.scheduler_type)
40 scheduler_args = self.filter_kwargs(config)
---> 41 self.scheduler = self.scheduler(optimizer, **scheduler_args)
42
43 def step(self, metrics=None, epoch=None) -> None:
TypeError: LambdaLR.__init__() missing 1 required positional argument: 'lr_lambda'
Can you run md5sum eq2_121M_e4_f100_oc22_s2ef.pt on the checkpoint you downloaded? I ran your exact commands with no issue.
It seems to work now. Thank you!