lucidrains/ema-pytorch

Leaf/computed variables requires_grad error

crlandsc opened this issue · 0 comments

I am running into an error when trying to wrap a model in the EMA object.

```
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "<@beartype(ema_pytorch.ema_pytorch.EMA.__init__) at 0x34a773310>", line 107, in __init__
  File "/Users/chris/.pyenv/versions/3.8.10/envs/wb-api/lib/python3.8/site-packages/ema_pytorch/ema_pytorch.py", line 93, in __init__
    self.ema_model.requires_grad_(False)
  File "/Users/chris/.pyenv/versions/3.8.10/envs/wb-api/lib/python3.8/site-packages/torch/nn/modules/module.py", line 2440, in requires_grad_
    p.requires_grad_(requires_grad)
RuntimeError: you can only change requires_grad flags of leaf variables. If you want to use a computed variable in a subgraph that doesn't require differentiation use var_no_grad = var.detach().
```
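For context, the restriction itself comes from PyTorch, not from ema-pytorch: `requires_grad_` may only be called on leaf tensors, and a tensor produced by an autograd operation is not a leaf. A minimal standalone reproduction:

```python
import torch

x = torch.randn(3, requires_grad=True)  # leaf tensor (created directly by the user)
y = x * 2                               # computed tensor (non-leaf, has grad_fn)

assert x.is_leaf and not y.is_leaf

try:
    y.requires_grad_(False)  # same RuntimeError as in the traceback above
except RuntimeError as e:
    print("RuntimeError:", e)

# The workaround the error message suggests: detach() yields a leaf tensor
assert y.detach().is_leaf
```

So somewhere inside the wrapped copy, `module.parameters()` must be yielding at least one tensor with a `grad_fn` when `EMA.__init__` calls `requires_grad_(False)`.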

I am using pytorch lightning 2.1.0 with torch version 2.1.0. I have updated to the latest ema-pytorch (0.4.3).

I have been using this EMA implementation in my PyTorch Lightning code to train other model frameworks without issue up until this point, so I am a bit puzzled as to why this new model framework produces this error when I try to couple it with EMA. Additionally, if I do not use an EMA model, this framework trains normally.

Digging deeper, all of the model's parameters appear to be leaf variables (no computed ones), so it is hard to determine what the issue is. I wanted to know whether others have run into this problem and what a potential fix might be. Thanks in advance.
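For reference, this is the kind of check I mean (a small diagnostic helper I wrote for debugging, not part of ema-pytorch): it lists any parameters that are not leaf tensors, i.e. the ones `requires_grad_(False)` would choke on. On an ordinary model it returns an empty list.

```python
import torch

def non_leaf_params(model: torch.nn.Module):
    """Return the names of parameters that are NOT leaf tensors.

    These are exactly the parameters that would trigger the
    "you can only change requires_grad flags of leaf variables"
    RuntimeError when Module.requires_grad_ iterates over them.
    """
    return [name for name, p in model.named_parameters() if not p.is_leaf]

# Sanity check on a plain model: every parameter is a leaf
model = torch.nn.Sequential(
    torch.nn.Linear(8, 4),
    torch.nn.ReLU(),
    torch.nn.Linear(4, 2),
)
print(non_leaf_params(model))  # → []
```

Running this on my model (and on a `copy.deepcopy` of it, which is roughly what EMA does before freezing the copy) is how I concluded the parameters look like leaves, which is why the error is confusing.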