cthoyt/class-resolver

Conflict between PyTorch and PyTorch Lightning

Closed this issue · 7 comments

Thanks for the package! It looks like there is a conflict between PyTorch and PyTorch Lightning when making use of class_resolver.contrib.torch. The following reproduces the issue:

import torch
import pytorch_lightning
from class_resolver.contrib.torch import optimizer_resolver

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/rusty1s/miniconda3/lib/python3.7/site-packages/class_resolver/contrib/torch.py", line 71, in <module>
    suffix="LR",
  File "/Users/rusty1s/miniconda3/lib/python3.7/site-packages/class_resolver/api.py", line 159, in from_subclasses
    **kwargs,
  File "/Users/rusty1s/miniconda3/lib/python3.7/site-packages/class_resolver/api.py", line 119, in __init__
    suffix=suffix,
  File "/Users/rusty1s/miniconda3/lib/python3.7/site-packages/class_resolver/base.py", line 106, in __init__
    self.register(element)
  File "/Users/rusty1s/miniconda3/lib/python3.7/site-packages/class_resolver/base.py", line 153, in register
    raise RegistrationNameConflict(self, key, element, label="name")
class_resolver.base.RegistrationNameConflict: Conflict on registration of name exponential:
Existing: <class 'pytorch_lightning.tuner.lr_finder._ExponentialLR'>
Proposed: <class 'torch.optim.lr_scheduler.ExponentialLR'>

Ooh fascinating. I'm tempted to make a simple solution that just disregards private classes (i.e., those whose names start with _). This wouldn't solve the general issue of conflicts, though.
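A minimal sketch of what "disregarding private classes" could look like; `get_public_subclasses` is a hypothetical helper for illustration, not the actual class-resolver implementation:

```python
def get_public_subclasses(base):
    """Recursively collect subclasses whose names do not start with '_'."""
    result = []
    for cls in base.__subclasses__():
        if not cls.__name__.startswith("_"):
            result.append(cls)
        # Still recurse through private classes, since their own
        # subclasses may be public.
        result.extend(get_public_subclasses(cls))
    return result


class Base: ...
class Public(Base): ...
class _Private(Base): ...  # skipped, like pytorch_lightning's _ExponentialLR

print([c.__name__ for c in get_public_subclasses(Base)])  # ['Public']
```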

Bad hacky solution (DO NOT DO THIS): I think it might also be the case that changing the import order would alleviate this issue, since the class resolver is super magical and uses the __subclasses__() hook on the given base class, which is only aware of code that has already been imported and executed. This is obviously a bad solution because code should not rely on import order.
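To illustrate why import order matters here: __subclasses__() only sees subclasses whose defining code has already run, so a subclass defined (or imported) later simply isn't visible yet. A self-contained simulation, with the second class definition standing in for a later import:

```python
class Base:
    pass

# Before any subclass code has executed, the hook sees nothing.
print([c.__name__ for c in Base.__subclasses__()])  # []

# Simulate a later "import pytorch_lightning" that defines a subclass:
class LateSubclass(Base):
    pass

# Now the same hook reports the newly defined class.
print([c.__name__ for c in Base.__subclasses__()])  # ['LateSubclass']
```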

Another possibility (which again doesn't solve the general issue) is to have it only register classes from the same package as the base class
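A sketch of that second idea, comparing the top-level package of each subclass's __module__ against the base class's; `same_package_subclasses` is a hypothetical helper, and the __module__ values are assigned manually here only to make the example self-contained:

```python
def same_package_subclasses(base):
    """Keep only subclasses from the same top-level package as the base."""
    base_pkg = base.__module__.split(".")[0]
    return [
        cls
        for cls in base.__subclasses__()
        if cls.__module__.split(".")[0] == base_pkg
    ]


class Base: ...
class A(Base): ...
class B(Base): ...

# Fake the defining modules to mimic the conflict from the traceback.
Base.__module__ = "torch.optim.lr_scheduler"
A.__module__ = "torch.optim.lr_scheduler"
B.__module__ = "pytorch_lightning.tuner.lr_finder"

print([c.__name__ for c in same_package_subclasses(Base)])  # ['A']
```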

Yes, changing the import order does alleviate this issue, but it is hard to guarantee in practice. I personally feel that contrib.torch should only include classes directly provided by PyTorch.

I should have prefaced that musing with "BAD SOLUTION". I just edited the text in case anyone reads this later. I think I will go about implementing both of these suggestions and make them the reasonable defaults. Thanks again for pointing this out!

Super :) Looking forward to your solution.

@rusty1s I've just pushed a fix and minted a new release (v0.3.2). You shouldn't have to change any code, since the defaults will now skip both private classes and subclasses external to the module in which the base class is defined.

Thank you :)