torch lr_schedulers not accepted
TomasGadea opened this issue · 5 comments
TomasGadea commented
When instantiating a WarmUpScheduler object and passing e.g. torch.optim.lr_scheduler.CosineAnnealingLR, as in example.py, execution fails with the following error:
Traceback (most recent call last):
File "test.py", line 26, in <module>
warmup_scheduler = WarmUpScheduler(optimizer, lr_scheduler,
File "/Users/tomas.gadea/Desktop/test/env/lib/python3.8/site-packages/warmup_scheduler_pytorch/warmup_module.py", line 52, in __init__
raise TypeError(f'{type(lr_scheduler).__name__} is not a lr_scheduler in pytorch')
TypeError: CosineAnnealingLR is not a lr_scheduler in pytorch
It does not work for any of the typical torch schedulers.
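For reference, a minimal snippet along these lines reproduces it for me (the WarmUpScheduler keyword arguments below are only a sketch of what example.py passes, so the exact names may differ):

import torch
from torch.optim.lr_scheduler import CosineAnnealingLR
from warmup_scheduler_pytorch import WarmUpScheduler

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
lr_scheduler = CosineAnnealingLR(optimizer, T_max=100)

# fails in __init__ with: TypeError: CosineAnnealingLR is not a lr_scheduler in pytorch
warmup_scheduler = WarmUpScheduler(optimizer, lr_scheduler,
                                   len_loader=10,  # these keyword names are assumptions, not verified against example.py
                                   warmup_steps=5,
                                   warmup_start_lr=0.01,
                                   warmup_mode='linear')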
LEFTeyex commented
What is your PyTorch version?
shlyahin commented
What is your PyTorch version?
I have the same error. Torch 2.0.1
Forbu commented
@shlyahin I checked the code: we need to add "LRScheduler" to the isinstance check here, changing
if not isinstance(lr_scheduler, (_LRScheduler, ReduceLROnPlateau)):
to
if not isinstance(lr_scheduler, (_LRScheduler, LRScheduler, ReduceLROnPlateau)):
Forbu commented
But that only works for torch >= 2.0, since older versions do not define LRScheduler.
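A backward-compatible variant could guard the import so the same check works on both old and new PyTorch. Just a sketch of the idea (check_lr_scheduler is a stand-in for the validation done in WarmUpScheduler.__init__, not the actual warmup_module.py code):

from torch.optim.lr_scheduler import ReduceLROnPlateau

try:
    # PyTorch >= 2.0: the base class is public
    from torch.optim.lr_scheduler import LRScheduler
except ImportError:
    # PyTorch < 2.0: only the private base class exists
    from torch.optim.lr_scheduler import _LRScheduler as LRScheduler

def check_lr_scheduler(lr_scheduler):
    # same validation as warmup_module.py line 52, but against the aliased base class
    if not isinstance(lr_scheduler, (LRScheduler, ReduceLROnPlateau)):
        raise TypeError(f'{type(lr_scheduler).__name__} is not a lr_scheduler in pytorch')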
LEFTeyex commented
I will update the code soon to adapt to the new version of PyTorch.