[Feature Request] Make torchopt.optim.Optimizer compatible with PyTorch Lightning
SamDuffield opened this issue · 0 comments
Required prerequisites
- I have searched the Issue Tracker and Discussions to confirm this hasn't already been reported. (+1 or comment there if it has.)
- Consider asking first in a Discussion.
Motivation
Currently, torchopt.optim classes aren't compatible with Lightning's configure_optimizers. This is because Lightning doesn't consider them Optimizable:
```python
import torch
import torchopt
from lightning.fabric.utilities.types import Optimizable

model = torch.nn.Linear(2, 1)  # any nn.Module works here
optimizer = torchopt.Adam(model.parameters(), lr=1e-3)
isinstance(optimizer, Optimizable)
# False
```
For it to be `Optimizable`, it needs `defaults` and `state` attributes.
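For reference, `Optimizable` is a runtime-checkable `Protocol`. Paraphrased from `lightning.fabric.utilities.types` (the exact members and annotations may differ in your installed version), it looks roughly like this:

```python
from typing import Any, Callable, Dict, List, Optional, Protocol, runtime_checkable


@runtime_checkable
class Steppable(Protocol):
    def step(self, closure: Optional[Callable[[], Any]] = ...) -> Any: ...


@runtime_checkable
class Optimizable(Steppable, Protocol):
    # isinstance() checks each member via hasattr(), so torchopt
    # optimizers fail on the missing `defaults` and `state` attributes.
    param_groups: List[Dict[Any, Any]]
    defaults: Dict[Any, Any]
    state: Dict[Any, Any]

    def state_dict(self) -> Dict[str, Any]: ...
    def load_state_dict(self, state_dict: Dict[str, Any]) -> None: ...
```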
If you simply do
```python
from collections import defaultdict

optimizer.defaults = {}
optimizer.state = defaultdict(dict)  # mirrors torch.optim.Optimizer's state container
```
then `isinstance(optimizer, Optimizable)` passes and torchopt <> Lightning works like a charm 😍
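For anyone who needs this today, here's a minimal sketch of that workaround wired into `configure_optimizers` (the module and layer are just illustrative):

```python
from collections import defaultdict

import lightning as L
import torch
import torchopt


class LitRegressor(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Linear(2, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.net(x), y)

    def configure_optimizers(self):
        optimizer = torchopt.Adam(self.parameters(), lr=1e-3)
        # Workaround: add the attributes Lightning's Optimizable check expects.
        optimizer.defaults = {}
        optimizer.state = defaultdict(dict)
        return optimizer
```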
Solution
Can we add `defaults` and `state` attributes to the `torchopt.optim.Optimizer` class?
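Until then, a thin user-side wrapper could serve as a stopgap; this is just a sketch, and `LightningAdam` is a made-up name:

```python
from collections import defaultdict

import torchopt


class LightningAdam(torchopt.Adam):
    """Hypothetical wrapper: torchopt.Adam plus the attributes Lightning checks for."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.defaults = {}  # torchopt keeps its hyperparameters elsewhere
        self.state = defaultdict(dict)
```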
Alternatives
No response
Additional context
No response