kozistr/pytorch_optimizer

Ranger sign inversion

i404788 opened this issue · 1 comments

Describe the bug

From my experiments, it seems the sign of the Ranger update is inverted: all other optimizers (including Ranger21) take steps in the opposite direction from Ranger.

Note that I'm testing context-free step directions/magnitudes using a 'perfect' gradient (scaled by 4), so if Ranger somehow reverses course when gradients from different directions are accumulated, my test would miss that.
Hyperparameters: {'betas': (0.003344506587403595, 0.9685357345548955), 'lr': 0.4616639698903086} (found through a hyperparameter search, as was also done for the other optimizers), evaluated on the Ackley function (dim=2).
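Not from the issue itself, but for concreteness, here is a minimal pure-Python sketch of the kind of context-free direction check described above: one gradient step on the 2-D Ackley function, comparing a correctly signed step against a sign-inverted one. The starting point and learning rate are arbitrary illustrations, not the reported hyperparameters.

```python
import math

def ackley(x, y):
    # 2-D Ackley function; global minimum f(0, 0) = 0
    return (-20.0 * math.exp(-0.2 * math.sqrt(0.5 * (x * x + y * y)))
            - math.exp(0.5 * (math.cos(2 * math.pi * x) + math.cos(2 * math.pi * y)))
            + math.e + 20.0)

def grad(f, x, y, eps=1e-6):
    # central finite-difference gradient
    gx = (f(x + eps, y) - f(x - eps, y)) / (2 * eps)
    gy = (f(x, y + eps) - f(x, y - eps)) / (2 * eps)
    return gx, gy

lr = 0.1           # arbitrary, for illustration
x, y = 1.5, -0.8   # arbitrary starting point
gx, gy = grad(ackley, x, y)

descent = ackley(x - lr * gx, y - lr * gy)   # step against the gradient
inverted = ackley(x + lr * gx, y + lr * gy)  # sign-inverted step
print(descent < ackley(x, y) < inverted)
```

A correctly signed optimizer should behave like the `descent` step here (loss decreases), whereas a sign-inverted update behaves like `inverted` (loss increases).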

(I didn't want to create a PR before discussing whether this might be intended.)

To Reproduce

  • OS : Linux
  • PyTorch version : 2
  • Python version : 3.11

Log

Ranger:
(screenshot: Ranger trajectory)

For comparison SGD:
(screenshot: SGD trajectory)

Hi! Sorry for the late reply.

The current Ranger implementation in this repository is based on the implementation here.

As far as I can tell, there's no significant difference between the original implementation and mine, but please let me know if I'm missing something.

Could you please run the same test against the original implementation and see whether it reproduces the result?
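Independent of either Ranger implementation, a quick sign-sanity check on any update rule can be sketched in plain Python. The `moves_downhill` helper and the toy step functions below are hypothetical, for illustration only: they test whether repeated steps on f(x) = x² shrink the loss, which a correctly signed optimizer must do.

```python
def moves_downhill(step, x0=1.0, iters=5):
    # f(x) = x^2 with gradient 2x; a correctly signed
    # optimizer should reduce f(x) over a few steps
    x = x0
    for _ in range(iters):
        x = step(x, 2.0 * x)
    return x * x < x0 * x0

sgd = lambda x, g: x - 0.1 * g       # correct sign: move against the gradient
inverted = lambda x, g: x + 0.1 * g  # sign-inverted step: moves uphill

print(moves_downhill(sgd), moves_downhill(inverted))  # prints: True False
```

Wrapping a real optimizer's single-parameter step in the same harness would show directly whether its update direction matches SGD's.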

thank you!