Issues
- lr below min_lr check too aggressive (#16, opened by kai-tub, 1 comment; see the first sketch after this list)
- Error when using DDP (#49, opened by smartbarbarian, 2 comments)
- error in warmdown - lr below min lr. current lr = 2.999999999999997e-05, auto handling but please report issue! (#28, opened by neuronflow, 0 comments; see the first sketch after this list)
- learning rate scheduler (#48, opened by shyhyawJou, 1 comment)
- Recommended settings for transformers? (#47, opened by OhadRubin, 2 comments)
- Require an documentation (#38, opened by huangnengCSU, 2 comments)
- What it the best hyper-parameter setting? (#42, opened by NoOneUST, 1 comment)
- sample usage in fastai (#35, opened by nikky4D, 0 comments)
- hi,please help me (#46, opened by wudizuixiaosa, 3 comments)
- resuming training with ranger21? (#18, opened by neuronflow, 0 comments)
- Nice name of your project) (#45, opened by Ranger21, 0 comments)
- Can ranger be used for NLP transformers? (#40, opened by LifeIsStrange, 0 comments)
- Not support pytorch _1.3.1 (#39, opened by huangnengCSU, 7 comments)
- RuntimeError: hit nan for variance_normalized (#30, opened by gcp, 5 comments)
- decouple the lr scheduler and optimizer? (#36, opened by hiyyg, 7 comments)
- hit nan for variance_normalized (#19, opened by jimmiebtlr, 0 comments)
- local variable 'neg_grad_ma' referenced before assignment when momentum_type is not "pnm" (#34, opened by lechmazur, 0 comments)
- About gradient normalization (#29, opened by julightzhong10, 4 comments)
- Multi GPU problem (#17, opened by zsgj-Xxx, 1 comment)
- Performance of ResNet50 on ImageNet (#27, opened by juntang-zhuang, 0 comments)
File "/home/.../site-packages/ranger21/ranger21.py", line 680, in step raise RuntimeError("hit nan for variance_normalized")
#24 opened by neuronflow - 2
comparing ranger21 to SAM optimizer
#21 opened by nikky4D - 0
error when training with batch_size = 1
#20 opened by neuronflow - 6
optimizer = Ranger21(params=model.parameters(), lr=learning_rate) File "/mnt/Drive1/florian/msblob/Ranger21/ranger21/ranger21.py", line 179, in __init__ self.total_iterations = num_epochs * num_batches_per_epoch TypeError: unsupported operand type(s) for *: 'NoneType' and 'NoneType'
#12 opened by neuronflow - 6
- Adaptive Gradient Clipping (#10, opened by benihime91, 7 comments)
- Changes in lr (#9, opened by zsgj-Xxx, 2 comments)
- torch.grad removed in PyTorch 1.8.1? (#8, opened by jszym, 2 comments)
- Augmentation requests (#5, opened by LifeIsStrange, 1 comment)
- Example (#3, opened by johnyquest7, 2 comments)
- Adaptive Gradient Clipping (#1, opened by kayuksel)
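
Two of the issues above lend themselves to short illustrations. For #16 and #28, the warmdown phase appears to hit a case where accumulated floating-point error leaves the scheduled lr marginally below the configured floor, so a strict comparison raises. A minimal sketch, assuming a min_lr of 3e-05 (the issue titles quote only the offending current lr, not the floor):

```python
# Illustration for issues #16/#28: floating-point rounding in a decay
# schedule can leave lr a hair below min_lr, so a strict `lr < min_lr`
# check fires even though the values are effectively equal.
import math

min_lr = 3e-05                            # hypothetical configured floor
current_lr = 2.999999999999997e-05        # value quoted in issue #28

print(current_lr < min_lr)                # True  -> strict check trips
print(math.isclose(current_lr, min_lr))   # True  -> tolerant check passes
```

For #12, the traceback itself points at the cause: `__init__` computes `self.total_iterations = num_epochs * num_batches_per_epoch`, so constructing the optimizer with `lr` alone leaves both operands as None. A minimal sketch of a working construction, using the keyword names shown in the traceback and placeholder values:

```python
# Sketch for issue #12: pass num_epochs and num_batches_per_epoch so
# Ranger21 can compute total_iterations (None * None raises TypeError).
# The model and both numeric values below are illustrative only.
import torch
from ranger21 import Ranger21

model = torch.nn.Linear(10, 2)            # placeholder model
batches_per_epoch = 100                   # e.g. len(train_loader)

optimizer = Ranger21(
    model.parameters(),
    lr=1e-3,
    num_epochs=10,
    num_batches_per_epoch=batches_per_epoch,
)
```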