The parameter order in the optimizer
lzyhha opened this issue · 0 comments
lzyhha commented
Hello, I note that the order of parameters passed in PolyOptimizer (params, lr, wd) differs from the official SGD signature (params, lr, momentum, ...). So I think the value of weight_decay will actually be assigned to momentum. Is that so?
class PolyOptimizer(torch.optim.SGD):
    def __init__(self, params, lr, weight_decay, max_step, momentum=0.9):
        # SGD's third positional parameter is momentum, so weight_decay
        # is silently bound to momentum here.
        super().__init__(params, lr, weight_decay)
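For reference, `torch.optim.SGD.__init__` takes its parameters in the order `(params, lr, momentum=0, dampening=0, weight_decay=0, ...)`, so a third positional argument is indeed bound to `momentum`. The sketch below demonstrates the mix-up with a torch-free stub that mimics SGD's positional order (the stub itself is illustrative, not PyTorch code); the fix would be to pass keyword arguments, e.g. `super().__init__(params, lr, momentum=momentum, weight_decay=weight_decay)`:

```python
# Stub mimicking torch.optim.SGD's positional parameter order:
# SGD(params, lr, momentum=0, dampening=0, weight_decay=0, ...)
def sgd_init(params, lr, momentum=0, dampening=0, weight_decay=0):
    return {"lr": lr, "momentum": momentum, "weight_decay": weight_decay}

# Passing weight_decay positionally, as PolyOptimizer does:
buggy = sgd_init("params", 0.01, 1e-4)
print(buggy["momentum"], buggy["weight_decay"])  # 0.0001 0 -- wd became momentum

# Passing it by keyword avoids the mix-up:
fixed = sgd_init("params", 0.01, weight_decay=1e-4)
print(fixed["momentum"], fixed["weight_decay"])  # 0 0.0001
```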