No key named 'lr_scale' as input parameter to optimizer.AdamW
DanielGeorgeMathew opened this issue · 0 comments
DanielGeorgeMathew commented
Hi, I can't find an lr_scale parameter in the function description for torch.optim.AdamW. Is this a typo, or did you mean to pass the parameter groups with lr instead of lr_scale? Shown below is the snippet from optim_factory.py, lines 123-132:
parameter_group_names[group_name] = {
    "weight_decay": this_weight_decay,
    "params": [],
    "lr_scale": scale
}
parameter_group_vars[group_name] = {
    "weight_decay": this_weight_decay,
    "params": [],
    "lr_scale": scale
}