Is the optimizer of the clients re-created every epoch?
tengerye opened this issue · 3 comments
tengerye commented
Hi, thanks for the code.
According to the following lines in update.py:
if self.args.optimizer == 'sgd':
    optimizer = torch.optim.SGD(model.parameters(), lr=self.args.lr,
                                momentum=0.5)
elif self.args.optimizer == 'adam':
    optimizer = torch.optim.Adam(model.parameters(), lr=self.args.lr,
                                 weight_decay=1e-4)
It looks like the optimizer is created anew for every epoch; is that correct?
AshwinRJ commented
Yes, it is created during every global iteration.
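Roughly, the flow looks like this (an illustrative, self-contained sketch, not the exact code in this repo; `local_update` and the toy data are stand-ins): the per-client update builds a fresh optimizer every time it is called, and it is called once per selected client in every global round.

```python
import copy
import torch
import torch.nn.functional as F

def local_update(global_model, data, lr):
    model = copy.deepcopy(global_model)
    # Fresh optimizer on each call, as in update.py.
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.5)
    x, y = data
    for _ in range(5):                          # local epochs
        loss = F.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return model.state_dict()

global_model = torch.nn.Linear(4, 1)            # stand-in for the global model
clients = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(3)]

for round_idx in range(2):                      # global communication rounds
    weights = [local_update(global_model, d, lr=0.01) for d in clients]
    # FedAvg: average the client weights into the global model.
    avg = {k: torch.stack([w[k] for w in weights]).mean(dim=0) for k in weights[0]}
    global_model.load_state_dict(avg)
```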
tengerye commented
@AshwinRJ Thank you for your kind reply. May I ask the reason for this? It is counter-intuitive, since the learning rate of the optimizer is reset during training.
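For example, re-creating the optimizer like this discards whatever happened in the previous round (a small self-contained demonstration, not the repo's code): the learning rate goes back to the initial value and the SGD momentum buffers are empty again.

```python
import torch

model = torch.nn.Linear(4, 1)
x, y = torch.randn(8, 4), torch.randn(8, 1)

for global_round in range(2):
    # Rebuilt every round, as in update.py.
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.5)
    print('round', global_round,
          'lr =', optimizer.param_groups[0]['lr'],
          'momentum buffers =', len(optimizer.state))   # 0: state was discarded
    loss = torch.nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                                     # populates the buffers
```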
zdhNarsil commented
I have the same question. How can you apply lr decay in this setting?
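Would something along these lines be the intended way to do it? A minimal sketch, assuming the outer loop knows the current global round; `round_lr`, `base_lr`, and `decay` are just illustrative names, not part of the repository's API.

```python
import torch

def round_lr(base_lr, decay, global_round):
    """Exponentially decayed learning rate for a given global round."""
    return base_lr * (decay ** global_round)

model = torch.nn.Linear(10, 2)                  # stand-in for the actual model
base_lr, decay = 0.01, 0.95

for global_round in range(5):
    lr = round_lr(base_lr, decay, global_round)
    # Rebuilding the optimizer each round then no longer breaks the schedule,
    # because the decayed value (not the constant args.lr) is passed in.
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.5)
    print(global_round, optimizer.param_groups[0]['lr'])
```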