unet_lr is ignored in DAdaptation
guaneec opened this issue · 1 comment
guaneec commented
AFAIK, the lr values of the param groups other than the first (0th) group aren't actually used in DAdaptAdam. This makes unet_lr unused in prepare_optimizer_params if the text encoder is also trained.
Maybe a warning should be raised for this? Alternatively, we can either patch DAdaptAdam or manually rescale the params passed to the optimizer.
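For illustration, the situation is roughly the following. This is only a sketch, not the actual prepare_optimizer_params code: the two Linear modules just stand in for the text encoder and U-Net, and the learning-rate values are arbitrary.

```python
import torch
import dadaptation

# Stand-ins for the real models (illustrative only).
text_encoder = torch.nn.Linear(8, 8)
unet = torch.nn.Linear(8, 8)

param_groups = [
    {"params": text_encoder.parameters(), "lr": 1.0},  # text_encoder_lr (group 0)
    {"params": unet.parameters(), "lr": 0.5},          # unet_lr (group 1)
]

# With the behaviour described above, DAdaptAdam effectively derives its step
# size from the first group's lr only, so the 0.5 given for the U-Net group
# ends up having no effect.
optimizer = dadaptation.DAdaptAdam(param_groups)
```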
kohya-ss commented
Thank you for letting me know! The patching seems to be quite difficult, so I've added a warning message like this:
when multiple learning rates are specified with dadaptation (e.g. for Text Encoder and U-Net), only the first one will take effect /
D-Adaptationで複数の学習率を指定した場合(Text EncoderとU-Netなど)、最初の学習率のみが有効になります: lr=1.0
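For reference, a check along these lines could look roughly like the sketch below. This is not the exact code added to sd-scripts; the function name and logger are placeholders, and it only illustrates warning when the param groups specify more than one distinct learning rate.

```python
import logging

logger = logging.getLogger(__name__)

def warn_if_multiple_lrs(param_groups):
    # Collect the distinct per-group learning rates.
    lrs = {group["lr"] for group in param_groups if "lr" in group}
    if len(lrs) > 1:
        logger.warning(
            "when multiple learning rates are specified with dadaptation "
            "(e.g. for Text Encoder and U-Net), only the first one will take effect"
        )
```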