gudovskiy/cflow-ad

Do the encoder training parameters in the paper refer to the encoder's pretraining settings?

Howie86 opened this issue · 2 comments

In the last paragraph of "5.1. Experimental setup" in the paper: "Adam optimizer with 2e-4 learning rate, 100 train epochs, 32 mini-batch size for encoder and cosine learning rate annealing with 2 warm-up epochs."
Do these encoder training parameters refer to the pretraining settings? In train.py, the encoder only loads ImageNet-pretrained weights and does not participate in the training process.
Thanks a lot.

@Howie86 it is a typo (it should be the decoder); the encoder is always fixed.
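
In other words, the quoted hyper-parameters (Adam, 2e-4 learning rate, 100 epochs, batch size 32, cosine annealing with 2 warm-up epochs) apply to the decoder only. Below is a minimal sketch of that split, not the repo's actual train.py: it assumes a torchvision ResNet-18 as the frozen backbone and uses a placeholder linear layer and a dummy loss in place of cflow-ad's conditional normalizing-flow decoder and its log-likelihood objective.

```python
# Hedged sketch: frozen ImageNet-pretrained encoder, trainable decoder.
import torch
from torchvision.models import resnet18, ResNet18_Weights

EPOCHS, BATCH, LR, WARMUP = 100, 32, 2e-4, 2   # values quoted from the paper

# Encoder: pretrained feature extractor, always fixed (eval mode, no gradients).
encoder = resnet18(weights=ResNet18_Weights.IMAGENET1K_V1)
encoder.fc = torch.nn.Identity()               # use it as a feature extractor
encoder.eval()
for p in encoder.parameters():
    p.requires_grad = False

# Decoder: stand-in module; the real one is a conditional normalizing flow.
decoder = torch.nn.Linear(512, 512)

optimizer = torch.optim.Adam(decoder.parameters(), lr=LR)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=EPOCHS - WARMUP)

# Toy data in place of MVTec images, just to make the loop runnable.
data = torch.randn(BATCH * 4, 3, 224, 224)
loader = torch.utils.data.DataLoader(torch.utils.data.TensorDataset(data), batch_size=BATCH)

for epoch in range(EPOCHS):
    for (x,) in loader:
        with torch.no_grad():                  # gradients never reach the encoder
            feats = encoder(x)
        out = decoder(feats)
        loss = out.pow(2).mean()               # placeholder; the real objective is the flow NLL
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    if epoch >= WARMUP:                        # anneal only after the 2 warm-up epochs
        scheduler.step()
```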

Thanks for your reply. Now I understand.