Do the encoder training parameters in the paper refer to the encoder's pretraining settings?
Howie86 opened this issue · 2 comments
Howie86 commented
The last paragraph of "5.1. Experimental setup" in the paper says: "Adam optimizer with 2e-4 learning rate, 100 train epochs, 32 mini-batch size for encoder and cosine learning rate annealing with 2 warm-up epochs."
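If these settings were actually applied to the encoder during training, I would expect something like the following PyTorch sketch (all module and constant names here are illustrative, not taken from the repo; the schedulers are the standard ones from `torch.optim.lr_scheduler`):

```python
import torch.nn as nn
from torch.optim import Adam
from torch.optim.lr_scheduler import CosineAnnealingLR, LinearLR, SequentialLR

EPOCHS = 100          # "100 train epochs"
WARMUP_EPOCHS = 2     # "2 warm-up epochs"

model = nn.Linear(8, 8)  # placeholder module; the repo defines its own networks

optimizer = Adam(model.parameters(), lr=2e-4)  # "Adam optimizer with 2e-4 learning rate"

# Linear warm-up for the first 2 epochs, then cosine annealing over the rest.
scheduler = SequentialLR(
    optimizer,
    schedulers=[
        LinearLR(optimizer, start_factor=0.01, total_iters=WARMUP_EPOCHS),
        CosineAnnealingLR(optimizer, T_max=EPOCHS - WARMUP_EPOCHS),
    ],
    milestones=[WARMUP_EPOCHS],
)

for epoch in range(EPOCHS):
    # ... iterate over mini-batches of size 32, backward(), per-batch steps ...
    optimizer.step()  # placeholder; real code steps once per batch
    scheduler.step()  # advance the LR schedule once per epoch
```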
Do these encoder training parameters refer to the pretraining setup? In train.py, the encoder only loads ImageNet-pretrained weights and does not participate in the training process (it appears to be frozen).
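For context, what I see in train.py is equivalent in effect to the sketch below (using torchvision's `resnet50` as a stand-in; the repo's actual encoder may be a different backbone):

```python
from torchvision.models import resnet50

# Stand-in backbone loaded with ImageNet weights; the actual encoder in
# train.py may differ.
encoder = resnet50(weights="IMAGENET1K_V1")

# Freeze the encoder so it takes no part in training.
for p in encoder.parameters():
    p.requires_grad = False
encoder.eval()  # also fixes BatchNorm running statistics

# With every parameter frozen, none of the encoder's weights are handed to
# the optimizer, so training hyper-parameters cannot apply to it here.
assert not any(p.requires_grad for p in encoder.parameters())
```

If that reading is right, the quoted hyper-parameters could only describe the encoder's pretraining, not anything happening in this repository's training loop.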
Thanks a lot.