Levigty/AimCLR

About KNN evaluation and finetune settings

Closed this issue · 2 comments

Hi, thank you for sharing your work! You used a KNN evaluation method in your paper, but I could not find the implementation details in the article or the corresponding code. Could you please provide more details on the KNN evaluation? Also, for the finetune setting, did you adjust the weight decay during finetuning? Looking forward to your reply.

Hi. For the KNN evaluation, we use the knn_monitor like other contrastive learning methods. For the finetune setting, we set the weight decay to 1e-4 and the learning rate to 0.01 or 0.001.
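For anyone landing here: the knn_monitor referenced above is the weighted-KNN protocol commonly shared across contrastive-learning repos (extract features for the train set into a bank, then classify each test feature by a temperature-weighted vote over its k nearest neighbors by cosine similarity). A minimal NumPy sketch of that protocol follows; the function name and the defaults k=200, t=0.1 are illustrative choices borrowed from common implementations, not taken from the AimCLR code itself.

```python
import numpy as np

def knn_predict(features, bank, bank_labels, num_classes, k=200, t=0.1):
    """Weighted-KNN vote over a feature bank, as in the common knn_monitor
    protocol. `bank` holds train-set features, `bank_labels` their labels."""
    # L2-normalize so the dot product equals cosine similarity
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    bank = bank / np.linalg.norm(bank, axis=1, keepdims=True)
    sim = features @ bank.T                         # (n_test, n_bank) similarities
    topk_idx = np.argsort(-sim, axis=1)[:, :k]      # k nearest neighbors per sample
    topk_sim = np.take_along_axis(sim, topk_idx, axis=1)
    topk_lab = bank_labels[topk_idx]                # labels of those neighbors
    weights = np.exp(topk_sim / t)                  # temperature-scaled vote weights
    scores = np.zeros((features.shape[0], num_classes))
    for c in range(num_classes):
        scores[:, c] = (weights * (topk_lab == c)).sum(axis=1)
    return scores.argmax(axis=1)                    # predicted class per test sample
```

In the actual monitoring loop this is typically run after each pretraining epoch, with the bank rebuilt from the frozen encoder's train-set features, so KNN accuracy tracks representation quality without any finetuning.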

Thanks a lot for your help!