mk-minchul/AdaFace

Problem fine-tuning AdaFace with triplet loss

sebapulgar opened this issue · 2 comments

Hi!

I am trying to fine-tune the last layer of the AdaFace model, freezing all the weights of the network except those belonging to the last layers. The problem is that the loss does not decrease; if instead I leave no weights frozen, the model learns well. However, I want to reuse the representation the network already learned with AdaFace, so I want to keep the backbone frozen.
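
For reference, here is roughly my setup as a minimal, self-contained PyTorch sketch (the stand-in backbone, the choice of which layer to unfreeze, and the hyperparameters are placeholders for illustration, not the actual AdaFace code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in for the pretrained backbone; in practice I load the AdaFace
# checkpoint and its 512-d embedding head from this repo.
backbone = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 112 * 112, 512),  # placeholder for the frozen body
    nn.ReLU(),
    nn.Linear(512, 512),            # placeholder for the last layer I fine-tune
)

# Freeze everything, then unfreeze only the last layer.
for p in backbone.parameters():
    p.requires_grad = False
for p in backbone[-1].parameters():
    p.requires_grad = True

criterion = nn.TripletMarginLoss(margin=0.2)
optimizer = torch.optim.SGD(
    [p for p in backbone.parameters() if p.requires_grad], lr=1e-3)

def embed(x):
    # L2-normalize, since these models are evaluated with cosine similarity.
    return F.normalize(backbone(x), dim=1)

# One training step on a dummy (anchor, positive, negative) batch.
anc, pos, neg = (torch.randn(8, 3, 112, 112) for _ in range(3))
loss = criterion(embed(anc), embed(pos), embed(neg))
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(loss.item())  # this is the loss that stays flat when the body is frozen
```
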
I would like to know if someone understands what is happening. My impression is that the embeddings are not compatible, since AdaFace is trained with a classification objective rather than a contrastive one like triplet loss. But that does not quite make sense to me either, because in both cases the embeddings of the same person end up close on the hypersphere, so there should not be a problem (see the cosine-distance sketch below).
*I have the same problem with any model trained with a classification loss (ArcFace, SphereFace, etc.), but not with contrastive models like FaceNet.
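
To make the geometry point concrete: these classification-margin models optimize angles, so a triplet loss over cosine distance (rather than Euclidean distance on unnormalized features) would match how they were trained. A sketch of what that looks like; the margin value here is an arbitrary assumption:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Triplet loss over cosine distance, matching the angular geometry that
# classification-margin losses (AdaFace, ArcFace, SphereFace) train for.
cosine_triplet = nn.TripletMarginWithDistanceLoss(
    distance_function=lambda x, y: 1.0 - F.cosine_similarity(x, y),
    margin=0.3,  # arbitrary; would need tuning
)

anc, pos, neg = (torch.randn(8, 512) for _ in range(3))  # dummy embeddings
print(cosine_triplet(anc, pos, neg).item())
```
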

Regards

Hey @sebapulgar, I'll probably be working on training AdaFace with triplet loss as a requirement for one of my research projects. I will keep the thread posted if I run into any issues or find solutions.

Hey @sebapulgar, can you help me figure out how to train AdaFace using triplet loss?