youngminPIL/rollback

Some doubts about training process

Closed this issue · 2 comments

Hi, this is interesting and enlightening work, but I have a couple of doubts. First, why not simply set the learning rate of blocks 4, 3, and 2 to zero in order to train block 1 exclusively? That might be a substitute for rolling back blocks 2, 3, and 4 (roughly the sketch below). Second, why not add a triplet loss to the training process?
Thanks for taking the time to read this message.
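
To make the first question concrete, here is a minimal sketch of the alternative I have in mind, assuming a torchvision ResNet-50 backbone (the module names and values are illustrative, not taken from this repo):

```python
import torch
import torchvision

model = torchvision.models.resnet50(pretrained=True)

# Freezing blocks 2-4 is equivalent to giving them a zero learning rate.
for block in (model.layer2, model.layer3, model.layer4):
    for p in block.parameters():
        p.requires_grad = False

# Only block 1 (and whatever head is attached) receives updates.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad),
    lr=0.01, momentum=0.9)
```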

Thank you for your interest in my research.

  1. That seems like a meaningful thing to try. However, I thought it would be difficult to ignore the continuity between layers when training a deep network, so I assigned a learning rate to every layer; see the first sketch after this list.
  2. I didn't use a triplet loss because it converges slowly and the sampling strategy is tricky; see the second sketch after this list.
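
For point 1, a minimal sketch of keeping a nonzero learning rate on every block via PyTorch optimizer parameter groups (the block names and values are placeholders, not the exact settings from the paper):

```python
import torch
import torchvision

model = torchvision.models.resnet50(pretrained=True)

optimizer = torch.optim.SGD([
    # The block currently being refined gets a higher learning rate ...
    {'params': model.layer1.parameters(), 'lr': 0.01},
    # ... while the remaining blocks keep the small default below,
    # so no layer is ever fully detached from the optimization.
    {'params': model.layer2.parameters()},
    {'params': model.layer3.parameters()},
    {'params': model.layer4.parameters()},
], lr=0.001, momentum=0.9)
```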
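
For point 2, a hypothetical sketch of what adding a triplet term would involve (it is not part of this repo): the loss itself is one line, but mining informative (anchor, positive, negative) triplets from each batch is the tricky part.

```python
import torch

triplet = torch.nn.TripletMarginLoss(margin=0.3)

# Dummy embeddings: anchor and positive share an identity, negative does not.
# In practice these triplets must be mined (e.g. batch-hard mining),
# otherwise most triplets are uninformative and convergence is slow.
anchor   = torch.randn(32, 2048)
positive = torch.randn(32, 2048)
negative = torch.randn(32, 2048)

loss = triplet(anchor, positive, negative)
print(loss.item())
```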

Thanks for your reply. I will close the issue.