YunYang1994/SphereFace

what is the meaning of ff, f, l ?

qiyang77 opened this issue · 3 comments

Hello, I am a bit confused about the part of this code that computes the A-Softmax loss.

You should refer to the paper Large-Margin Softmax Loss for Convolutional Neural Networks, which explains the meaning of lambda.

But here, in the code, l, f, and ff are constants, which isn't right:

l = 0.
f = 1.0 / (1.0 + l)
ff = 1.0 - f
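For context (this is my reading of the code, not the author's wording): f and ff look like the mixing weights from the SphereFace annealing trick, where the final logit is a convex combination of the plain cosine logit and the angular-margin logit, with f = 1/(1+lambda) weighting the margin term and ff = lambda/(1+lambda) weighting the plain term. A minimal sketch (the function name is mine):

```python
def mix_logits(cos_logit, margin_logit, l):
    # SphereFace annealed logit: (l * cos_theta + psi(theta)) / (1 + l)
    # f weights the angular-margin logit, ff the plain softmax logit
    f = 1.0 / (1.0 + l)
    ff = 1.0 - f
    return ff * cos_logit + f * margin_logit

# With the hard-coded l = 0, f = 1 and ff = 0, so the plain logit is
# ignored and only the margin logit is used -- full A-Softmax from step 0.
print(mix_logits(0.8, 0.3, 0.0))  # -> 0.3
```

This is why a constant l = 0 is suspicious: the annealing only exists so that training starts close to ordinary softmax (large l) and moves toward the margin loss (small l).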

The l should depend on the global step.
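One common schedule (as I recall it from the original SphereFace Caffe code; treat the exact constants here as assumptions) decays lambda with the iteration count and clips it at a floor:

```python
def anneal_lambda(global_step, base=1000.0, gamma=0.12, power=1.0, lambda_min=5.0):
    # lambda starts at `base` and decays toward `lambda_min` as training proceeds
    return max(lambda_min, base * (1.0 + gamma * global_step) ** (-power))

for step in (0, 100, 10000):
    l = anneal_lambda(step)
    f = 1.0 / (1.0 + l)  # weight of the margin term grows as l shrinks
    print(step, l, f)
```

At step 0 the loss is almost pure softmax; as lambda bottoms out at lambda_min, the margin term settles at a fixed fraction f = 1/(1+lambda_min) of the logit.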

Hello, I changed the "l" to correspond to the training step, so that the angular-softmax term contributes only a very small part at the beginning and roughly a 0.1 ratio at the end. However, something strange happens: the loss decreases normally at first, but at some training step it starts to increase as the A-Softmax term grows, and the accuracy decreases too. Has anyone met this problem? Does anyone have a trick for the ratio decay? PS: I have already fixed the code to avoid the gradient-explosion problem.
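One knob people tweak when the loss blows up as the margin term ramps up is to anneal lambda more slowly, or to freeze it at its initial value for a warm-up period so the network stabilizes under plain softmax first. A sketch of such a warm-up variant (the function name and all constants are my assumptions, not from this repo):

```python
def anneal_lambda_warmup(global_step, warmup_steps=3000, base=1000.0,
                         gamma=0.12, lambda_min=5.0):
    # keep lambda at `base` (nearly pure softmax) during warm-up, then decay it
    if global_step < warmup_steps:
        return base
    t = global_step - warmup_steps
    return max(lambda_min, base / (1.0 + gamma * t))
```

Lengthening warmup_steps or shrinking gamma makes the margin term phase in more gradually, which may help with the mid-training loss increase described above.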