Issues
one of the variables needed for gradient computation has been modified by an inplace operation: which is output 0 of MmBackward, is at version 1; expected version 0 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
#9 opened by xysong1201 - 1 comment (see the reproduction sketch after this list)
Training from scratch issues
#5 opened by davidjimenezphd - 15 comments
Do I need to modify the softmax function?
#3 opened by maryhh - 0 comments
Will the Caffe version be released?
#7 opened by ysc703 - 5 comments
SV-X-Softmax source code
#6 opened by xisi789 - 5 comments
SV-cosface does not converge
#1 opened by icodingc - 17 comments
There are no code files in this repo?
#4 opened by yangxudong - 2 comments
Is the code ready?
#2 opened by twmht
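
Issue #9 is the stock PyTorch autograd error raised when a tensor saved for the backward pass is modified in place. A minimal sketch, independent of this repo's code, that reproduces the error and shows where the hinted torch.autograd.set_detect_anomaly(True) fits; the shapes and the pow/add_ ops are illustrative assumptions, not taken from the repo:

```python
import torch

# Optional: records forward-pass tracebacks so that when backward() fails,
# PyTorch prints the forward call of the operation whose backward errored.
torch.autograd.set_detect_anomaly(True)

x = torch.randn(4, 3)
w = torch.randn(3, 3, requires_grad=True)

out = x.mm(w)   # out.grad_fn is MmBackward; autograd sees `out` at version 0
y = out.pow(2)  # PowBackward saves `out` for its backward pass
out.add_(1.0)   # in-place add bumps `out` to version 1, invalidating the saved copy

# RuntimeError: one of the variables needed for gradient computation has been
# modified by an inplace operation: ... output 0 of MmBackward, is at version 1;
# expected version 0 instead.
y.sum().backward()

# Fix: use the out-of-place form, e.g. `out = out + 1.0`, so the tensor
# saved by PowBackward is left untouched.
```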