Chapter 14 - Problem with regularization loss
Ali-Mohammadi65748 opened this issue · 0 comments
Ali-Mohammadi65748 commented
Hi, in our final code we calculate the regularization loss and then sum it with `data_loss`. The problem is that, I think, we never use this total loss as the final output of the last layer and multiply the partial gradients by it; instead, we just backpropagate the loss exactly as we do when there is no regularization loss.
(In summary, we didn't include the effect of the regularization loss in the forward-pass phase.)
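For reference, a minimal sketch of the forward-pass pattern I mean, assuming the variable names from the chapter's final code (`dense1`, `dense2`, `loss_activation`, `y`):

```python
# Forward pass through the combined softmax activation / cross-entropy loss
data_loss = loss_activation.forward(dense2.output, y)

# Regularization penalty computed from both dense layers
regularization_loss = \
    loss_activation.loss.regularization_loss(dense1) + \
    loss_activation.loss.regularization_loss(dense2)

# The total loss is only summed for reporting; it is never passed back
# into the network as an output
loss = data_loss + regularization_loss
```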
I think we should use
`loss_activation.backward(loss, y)`
instead of
`loss_activation.backward(loss_activation.output, y)`.
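For context, this is how the backward pass looks in the final code right now, again assuming the chapter's variable names (`activation1` being the hidden layer's activation):

```python
# Current backward pass: gradients start from the softmax/loss output,
# not from the total (data + regularization) loss value
loss_activation.backward(loss_activation.output, y)
dense2.backward(loss_activation.dinputs)
activation1.backward(dense2.dinputs)
dense1.backward(activation1.dinputs)
```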
I would be glad to hear your opinion.
Best regards