No CELF
sepasfr commented
We need to add a cross entropy loss function. Cross entropy loss, also called log loss, is a measure used in classification tasks to quantify the difference between predicted probabilities and the actual class labels. It penalizes confident incorrect predictions more strongly, encouraging the model to assign higher probability to the correct class. The loss for a sample is the negative logarithm of the predicted probability assigned to the true class, so lower cross entropy indicates better model performance.
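
For concreteness, the standard batch-mean form is $L = -\frac{1}{N}\sum_{i=1}^{N}\log \hat{p}_{i,\,y_i}$, where $\hat{p}_{i,\,y_i}$ is the probability the model assigned to sample $i$'s true class $y_i$. Below is a minimal NumPy sketch of what this could look like; the function name, signature, and the `eps` clamp are illustrative assumptions, not tied to any existing API in this repo:

```python
import numpy as np

def cross_entropy_loss(probs, labels, eps=1e-12):
    """Mean cross entropy (log loss) over a batch.

    probs:  (N, C) predicted class probabilities, each row summing to 1.
    labels: (N,)   integer indices of the true classes.
    """
    probs = np.clip(probs, eps, 1.0)  # guard against log(0)
    # pick out the probability each row assigned to its true class
    p_true = probs[np.arange(len(labels)), labels]
    return -np.mean(np.log(p_true))

# A confident correct prediction gives a small loss; a confident
# wrong one is penalized heavily.
probs = np.array([[0.9, 0.05, 0.05],
                  [0.1, 0.8,  0.1]])
labels = np.array([0, 2])  # second sample's true class only got 0.1
print(cross_entropy_loss(probs, labels))  # ~1.20
```

The `eps` clamp matters in practice: if the model ever outputs an exact 0 for the true class, the loss would otherwise be infinite.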