Cross_Entropy

Cross entropy measures how well the predicted probabilities match the actual outcomes. Unlikely events carry more information, so assigning a low probability to an event that actually occurs increases the cross entropy. The lower the cross entropy, the better the model.
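For labels Y and predicted probabilities P (the same names used in the code below), the binary cross entropy summed over m examples can be written in LaTeX notation as:

$$\mathrm{CE} = -\sum_{i=1}^{m} \big( y_i \log p_i + (1 - y_i) \log(1 - p_i) \big)$$

Some implementations also divide by m to report the mean; the accumulation line below only sums the per-example terms.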

Language

  • Python

Dependencies

  • numpy

I used the Sublime Text editor. You can use any text editor.

cross += -((Y[i] * np.log(P[i])) + ((1 - Y[i]) * np.log(1 - P[i])))  # accumulate the per-example binary cross entropy
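For context, here is a minimal self-contained sketch of how that accumulation line might be wrapped into a full function. The function name cross_entropy and the clipping of P to avoid log(0) are assumptions for illustration, not part of the original code.

import numpy as np

def cross_entropy(Y, P):
    # Hypothetical wrapper around the accumulation line above.
    # Y: array of 0/1 labels, P: array of predicted probabilities.
    Y = np.asarray(Y, dtype=float)
    P = np.asarray(P, dtype=float)
    # Clip predictions away from 0 and 1 so np.log never sees 0 (assumption).
    P = np.clip(P, 1e-12, 1 - 1e-12)
    cross = 0.0
    for i in range(len(Y)):
        cross += -((Y[i] * np.log(P[i])) + ((1 - Y[i]) * np.log(1 - P[i])))
    return cross

# Example: a confident, mostly correct model gives a small cross entropy.
print(cross_entropy([1, 0, 1], [0.9, 0.1, 0.8]))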