
Primary language: Jupyter Notebook · License: GNU General Public License v3.0 (GPL-3.0)

Gradient-Descent

Implementation and visualization of the gradient descent algorithm.

Implement the following formulas, as explained in the text.

  Sigmoid activation function
  
        𝜎(π‘₯)=1/(1+π‘’βˆ’π‘₯)

  Output (prediction) formula
  
         𝑦̂ =𝜎(𝑀1π‘₯1+𝑀2π‘₯2+𝑏)

  Error function
  
        πΈπ‘Ÿπ‘Ÿπ‘œπ‘Ÿ(𝑦,𝑦̂ )=βˆ’π‘¦log(𝑦̂ )βˆ’(1βˆ’π‘¦)log(1βˆ’π‘¦Μ‚ )

  The function that updates the weights
  
      π‘€π‘–βŸΆπ‘€π‘–+𝛼(π‘¦βˆ’π‘¦Μ‚ )π‘₯𝑖
      π‘βŸΆπ‘+𝛼(π‘¦βˆ’π‘¦Μ‚ )

Initial plot of data


After Gradient Descent


Error rate graph
