Gradient-descent-feature-scaling

A way to speed up gradient descent is having each feature in the same range.

Primary language: MATLAB · License: GNU General Public License v3.0 (GPL-3.0)

Gradient descent feature scaling

Abstract

A way to speed up gradient descent is to have every feature in a similar range. There are two techniques for doing this: feature scaling and mean normalization. The two can be combined using this formula:

x_i := (x_i − μ_i) / s_i

Being s_i = max(x_i) − min(x_i) (the range of the feature) or s_i = σ_i (its standard deviation), and μ_i the mean of the feature over the training set.

And being i the index of the feature.
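The repo's implementation is in MATLAB, but the same idea can be sketched in plain Python (standard library only, standard deviation chosen as s_i) to show what the formula does column by column:

```python
from statistics import mean, pstdev

def feature_scale(X):
    """Mean-normalize each feature (column) of X: x_i := (x_i - mu_i) / s_i,
    here using the population standard deviation as s_i."""
    cols = list(zip(*X))                    # one tuple per feature
    mus = [mean(c) for c in cols]           # mu_i: mean of feature i
    sigmas = [pstdev(c) for c in cols]      # s_i: std deviation of feature i
    return [
        [(x - mu) / s for x, mu, s in zip(row, mus, sigmas)]
        for row in X
    ]

# Two features with very different ranges.
X = [[1.0, 100.0],
     [2.0, 200.0],
     [3.0, 300.0]]
X_norm = feature_scale(X)
# After scaling, each column has mean 0 and the two features
# share the same range, which helps gradient descent converge.
```

`feature_scale` and the sample matrix are illustrative names, not part of the repo; a vectorized MATLAB version would compute the same thing with `mean`, `std`, and element-wise division.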

So in this repo you will find a vectorized implementation of the technique described above.