A simple linear regression fitted with the gradient descent algorithm.
The two equations above in Figure-1 are the partial derivatives of the cost function with respect to the slope of the line m and its intercept b. They are fundamental to gradient descent: the rate of change of the cost with respect to m and b tells us how to adjust both parameters, step by step, toward the line that best fits our data points.
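As a point of reference, here is a minimal sketch of what those derivatives look like in code, assuming the cost function is the mean squared error J(m, b) = (1/N) Σ (yᵢ − (m·xᵢ + b))²; the function and variable names below are illustrative, not taken from the original source.

```python
import numpy as np

def gradients(m, b, x, y):
    """Partial derivatives of the MSE cost with respect to m and b.

    J(m, b) = (1/N) * sum((y - (m*x + b))**2)
    dJ/dm   = (-2/N) * sum(x * (y - (m*x + b)))
    dJ/db   = (-2/N) * sum(y - (m*x + b))
    """
    n = len(x)
    error = y - (m * x + b)
    dm = (-2.0 / n) * np.sum(x * error)
    db = (-2.0 / n) * np.sum(error)
    return dm, db

def step(m, b, x, y, learning_rate=0.0001):
    """One gradient descent step: move m and b against the gradient."""
    dm, db = gradients(m, b, x, y)
    return m - learning_rate * dm, b - learning_rate * db
```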
The graph above in Figure-2 plots the x and y values from the data.csv file. Our goal is to use gradient descent to find the line of best fit, a straight line through the scatter of points that captures the relationship between the x and y values.
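A rough sketch of how the scatter in Figure-2 might be reproduced, assuming data.csv holds two comma-separated columns of x and y values with no header row (adjust the loading call if the real file differs):

```python
import numpy as np
import matplotlib.pyplot as plt

# Load the two columns (x, y) from data.csv.
points = np.genfromtxt("data.csv", delimiter=",")
x, y = points[:, 0], points[:, 1]

# Scatter plot of the raw data, as in Figure-2.
plt.scatter(x, y, s=10)
plt.xlabel("x")
plt.ylabel("y")
plt.title("Raw data points")
plt.show()
```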
The graph above in Figure-3 shows a straight line running through the data points, which is exactly what we want to achieve!
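Putting the pieces together, here is a hedged end-to-end sketch of a fit like the one in Figure-3, reusing the `step` helper and the `x`, `y` arrays from the sketches above; the learning rate and iteration count are illustrative guesses, not values from the original source:

```python
# Fit the line with repeated gradient descent steps, then plot it.
m, b = 0.0, 0.0
for _ in range(1000):
    m, b = step(m, b, x, y, learning_rate=0.0001)

plt.scatter(x, y, s=10, label="data")
plt.plot(x, m * x + b, color="red", label=f"fit: y = {m:.3f}x + {b:.3f}")
plt.legend()
plt.title("Line of best fit via gradient descent")
plt.show()
```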