Linear Regression algorithm for two- and multi-dimensional feature space
The first program implements Linear Regression for a single feature, using Gradient Descent to fit a line to the data and predict the dependent variable for new, unseen values of the independent variable. The output shows the training data points with the fitted line, the evolution of the Cost Function across iterations, and finally surface and contour plots of the Cost Function.
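The fitting step described above can be sketched roughly as follows. This is a minimal illustration of batch Gradient Descent on a single feature, not the program's actual code; the function and variable names (`gradient_descent`, `alpha`, `num_iters`) are placeholders.

```python
import numpy as np

def gradient_descent(x, y, alpha=0.1, num_iters=2000):
    """Fit theta for y ~ theta[0] + theta[1]*x via batch gradient descent."""
    m = len(y)
    Xb = np.c_[np.ones(m), x]                 # prepend intercept column
    theta = np.zeros(2)
    costs = []
    for _ in range(num_iters):
        error = Xb @ theta - y                # predictions minus targets
        theta -= (alpha / m) * (Xb.T @ error) # gradient step on J(theta)
        costs.append((error @ error) / (2 * m))  # squared-error cost
    return theta, costs

# Tiny synthetic example: points lying exactly on y = 2 + 3x
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 + 3 * x
theta, costs = gradient_descent(x, y)
```

With noiseless data like this, `theta` converges to approximately `[2, 3]`, and `costs` gives the per-iteration values one would plot to visualize the Cost Function dynamics.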
The second program implements Linear Regression for multiple features. It estimates the parameter vector theta using two methods: Gradient Descent and the Normal Equation. The graph shows the evolution of the Cost Function across iterations.
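The Normal Equation solves for theta in closed form, theta = (XᵀX)⁻¹ Xᵀy, with no learning rate or iterations. A minimal sketch, assuming NumPy and hypothetical data (the true coefficients below are invented for the example):

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form least-squares fit with an intercept column added."""
    m = X.shape[0]
    Xb = np.c_[np.ones(m), X]
    # lstsq solves the normal equations without forming an explicit inverse,
    # which is numerically safer when X^T X is ill-conditioned
    theta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return theta

# Two features; true relation y = 1 + 2*x1 - 0.5*x2 (chosen for illustration)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 2))
y = 1 + 2 * X[:, 0] - 0.5 * X[:, 1]
theta = normal_equation(X, y)
```

Here `theta` recovers `[1, 2, -0.5]` (intercept first). In practice Gradient Descent and the Normal Equation should agree to within the descent method's convergence tolerance, which is a useful sanity check between the two implementations.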