LinearReg-Learner

This code implements linear regression with gradient descent, minimizing the mean squared error between predicted and actual values. The x_train variable holds the input features and y_train holds the output labels; w and b are the weight and bias parameters of the linear model, respectively.
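As a rough illustration (not the repository's actual code), a minimal setup might look like the sketch below. The example data, the NumPy dependency, the zero initialization, and the 1/(2m) scaling of the cost are all assumptions.

```python
import numpy as np

# Assumed example data (illustrative only): input features and output labels.
x_train = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_train = np.array([2.0, 4.0, 6.0, 8.0, 10.0])

# Weight and bias parameters, initialized to zero (assumed starting values).
w = 0.0
b = 0.0

def compute_cost(x, y, w, b):
    """Mean squared error cost for the linear model f(x) = w*x + b."""
    m = x.shape[0]
    predictions = w * x + b
    return np.sum((predictions - y) ** 2) / (2 * m)

print(compute_cost(x_train, y_train, w, b))  # cost at the initial parameters
```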

The main loop performs gradient descent, updating w and b iteratively. On each iteration, the gradients of the cost with respect to w and b are computed, the parameters are updated using the learning rate, and the cost is recomputed with the new values. The loop prints the current iteration number, the cost, and the updated w and b.
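The loop described above could be sketched roughly as follows; the learning rate, the number of iterations, and the print interval are illustrative assumptions rather than values taken from this repository.

```python
import numpy as np

# Assumed example data and hyperparameters (illustrative only).
x_train = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_train = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
w, b = 0.0, 0.0
learning_rate = 0.01   # assumed value
num_iterations = 1000  # assumed value

m = x_train.shape[0]
for i in range(num_iterations):
    predictions = w * x_train + b
    error = predictions - y_train

    # Gradients of the mean squared error cost with respect to w and b.
    dw = np.dot(error, x_train) / m
    db = np.sum(error) / m

    # Gradient descent update of the parameters.
    w -= learning_rate * dw
    b -= learning_rate * db

    # Cost after the update, using the 1/(2m) convention from the sketch above.
    cost = np.sum((w * x_train + b - y_train) ** 2) / (2 * m)
    if i % 100 == 0:
        print(f"iteration {i}: cost={cost:.6f}, w={w:.4f}, b={b:.4f}")
```

On this toy data, the cost should decrease across iterations as w and b move toward the least-squares fit.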

This code could be useful for anyone interested in learning about linear regression or gradient descent, or for anyone who needs a simple implementation of linear regression for their projects.