TensorFlow implementation of linear regression.
This is the code for single-variable linear regression that minimises the mean squared error using SGD (Stochastic Gradient Descent). You can find the demo at https://tensorflow-linear-regression.herokuapp.com/
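The optimisation described above can be sketched in plain Python. This is only an illustration of the maths being minimised, not the demo's actual TensorFlow code; the data, learning rate, and epoch count below are arbitrary illustrative choices:

```python
# Illustrative sketch: single-variable linear regression y = w * x + b,
# fitted by minimising the mean squared error with gradient descent.
# (The real demo uses TensorFlow; this shows only the underlying maths.)

def train(xs, ys, learning_rate=0.05, epochs=2000):
    """Fit w and b so that w * x + b approximates y."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Errors of the current predictions.
        errors = [(w * x + b) - y for x, y in zip(xs, ys)]
        # Gradients of the MSE with respect to w and b.
        grad_w = 2.0 / n * sum(e * x for e, x in zip(errors, xs))
        grad_b = 2.0 / n * sum(errors)
        # One gradient-descent step.
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b
    return w, b

# Data generated from Y = 6 * X + 2, the relation the demo's defaults follow.
xs = [0, 1, 2, 3, 4]
ys = [6 * x + 2 for x in xs]
w, b = train(xs, ys)
print(w, b)  # converges close to 6 and 2
```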
The demo is divided into three parts.
Since we are using a variant of gradient descent, in the training part of the demo you can provide the training data, the learning rate, and the number of epochs.
There is also a field for entering an X value to test the model on; it shows the predicted value and the expected value, as the default values are supposed to be generated with Y = 6 * X + 2.
The last part is a graph that plots the training data set with the predicted (X, Y) pair alongside the linear regression line.
Express and other dependencies for the Heroku deployment.
- To use the default training data to predict on a new X value, enter the value in the prediction X field.
- To provide your own data set, enter two arrays of the same length.
Example:
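An illustrative pair of equal-length arrays (hypothetical values, chosen here to follow the same Y = 6 * X + 2 relation as the default data; any pair works as long as the lengths match):

```python
# Illustrative input arrays for the demo (hypothetical values,
# generated from Y = 6 * X + 2 like the default data).
X = [1, 2, 3, 4, 5]
Y = [8, 14, 20, 26, 32]

# The demo requires both arrays to have the same length.
assert len(X) == len(Y)
```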
You don't need to specify the learning rate and epochs, but it is good to mess around with them, since it can give you a very cool understanding of deep learning.
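One thing such experimenting can show is that the learning rate controls how fast training converges. A plain-Python sketch (not the demo's TensorFlow code; both learning-rate values are arbitrary illustrative choices) comparing the mean squared error after the same number of epochs:

```python
# Compare how far gradient descent gets in a fixed number of epochs
# for two different learning rates (illustrative values only).

def mse_after_training(xs, ys, learning_rate, epochs):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        errors = [(w * x + b) - y for x, y in zip(xs, ys)]
        w -= learning_rate * 2.0 / n * sum(e * x for e, x in zip(errors, xs))
        b -= learning_rate * 2.0 / n * sum(errors)
    # Mean squared error of the final model.
    return sum(((w * x + b) - y) ** 2 for x, y in zip(xs, ys)) / n

xs = [0, 1, 2, 3, 4]
ys = [6 * x + 2 for x in xs]

slow = mse_after_training(xs, ys, learning_rate=0.01, epochs=200)
fast = mse_after_training(xs, ys, learning_rate=0.05, epochs=200)
# The larger (but still stable) learning rate gets closer in the same budget.
print(slow > fast)  # True
```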
- Now for the prediction part, just enter the X value in the field and it will update the graph and the expected and predicted values.
Example:
- Predicting for X = 25
- Predicting for X = -10
- Predicting for X = 10
- Predicting for X = 60
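Since the default data follows Y = 6 * X + 2, the expected values for these inputs can be computed directly, and the model's predictions should land close to them:

```python
# Expected values for the sample inputs under Y = 6 * X + 2.
for x in [25, -10, 10, 60]:
    print(x, 6 * x + 2)
# 25 -> 152, -10 -> -58, 10 -> 62, 60 -> 362
```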
- Now the last part is something called exploding gradients, which you might notice if you use a large learning rate with a large number of epochs.
As you can see, the neural net (even though it has only one layer) outputs NaN (Not a Number) and is not able to produce the data for the chart.
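A plain-Python sketch of how this happens (again, not the demo's TensorFlow code; the learning rate of 1.0 is an arbitrary oversized choice): each step overshoots the minimum by more than it corrects, the parameters grow without bound, and the arithmetic eventually produces infinities and NaN:

```python
import math

# The same gradient-descent loop, but with a deliberately oversized
# learning rate (1.0, illustrative) so each step overshoots and the
# parameters diverge instead of converging.
xs = [0, 1, 2, 3, 4]
ys = [6 * x + 2 for x in xs]

w, b = 0.0, 0.0
n = len(xs)
for _ in range(500):
    errors = [(w * x + b) - y for x, y in zip(xs, ys)]
    w -= 1.0 * 2.0 / n * sum(e * x for e, x in zip(errors, xs))
    b -= 1.0 * 2.0 / n * sum(errors)

# The parameters have exploded: they are no longer finite numbers,
# which is why the demo's chart gets NaN instead of plottable data.
print(math.isfinite(w), math.isfinite(b))  # False False
```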
In order to get more info about this problem, here is a very useful link.
Credits go to: