Logistic_regression
This code represents my approach to implementing gradient descent optimization for logistic regression.
Brief summary of the code
- Only the numpy library is used, for the array computations
- Includes functions for the sigmoid, the logistic regression loss, and the gradient descent optimization algorithm (a sketch of how these pieces fit together is shown after this list)
- Includes a test with sample 1D data
- Parameters are initialized randomly
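The exact names and signatures in LogisticRegression.py are not reproduced here; the following is only a minimal sketch of how the listed pieces could be wired together for 1D data. The function names, learning rate, and initialization scheme are illustrative assumptions, not the repository's actual code:

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def loss(y, y_hat):
    # Binary cross-entropy (the logistic regression loss), summed over the samples
    eps = 1e-12  # small constant to avoid log(0)
    return -np.sum(y * np.log(y_hat + eps) + (1 - y) * np.log(1 - y_hat + eps))

def gradient_descent(x, y, lr=0.1, steps=2000):
    # Random initialization of the weight and bias (hypothetical choice of distribution)
    w = np.random.randn()
    b = np.random.randn()
    n = x.shape[0]
    for _ in range(steps):
        y_hat = sigmoid(w * x + b)           # forward pass for 1D inputs
        dw = np.sum((y_hat - y) * x) / n     # average gradient of the loss w.r.t. w
        db = np.sum(y_hat - y) / n           # average gradient of the loss w.r.t. b
        w -= lr * dw                         # gradient descent update
        b -= lr * db
    return w, b, loss(y, sigmoid(w * x + b))
```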
What will you see when you run the code?
When you execute the code via `python LogisticRegression.py`, the algorithm runs on a sample test dataset and reports the weight, bias, and loss after 2000 steps, for example:
Weight = -2.5991951619473026
Bias = 0.2703338156154623
Loss after the run = 5.404006857942449
Every time you run the code you will see a different result, because the weight w is initialized randomly inside the function.
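Reusing the illustrative gradient_descent sketch from above, a test block of roughly this shape (the sample data and print statements are assumptions, not the repository's actual test) would produce output in the format shown and, because of the random initialization, different numbers on each run:

```python
# Hypothetical test harness; the actual sample data in the repository may differ.
if __name__ == "__main__":
    x = np.array([0.5, 1.5, 2.0, 3.0, 3.5, 4.5])  # sample 1D inputs
    y = np.array([0, 0, 0, 1, 1, 1])               # binary labels
    w, b, final_loss = gradient_descent(x, y, lr=0.1, steps=2000)
    print("Weight =", w)
    print("Bias =", b)
    print("Loss after the run =", final_loss)
```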
Note:
In addition to the Python file, I also provide a Jupyter notebook for studying (or testing) the functions.