This weekend I had nothing much to do and did not know what to do with myself, so I decided to run a random experiment with the weights of a simple neural network. I basically trained two NNs to classify 3 classes: 0, 1, and 2. The first model was trained on data containing only classes 0 and 1, plus just one example of class 2. The second model was trained on data containing only classes 0 and 2, plus just one example of class 1. Then I tried combining their weights to improve the performance of the model.
w_0and1 : weights of the model trained on data containing classes 0 and 1.
w_0and2 : weights of the model trained on data containing classes 0 and 2.
w : combined model weights.
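
For reference, here is a minimal sketch of the kind of setup I mean. The toy data, MLP architecture, and training hyperparameters below are placeholders picked for illustration, not necessarily the exact ones I used:

```python
import torch
import torch.nn as nn

# Synthetic 3-class toy data (placeholder for whatever dataset is actually used).
X, y = [], []
for c in range(3):
    X.append(torch.randn(200, 2) + 4.0 * c)
    y.append(torch.full((200,), c, dtype=torch.long))
X, y = torch.cat(X), torch.cat(y)

def make_mlp():
    # Small MLP with a 3-way output head.
    return nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 3))

def subset(X, y, keep, extra_class, n_extra=1):
    # Keep two classes fully, plus a single example of the third class.
    mask = (y == keep[0]) | (y == keep[1])
    extra_idx = (y == extra_class).nonzero(as_tuple=True)[0][:n_extra]
    idx = torch.cat([mask.nonzero(as_tuple=True)[0], extra_idx])
    return X[idx], y[idx]

def train(model, X, y, epochs=200, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    return model

model_01 = train(make_mlp(), *subset(X, y, keep=(0, 1), extra_class=2))
model_02 = train(make_mlp(), *subset(X, y, keep=(0, 2), extra_class=1))
w_0and1 = model_01.state_dict()
w_0and2 = model_02.state_dict()
```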
Results:

- w = alpha*w_0and1 + (1 - alpha)*w_0and2 :
Clearly, combining the models this way was not a good idea, as the accuracy dropped significantly.
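
A sketch of what this linear blend looks like on the state dicts, under the same assumptions as the setup above:

```python
# Linear interpolation of two state dicts: w = alpha*w_0and1 + (1 - alpha)*w_0and2.
def blend_linear(w_0and1, w_0and2, alpha):
    return {k: alpha * w_0and1[k] + (1 - alpha) * w_0and2[k] for k in w_0and1}

# combined = make_mlp()
# combined.load_state_dict(blend_linear(w_0and1, w_0and2, alpha=0.5))
# then evaluate `combined` on the full 3-class data
```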
- w = alpha*(w_0and1)^2 + (1 - alpha)*(w_0and2)^2 :
Again I got terrible results.
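
The same kind of sketch for this variant, reading the squares as element-wise squares of the state-dict tensors:

```python
# Blend of squared weights: w = alpha*w_0and1^2 + (1 - alpha)*w_0and2^2 (element-wise).
# Note that squaring discards the sign of every weight.
def blend_squared(w_0and1, w_0and2, alpha):
    return {k: alpha * w_0and1[k] ** 2 + (1 - alpha) * w_0and2[k] ** 2 for k in w_0and1}
```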
- w = ((w_0and1)^n + (w_0and2)^n) / 2 :
Now I got something interesting. These spikes are very intriguing, and so far I have not found any explanation for them. The pattern is that accuracy spikes at even values of n, with higher accuracy than at the nearby odd values of n.
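
This is roughly the sweep that produces those numbers, again sketched under the assumptions above (element-wise powers of the state-dict tensors; the evaluation helper is a placeholder):

```python
# Sweep n for w = (w_0and1^n + w_0and2^n) / 2 (element-wise powers) and
# evaluate the combined model on the full 3-class data at each n.
def blend_power(w_0and1, w_0and2, n):
    return {k: (w_0and1[k] ** n + w_0and2[k] ** n) / 2 for k in w_0and1}

def accuracy(model, X, y):
    with torch.no_grad():
        return (model(X).argmax(dim=1) == y).float().mean().item()

for n in range(1, 11):
    combined = make_mlp()
    combined.load_state_dict(blend_power(w_0and1, w_0and2, n))
    print(n, accuracy(combined, X, y))
```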