andersbll/nnet

L2 regularization problem

Opened this issue · 2 comments

Hello, andersbll:

Thanks for your code. It has been very useful for me.
I read through the code and would like to ask a question.

Line 68 in layers.py:
self.dW = np.dot(self.last_input.T, output_grad)/n - self.weight_decay*self.W
For L2 regularization, I think this line should be modified to
self.dW = np.dot(self.last_input.T, output_grad)/n + self.weight_decay*self.W
Could you explain why you chose to use "- self.weight_decay*self.W"?
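My reasoning, as a minimal sketch (assuming the optimizer updates with W -= learning_rate * dW; the names below are hypothetical and not code from this repo):

import numpy as np

def l2_regularized_gradient(data_grad, W, weight_decay):
    # loss J(W) = L(W) + (weight_decay / 2) * ||W||^2
    # => dJ/dW = dL/dW + weight_decay * W
    return data_grad + weight_decay * W

def sgd_step(W, data_grad, weight_decay, learning_rate):
    dW = l2_regularized_gradient(data_grad, W, weight_decay)
    return W - learning_rate * dW  # the "+ weight_decay * W" term shrinks W towards zero

So if dW is later subtracted from W, the weight-decay term should enter with a plus sign.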

B.R
heibanke

Another problem:

helpers.py:

def tanh_d(x):
    e = np.exp(2*x)
    return (e-1)/(e+1)

I think it should be modified into the following code:

def tanh_d(x):
    e = tanh(x)
    return 1-e**2
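
A quick sanity check of the corrected derivative (just a sketch, using np.tanh instead of the tanh helper in helpers.py): the fixed tanh_d should match a central-difference approximation of tanh.

import numpy as np

def tanh_d_fixed(x):
    t = np.tanh(x)
    return 1.0 - t ** 2

x = np.linspace(-3.0, 3.0, 13)
eps = 1e-6
numerical = (np.tanh(x + eps) - np.tanh(x - eps)) / (2 * eps)  # central difference
print(np.allclose(tanh_d_fixed(x), numerical, atol=1e-6))      # prints True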

B.R
heibanke

983 commented

I was wondering about the minus sign, too.

Also, I am confused about the division by n, although it probably doesn't matter since it effectively only rescales the learning rate.
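
To illustrate what I mean (my own toy example, not code from the repo): averaging the gradient over the batch just rescales it by a constant factor of n, which a fixed learning rate can absorb.

import numpy as np

rng = np.random.default_rng(0)
last_input = rng.standard_normal((32, 5))    # batch of n = 32 inputs
output_grad = rng.standard_normal((32, 3))
W = rng.standard_normal((5, 3))
n = last_input.shape[0]

dW_mean = np.dot(last_input.T, output_grad) / n
dW_sum = np.dot(last_input.T, output_grad)

lr = 0.1
print(np.allclose(W - lr * dW_mean, W - (lr / n) * dW_sum))  # True: same update, rescaled learning rate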

def tanh_d(x):
    e = np.exp(2*x)
    return (e-1)/(e+1)

This seems to be the same as tanh itself, so I think you are right.
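
A quick numerical check of that (sketch):

import numpy as np

x = np.linspace(-3.0, 3.0, 13)
e = np.exp(2 * x)
print(np.allclose((e - 1) / (e + 1), np.tanh(x)))  # True: the current tanh_d is just tanh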

One of the reasons tanh and sigmoid are used as activation functions is that their derivatives can be computed from the output of the forward pass without evaluating an expensive function again, but that is not done here.
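
What I mean is something like this (a sketch of the general idea, not this repo's layer API): if the forward pass caches its output, the derivative can be computed directly from that cached value.

import numpy as np

def tanh_fprop(x):
    return np.tanh(x)

def tanh_bprop_from_output(y):
    # y is the cached forward output tanh(x); d/dx tanh(x) = 1 - tanh(x)^2
    return 1.0 - y ** 2

def sigmoid_fprop(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_bprop_from_output(y):
    # y is the cached forward output sigmoid(x); d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
    return y * (1.0 - y)

# usage sketch: compute y once in the forward pass, reuse it in the backward pass
x = np.linspace(-2.0, 2.0, 5)
y = tanh_fprop(x)
upstream_grad = np.ones_like(y)
grad_wrt_x = tanh_bprop_from_output(y) * upstream_grad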