(Chpt 1) Is this the code for backpropagation?
infinity-void6 opened this issue · 1 comment
infinity-void6 commented
# Code from backpropagation.ipynb (Chapter 1), reconstructed so it runs:
# the imports and the update_weights signature are implied by the return
# statement and the call update_weights(x, y, W, 0.01) below; feed_forward,
# x, y, and W are defined earlier in the notebook.
from copy import deepcopy
import numpy as np

def update_weights(inputs, outputs, weights, lr):
    original_weights = deepcopy(weights)
    temp_weights = deepcopy(weights)
    updated_weights = deepcopy(weights)
    # Loss with the current, unperturbed weights
    original_loss = feed_forward(inputs, outputs, original_weights)
    for i, layer in enumerate(original_weights):
        for index, weight in np.ndenumerate(layer):
            # Perturb one weight at a time and re-run the forward pass
            temp_weights = deepcopy(weights)
            temp_weights[i][index] += 0.0001
            _loss_plus = feed_forward(inputs, outputs, temp_weights)
            # One-sided finite-difference estimate of dLoss/dWeight
            grad = (_loss_plus - original_loss) / 0.0001
            updated_weights[i][index] -= grad * lr
    return updated_weights, original_loss

losses = []
for epoch in range(100):
    W, loss = update_weights(x, y, W, 0.01)
    losses.append(loss)
infinity-void6 commented
How is this code for backpropagation? There isn't any chain rule implemented. Isn't this the finite-difference method combined with gradient descent?
PS: the code can be found in backpropagation.ipynb in Chapter 1.
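For contrast, here is a minimal sketch of what chain-rule backpropagation looks like for a one-hidden-layer network with a sigmoid activation and mean squared error. The architecture, the names (backprop_step, W1, W2), and the sigmoid/MSE choice are assumptions for illustration, not the notebook's exact setup; the point is that each gradient is computed analytically, layer by layer, rather than by perturbing weights one at a time.

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def backprop_step(x, y, W1, W2, lr=0.01):
    # Forward pass: hidden layer (sigmoid) -> linear output, MSE loss
    h_in = x @ W1                     # hidden pre-activations
    h = sigmoid(h_in)                 # hidden activations
    pred = h @ W2                     # network output
    loss = np.mean((pred - y) ** 2)

    # Backward pass: apply the chain rule layer by layer
    d_pred = 2 * (pred - y) / y.size  # dLoss/dpred
    d_W2 = h.T @ d_pred               # dLoss/dW2
    d_h = d_pred @ W2.T               # dLoss/dh
    d_h_in = d_h * h * (1 - h)        # sigmoid derivative: dLoss/dh_in
    d_W1 = x.T @ d_h_in               # dLoss/dW1

    # Gradient-descent update with the analytic gradients
    W1 -= lr * d_W1
    W2 -= lr * d_W2
    return W1, W2, loss

# Example usage on tiny random data (hypothetical shapes)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 2)); y = rng.normal(size=(4, 1))
W1 = rng.normal(size=(2, 3)); W2 = rng.normal(size=(3, 1))
for epoch in range(100):
    W1, W2, loss = backprop_step(x, y, W1, W2)

If the notebook's feed_forward uses a comparable architecture, the finite-difference gradients from the quoted code and the analytic ones here should agree to within the error introduced by the 0.0001 perturbation.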