harthur/brain

Error Calculation Incorrect

countable opened this issue · 1 comment

In neuralnetwork.js there are a couple of places where the error vectors are not properly normalized: they are mapped to the interval [0, 1/sqrt(N)] instead of [0, 1]. As a result, for very large data sets the error reported during training is noticeably smaller than it should be.

line 181 in Layer.prototype

return Math.sqrt(sum / this.getSize())

instead of

return Math.sqrt(sum) / this.getSize()

to check this, note that each node's squared error is at most 1, so sqrt(sum_of_errors) is at most sqrt(size_of_layer); dividing by the layer size therefore caps the result at 1/sqrt(size_of_layer)

line 73 in train(), should be

error = Math.sqrt(sum / data.length); // root-mean-squared error

instead of

error = Math.sqrt(sum) / data.length; // mean squared error

to check this, note that each example's squared error is at most 1, so sqrt(sum) is at most sqrt(data.length); dividing by data.length therefore caps the reported error at 1/sqrt(data.length)

Wow, thanks for the catch!