NaturalNode/apparatus

Infinite loop: learningRateFound stays false.

mvillaizan opened this issue · 1 comment

When training a logistic regression classifier with the following single document:

data: 'If you met a Genie who offered you three wishes, what would you wish for? more wishes does not count'
classification: 'If you met a Genie who offered you three wishes, what would you wish for? more wishes does not count'
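Roughly, the training call looks like this (a sketch: I'm assuming the natural wrapper around apparatus's LogisticRegressionClassifier here, so the exact setup may differ):

var natural = require('natural'); // natural's classifier delegates to apparatus

var classifier = new natural.LogisticRegressionClassifier();

// the same string is used as both the document and its classification
classifier.addDocument(
    'If you met a Genie who offered you three wishes, what would you wish for? more wishes does not count',
    'If you met a Genie who offered you three wishes, what would you wish for? more wishes does not count'
);

classifier.train(); // never returns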

The function descendGradient never sets learningRateFound to true, so the outer loop never exits and the process hangs.

Any ideas why? Does tweaking the learningRate value have any impact?

function descendGradient(theta, Examples, classifications) {
    var maxIt = 500 * Examples.rows();
    var last;
    var current;
    var learningRate = 3;
    var learningRateFound = false;

    Examples = Matrix.One(Examples.rows(), 1).augment(Examples);
    theta = theta.augment([0]);

    while(!learningRateFound) {
        var i = 0;
        last = null;

        while(true) {
            var hypothesisResult = hypothesis(theta, Examples);
            theta = theta.subtract(Examples.transpose().x(
                hypothesisResult.subtract(classifications)).x(1 / Examples.rows()).x(learningRate));
            current = cost(theta, Examples, classifications);

            i++;

            if(last) {
                if(current < last)
                    learningRateFound = true;
                else
                    break;

                if(last - current < 0.0001)
                    break;
            }

            if(i >= maxIt) {
                throw 'unable to find minimum';
            }

            last = current;
        }

        learningRate /= 3;
    }

    return theta.chomp(1);
}
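
Tracing the flow, my guess at what happens: if the cost never strictly decreases between two consecutive inner iterations (for instance the gradient is already zero for this single example, so current equals last), the inner loop breaks on its second pass, learningRateFound stays false, and the outer loop keeps dividing learningRate by 3 forever, so the maxIt throw is never reached. A bound on the outer search would at least turn the hang into an error. A small self-contained sketch of that idea (not the merged fix; findLearningRate and MIN_RATE are illustrative names, not apparatus code):

// Sketch: shrink the step size looking for a strict cost decrease,
// but give up once the step size becomes negligible instead of looping forever.
function findLearningRate(costAt, initialRate) {
    var rate = initialRate;
    var MIN_RATE = 1e-12; // hypothetical lower bound on the step size

    while (rate >= MIN_RATE) {
        // one trial step: does the cost strictly decrease at this rate?
        if (costAt(rate) < costAt(0)) {
            return rate; // analogous to learningRateFound = true
        }
        rate /= 3; // same shrink factor descendGradient uses
    }

    // analogous to the existing maxIt guard, but for the outer search
    throw 'unable to find a learning rate that decreases the cost';
}

// With a flat cost (gradient already zero) the original outer loop spins forever;
// this version throws instead.
try {
    findLearningRate(function (rate) { return 0; }, 3);
} catch (e) {
    console.log(e);
}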

just merged the fix