Crash in UpdateLearningRate
bratao opened this issue · 5 comments
Hello!
Testing the latest commit, I get a crash in UpdateLearningRate:
Unhandled Exception: System.AggregateException: One or more errors occurred. ---> System.AggregateException: One or more errors occurred. ---> System.IndexOutOfRangeException: Index was outside the bounds of the array.
at RNNSharp.RNN.UpdateLearningRate(Matrix`1 m, Int32 i, Int32 j, Double delta) in C:\sbuild\mine\backend_broka\Candidatos\RNNSharp-master\RNNSharp\RNN.cs:line 101
at RNNSharp.LSTMRNN.<LearnOutputWeight>b__25_0(Int32 i) in C:\sbuild\mine\backend_broka\Candidatos\RNNSharp-master\RNNSharp\LSTMRNN.cs:line 609
In LSTMRNN.cs, line 601 does this:
Parallel.For(0, L1, parallelOption, i =>
{
double cellOutput = neuHidden[i].cellOutput;
for (int k = 0; k < L2; k++)
{
double delta = NormalizeGradient(cellOutput * OutputLayer.er[k]);
double newLearningRate = UpdateLearningRate(Hidden2OutputWeightLearningRate, i, k, delta);
Hidden2OutputWeight[k][i] += newLearningRate * delta;
}
});
But at line 517, it is initialized like this:
Hidden2OutputWeightLearningRate = new Matrix<float>(L2, L1);
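For context, here is a minimal standalone sketch of why the argument order matters. This is not RNNSharp code: a jagged array stands in for Matrix<float>, the L1/L2 values are made up, and it assumes Matrix<float>(rows, cols) allocates `rows` rows of `cols` columns and that UpdateLearningRate(m, i, j, ...) reads m[i][j], which is what the stack trace at RNN.cs line 101 points at.
// Standalone sketch (jagged array in place of Matrix<float>; sizes are arbitrary,
// the only point is that L1 != L2):
int L1 = 200;   // hidden layer size, the Parallel.For bound
int L2 = 10;    // output layer size, the row count of the matrix
float[][] lr = new float[L2][];               // like new Matrix<float>(L2, L1)
for (int r = 0; r < L2; r++) lr[r] = new float[L1];
for (int i = 0; i < L1; i++)                  // i plays the role of the hidden index
    for (int k = 0; k < L2; k++)              // k plays the role of the output index
    {
        float v = lr[i][k];                   // what UpdateLearningRate(lr, i, k, ...) does:
                                              // once i >= L2 the row index is out of range
    }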
Thanks, Bratao.
This line at LSTMRNN.cs line #607 has a bug:
double newLearningRate = UpdateLearningRate(Hidden2OutputWeightLearningRate, i, k, delta);
It should be:
double newLearningRate = UpdateLearningRate(Hidden2OutputWeightLearningRate, k, i, delta);
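With that swap applied, the loop from the report would read as follows (the same code as above, only the argument order passed to UpdateLearningRate changes):
Parallel.For(0, L1, parallelOption, i =>
{
    double cellOutput = neuHidden[i].cellOutput;
    for (int k = 0; k < L2; k++)
    {
        double delta = NormalizeGradient(cellOutput * OutputLayer.er[k]);
        // k (< L2) is now the row index and i (< L1) the column index,
        // matching the Matrix<float>(L2, L1) layout of Hidden2OutputWeightLearningRate
        double newLearningRate = UpdateLearningRate(Hidden2OutputWeightLearningRate, k, i, delta);
        Hidden2OutputWeight[k][i] += newLearningRate * delta;
    }
});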
You can modify it in your private build; I will fix it tonight.
Thanks
Zhongkai Fu
The bug has been fixed. Could you please try it again?
@zhongkaifu, thank you!!! It's not crashing anymore.
However, compared to yesterday's version, my LSTM does not converge anymore =(
Using RNN, I get a crash. I created another issue reporting that bug.
@bratao , thanks for letting me know another problem.
I have reverted the previous change list and checked in only the LSTM crash fix, along with a few dynamic learning rate optimizations. Could you please try it again? Thanks in advance.
It is working great!!! Thank you so much!!