Training fails to converge
Closed this issue · 2 comments
xiaohuyz commented
I used the default data and only changed the number of cores. However, the training accuracy hovers around 60%. When I switch the algorithm to Adam, it gets even worse: accuracy stays at 12% and does not improve.
neurosim commented
For Adam, you might need to reduce the learning rate to something like 0.1 or 0.05. The default data (a-Ag RRAM) has pretty bad non-linearity, so it may not converge very well.
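For reference, this is what the learning rate controls in Adam. The sketch below is textbook Adam in plain Python, not NeuroSim's actual C++ implementation; the `lr` value 0.05 follows the suggestion above, and all names are illustrative.

```python
# Minimal sketch of a single Adam update for one scalar weight
# (textbook Adam; NeuroSim's real implementation is in C++).
def adam_step(w, grad, m, v, t, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    # The step size scales directly with lr, so lowering lr shrinks the
    # weight updates, which matters when the device response is non-linear.
    w = w - lr * m_hat / (v_hat ** 0.5 + eps)
    return w, m, v
```

Because the bias-corrected update is roughly `lr * sign(grad)` early in training, a large `lr` forces large conductance jumps on every step, which a strongly non-linear device like the default a-Ag RRAM cannot track accurately.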
xiaohuyz commented
Thanks, I see.