Jacobian matrix?
drpeppurr opened this issue · 3 comments
Hello, this package looks quite interesting & promising. I took a quick look at the examples and the wrapper. It doesn't look like NumbaLSODA accepts a user-supplied Jacobian matrix?
I did a few comparisons against scipy's solve_ivp, using the BDF method with a Jacobian matrix provided. For small problems (say, a few hundred ODEs), NumbaLSODA is indeed much faster, though the performance of solve_ivp (BDF + Jacobian) isn't totally unacceptable. For larger problems (say, thousands of ODEs), NumbaLSODA is only marginally faster, or even slower. Admittedly this isn't really a fair comparison: with thousands of ODEs, supplying a Jacobian makes a huge difference for solve_ivp, and NumbaLSODA would probably still outperform solve_ivp running LSODA without a Jacobian.
But I'd imagine a user-supplied Jacobian could further improve NumbaLSODA's performance?
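For reference, here is a minimal sketch of the kind of setup I compared, using a tiny two-equation system as a stand-in for my much larger test problems (the NumbaLSODA call follows the README example; treat the exact signatures as approximate):

```python
import numpy as np
from numba import cfunc
from scipy.integrate import solve_ivp
from NumbaLSODA import lsoda, lsoda_sig  # package/API names as in the README

# Small Lotka-Volterra system standing in for the larger test problems.
def f(t, u):
    return np.array([u[0] - u[0]*u[1], u[0]*u[1] - u[1]])

def jac(t, u):
    # Analytical Jacobian passed to solve_ivp's BDF method.
    return np.array([[1.0 - u[1], -u[0]],
                     [u[1],        u[0] - 1.0]])

u0 = np.array([5.0, 0.8])
t_eval = np.linspace(0.0, 50.0, 1000)

# scipy: BDF with user-supplied Jacobian
sol = solve_ivp(f, (0.0, 50.0), u0, method='BDF', jac=jac, t_eval=t_eval)

# NumbaLSODA: right-hand side as a compiled cfunc, no Jacobian option
@cfunc(lsoda_sig)
def rhs(t, u, du, p):
    du[0] = u[0] - u[0]*u[1]
    du[1] = u[0]*u[1] - u[1]

data = np.array([0.0])  # dummy parameter array (unused by rhs)
usol, success = lsoda(rhs.address, u0, t_eval, data=data)
```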
Thank you again.
Thanks for running those tests. Adding an option for an analytical Jacobian should give a pretty massive speed-up.
I’ll look into it. Probably won’t have time to try to implement it for a month or so at least.
Another related problem I have noticed is the speed of the linear algebra routines. This package uses home-brewed linear algebra routines, which probably cause big slowdowns for large matrix inversions. Scipy links to either OpenBLAS or Intel MKL; it would be best for NumbaLSODA to do the same.
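To give a rough idea of that gap, here is a quick illustrative sketch (not NumbaLSODA's actual code): a naive Gaussian elimination compiled with Numba, standing in for a home-brewed dense solver, timed against NumPy's LAPACK-backed solve on a made-up test system.

```python
import time
import numpy as np
from numba import njit

@njit
def naive_solve(A, b):
    # Plain Gaussian elimination with partial pivoting; a stand-in for a
    # hand-written dense solver (illustrative only).
    A = A.copy()
    b = b.copy()
    n = A.shape[0]
    x = np.zeros(n)
    for k in range(n):
        # choose the pivot row and swap it into place
        p = k + np.argmax(np.abs(A[k:, k]))
        if p != k:
            for j in range(n):
                tmp = A[k, j]; A[k, j] = A[p, j]; A[p, j] = tmp
            tmp = b[k]; b[k] = b[p]; b[p] = tmp
        # eliminate entries below the pivot
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            for j in range(k, n):
                A[i, j] -= m * A[k, j]
            b[i] -= m * b[k]
    # back substitution
    for i in range(n - 1, -1, -1):
        s = b[i]
        for j in range(i + 1, n):
            s -= A[i, j] * x[j]
        x[i] = s / A[i, i]
    return x

n = 2000
A = np.random.rand(n, n) + n * np.eye(n)  # diagonally dominant test matrix
b = np.random.rand(n)

naive_solve(A[:5, :5].copy(), b[:5].copy())  # warm up the JIT compiler

t0 = time.perf_counter(); x1 = naive_solve(A, b); t1 = time.perf_counter()
t2 = time.perf_counter(); x2 = np.linalg.solve(A, b); t3 = time.perf_counter()
print(f"naive njit solve: {t1 - t0:.3f} s, LAPACK-backed solve: {t3 - t2:.3f} s")
print("max |difference|:", np.max(np.abs(x1 - x2)))
```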
Interesting discussion.
Any news on the analytical Jacobian?
I have no time right now to work on this sort of thing, sorry :(