harrisiirak/cron-parser

1200% slower performance of next() and prev() after upgrading from 2.3 to 4.6

hueniverse opened this issue · 7 comments

We recently upgraded the module from 2.3.7 to 4.6.0 and saw a huge performance degradation. When calling the next() method 10000 times, our benchmark went from 5000ms to 68000ms. Will provide more info as we dig deeper.
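For reference, a minimal sketch of the kind of benchmark described above. A trivial stub stands in for the cron-parser interval object so the snippet is self-contained; in the real benchmark you would create the interval with cron-parser's `parseExpression` instead:

```javascript
// Hypothetical stand-in for cron-parser's interval object, used here
// only so this snippet runs without the package installed. In a real
// benchmark, replace it with:
//   const interval = require('cron-parser').parseExpression('*/5 * * * *');
function makeStubInterval() {
  let t = Date.now();
  return {
    next() {
      t += 5 * 60 * 1000; // pretend the next match is 5 minutes later
      return new Date(t);
    },
  };
}

const interval = makeStubInterval();
const start = process.hrtime.bigint();
for (let i = 0; i < 10000; i++) {
  interval.next();
}
const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
console.log(`10000 next() calls: ${elapsedMs.toFixed(1)}ms`);
```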

Hi @hueniverse!

Thanks for reporting this. A 12x performance degradation indeed doesn't sound good.

I tested it briefly in my local environment and saw a 3x - 5x slowdown, depending on the input expression. Individual next / prev calls still mostly take less than 1ms.

However, I tested my main hypothesis and benchmarked the 2.18.0 and 3.0.0 versions (code diff). There is a drastic performance degradation in 3.0.0, where we switched from moment to luxon. I had expected luxon to be faster than moment.

luxon@v2 is also way slower than luxon@v1 - I pinned 3.0.0 to use luxon@v2 and saw even worse performance. I'll investigate whether there is something wrong with our luxon usage.
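Pinning a transitive dependency like luxon to a specific major version can be done with npm's `overrides` field in package.json (illustrative snippet; the exact version to pin is an assumption):

```json
{
  "overrides": {
    "luxon": "2.5.2"
  }
}
```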

Thanks @harrisiirak. Lmk if I can help.

Just a quick status update.

I looked more in-depth at luxon's inner workings and how it handles date-time (and timezone) calculations - I don't think there is anything that could be improved in how luxon is used in this library.

However, I think we can greatly reduce the number of calls we make to calculate/create new date objects during the iteration process. When a date is iterated, all the calculations happen in 1-step (second, minute, hour, etc.) increments/decrements. This is a lazy approach and creates lots of unnecessary date-related operations, which is bad for performance. The solution could be to jump over all the date component values that can't match anyway. I've created a local POC, but as expected, some of the tests are now failing - I'll look into this when I have a bit more time (and brainpower).
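The jump-ahead idea could be sketched like this (hypothetical helper, not the library's actual implementation): instead of incrementing a date component one step at a time and re-checking, jump straight to the next value allowed by the cron field, carrying into the next-larger unit on rollover:

```javascript
// Sketch of skipping non-matching date component values. `allowed` is
// the sorted list of values a cron field matches (e.g. "*/15" on the
// minute field gives [0, 15, 30, 45]).
function nextAllowed(value, allowed) {
  for (const v of allowed) {
    // First allowed value at or after the current one: jump there.
    if (v >= value) return { value: v, rollover: false };
  }
  // No match left in this cycle: wrap to the smallest allowed value
  // and signal a carry into the next unit (e.g. minute -> hour).
  return { value: allowed[0], rollover: true };
}

console.log(nextAllowed(7, [0, 15, 30, 45]));  // { value: 15, rollover: false }
console.log(nextAllowed(50, [0, 15, 30, 45])); // { value: 0, rollover: true }
```

This replaces up to 59 single-minute increments (each creating a new date object) with one jump per field, which is where the bulk of the savings would come from.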

@harrisiirak Do you have a status update on this issue?

Hi @threes-was-taken!

Sadly, I haven't had time to look into this issue further.

@harrisiirak
Hi! Thanks for the quick response!
No problem at all, was just wondering since we're facing these issues as well :)

Hi! Sorry, any updates on this issue?