ericdrowell/BigOCheatSheet

Qualitatively: O(2^n) vs O(n^2) are vastly different. (And other labeling issues.)


ogier commented

If you think in terms of orders of magnitude of difference, I think the buckets ("Horrible", "Bad", "Fair", "Good", "Excellent") are assigned incorrectly.

O(2^n) and O(n^2) are vastly different, yet both are labeled "Horrible." O(2^n) algorithms fall over long before n = 1000, while O(n^2) algorithms can be practical and useful at n = 1 million.
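To put rough numbers on that (the problem sizes below are illustrative, not taken from the cheat sheet), here is a minimal Python sketch comparing how many decimal digits n^2 and 2^n have:

```python
import math

# Illustrative sketch: how many decimal digits n^2 and 2^n have at various n.
# The chosen n values are arbitrary examples, not data from the cheat sheet.
for n in [20, 50, 100, 1000, 1_000_000]:
    digits_n2 = len(str(n * n))              # digit count of n^2
    digits_2n = int(n * math.log10(2)) + 1   # digit count of 2^n, via log10
    print(f"n = {n:>9,}:  n^2 has {digits_n2:>2} digits,  2^n has {digits_2n:,} digits")
```

At n = 50 the exponential is already a 16-digit operation count; at n = 1,000,000, n^2 is an ordinary 13-digit count while 2^n has around 300,000 digits.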

O(n log(n)) and O(n) are not very different, but one is labeled "Good" and the other "Fair". Both are practical for roughly the same class of problem, and O(n log(n)) algorithms are often preferred in practice over O(n) algorithms because the theoretical complexity gap is dominated by other factors (constant factors, memory behavior, and so on).
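For scale (my numbers, not the cheat sheet's), the entire gap between n·log(n) and n is a single factor of log(n), which stays small even for enormous inputs:

```python
import math

# Illustrative sketch: the ratio (n * log2(n)) / n is just log2(n),
# which grows very slowly. The n values are arbitrary examples.
for n in [1_000, 1_000_000, 1_000_000_000]:
    print(f"n = {n:>13,}:  n*log2(n) is only ~{math.log2(n):.0f}x larger than n")
```

Even at a billion elements the difference is roughly a factor of 30, which constant factors and memory behavior can easily swamp.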

O(log(n)) and O(1) are actually quite different, yet both are labeled "Excellent." This is the least worrying of the three, but if any factor of log(n) is important enough to warrant separate labels, it's this first one, from O(1) to O(log(n)), mostly because of CPU caches and memory accesses: a small constant number of memory accesses is relatively cheap, while O(log(n)) memory accesses can cause dramatically more cache misses and drag down the performance of a whole program.
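As a rough illustration of that last point (hypothetical data, not an example from the cheat sheet): a hash-table lookup touches a small, constant number of memory locations, while a binary search over a large sorted array makes about log2(n) scattered probes, each a potential cache miss:

```python
from bisect import bisect_left

# Hypothetical data for illustration: a sorted array of 1,000,000 even keys
# and a dict mapping each key to its index.
n = 1_000_000
keys = list(range(0, 2 * n, 2))
table = {k: i for i, k in enumerate(keys)}

def lookup_constant(k):
    # O(1): a hash computation plus a constant number of memory probes.
    return table[k]

def lookup_logarithmic(k):
    # O(log n): ~20 probes for n = 1,000,000, scattered across the array,
    # so each probe can miss the cache on large inputs.
    i = bisect_left(keys, k)
    return i if i < len(keys) and keys[i] == k else None

print(lookup_constant(123456), lookup_logarithmic(123456))
```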