I was studying for my final exam, and there is a question in the archive whose answer I cannot find:
The order-of-growth of the running time of one algorithm is N³, and the order-of-growth of another is N². Give a reason why you might choose the N³ algorithm over the N² one.
Another thing is that some algorithms have a large constant factor. An O(N²) algorithm might have a constant factor so big that it isn't really practical to use (if N is small enough, as kindly noted by Thorban).
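As a quick illustration of that trade-off, here is a minimal sketch; the per-step constants C2 = 1000 and C3 = 1 are assumptions chosen for the example, not values from the question. It shows where the cubic cost model undercuts the quadratic one:

```python
# Hypothetical per-step constants (assumptions for illustration only)
C2 = 1000.0  # constant factor of the O(N^2) algorithm
C3 = 1.0     # constant factor of the O(N^3) algorithm

def quadratic_cost(n):
    return C2 * n**2  # cost model: C2 * N^2

def cubic_cost(n):
    return C3 * n**3  # cost model: C3 * N^3

# Setting C2*N^2 = C3*N^3 gives the break-even point N = C2/C3 = 1000:
# below it, the "slower" cubic algorithm is actually cheaper.
for n in (10, 100, 500, 10_000):
    winner = "cubic" if cubic_cost(n) < quadratic_cost(n) else "quadratic"
    print(f"N={n:>6}: quadratic={quadratic_cost(n):.1e}  cubic={cubic_cost(n):.1e}  -> {winner} wins")
```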
Adding to the already posted answers, I'd like to mention cache behaviour. A particular memory access pattern might be so much slower due to repeated cache misses that a theoretically slower algorithm with a more cache-friendly access pattern performs much better.
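A minimal sketch of that effect, assuming NumPy is available: both loops below do the same O(N²) amount of work on a C-order array, but the column-wise version reads strided memory and is typically several times slower because of cache misses.

```python
import time
import numpy as np

n = 4000
a = np.random.rand(n, n)  # C-order: each row is contiguous in memory

def traverse_rows(a):
    # Cache-friendly: every slice a[i, :] is a contiguous block of memory
    total = 0.0
    for i in range(a.shape[0]):
        total += a[i, :].sum()
    return total

def traverse_cols(a):
    # Cache-hostile: every slice a[:, j] is strided; each element access
    # jumps n * 8 bytes, so most reads miss the cache
    total = 0.0
    for j in range(a.shape[1]):
        total += a[:, j].sum()
    return total

for f in (traverse_rows, traverse_cols):
    t0 = time.perf_counter()
    f(a)
    print(f"{f.__name__}: {time.perf_counter() - t0:.3f}s")
```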
Here are examples to convince you that O(N³) can in some cases be better than O(N²).
The O(N²) algorithm is very complex to code, whereas if the input size is, say, N ≤ 100, then for practical use the O(N³) algorithm can be fast enough.
The O(N²) algorithm has a large constant multiplied into it, for example c = 1000; then for N = 100, c⋅N² = 1000⋅100² = 10⁷, whereas with c = 1 the O(N³) algorithm costs only c⋅N³ = 100³ = 10⁶ (see the timing sketch after this list).
The O(N²) algorithm has very high space complexity compared to the O(N³) one.
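To make the constant-factor example concrete, here is a hedged timing sketch; the inner loop of 1000 iterations stands in for the assumed constant c = 1000. At N = 100 the quadratic version performs 10⁷ basic steps versus 10⁶ for the cubic one, so the cubic version finishes roughly ten times faster:

```python
import timeit

def quadratic_big_constant(n):
    # O(N^2) iterations, but each one does ~1000 units of work (c = 1000)
    total = 0
    for i in range(n):
        for j in range(n):
            for _ in range(1000):  # stand-in for an expensive constant factor
                total += 1
    return total

def cubic_small_constant(n):
    # O(N^3) iterations, each one cheap (c = 1)
    total = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                total += 1
    return total

n = 100  # 1000 * 100^2 = 10^7 steps vs 100^3 = 10^6 steps
print("quadratic, c=1000:", timeit.timeit(lambda: quadratic_big_constant(n), number=1))
print("cubic,     c=1:   ", timeit.timeit(lambda: cubic_small_constant(n), number=1))
```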
The order-of-growth of the running time is not the only reason for choosing one algorithm over another. You must also analyze:
I can think of the following three reasons:
Probably the #1 reason: because the O(N²) algorithm has sufficiently higher constant factors that, for the size of task being contemplated, the O(N³) version is faster.