I have the following algorithm, and its runtime complexity is O(N^2), but I want a deeper understanding of it rather than just memorizing common runtimes.

What would be the right approach to break it down and analyze it?
Take a pencil and paper and write the loops out unwrapped:
i             inner loops per i
-------------------------------
1             length - 1
2             length - 2
..            ..
k             length - k
..            ..
length - 1    1
length        0
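The question's algorithm isn't shown, but any pair of nested loops with this iteration pattern fits. Here is a minimal Java sketch of that shape (the array values and the comparison in the body are my own assumptions):

    // Hypothetical shape of the algorithm (the original code wasn't shown):
    // for each outer index i, the inner loop body runs exactly
    // length - i times, matching the table above.
    class LoopShape {
        public static void main(String[] args) {
            int[] array = {4, 2, 7, 1};
            int length = array.length;
            for (int i = 1; i <= length; i++) {
                for (int j = i; j < length; j++) {
                    // constant-time work per iteration, e.g. a comparison
                    if (array[i - 1] > array[j]) {
                        System.out.println(array[i - 1] + " > " + array[j]);
                    }
                }
            }
        }
    }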
Now, in order to obtain the total time required, let's sum up the inner loops:
(length - 1) + (length - 2) + ... + (length - k) + ... + 1 + 0
It's an arithmetic progression with length terms, and its sum is (first term + last term) / 2 * number of terms:

((length - 1) + 0) / 2 * length = length**2 / 2 - length / 2 = O(length**2)
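To sanity-check the closed form, here is a small Java sketch (my own, assuming the loop shape above) that counts the inner iterations directly and compares the count with length * (length - 1) / 2:

    // Count inner-loop iterations and compare with the arithmetic-series sum.
    class SumCheck {
        public static void main(String[] args) {
            for (int length = 1; length <= 10; length++) {
                long count = 0;
                for (int i = 1; i <= length; i++) {
                    for (int j = i; j < length; j++) {
                        count++;  // one unit of inner-loop work
                    }
                }
                long closedForm = (long) length * (length - 1) / 2;
                System.out.println(length + ": counted=" + count
                        + ", formula=" + closedForm);  // always equal
            }
        }
    }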
Let T = the number of times the inner loop runs.
About half the time, when i < array.length/2, it runs at least array.length/2 times. So, for about array.length/2 outer iterations, the inner loop runs at least array.length/2 times, therefore:
T >= (N/2)*(N/2), i.e., T >= N²/4
That lower bound is proportional to N². Also, though, for all array.length outer iterations, the inner loop runs at most array.length times, so:
T <= N*N, i.e., T <= N²
That upper bound is also proportional to N². Since we have upper and lower bounds on the time that are both proportional to N², we know that T is in O(N²).
NOTE: Technically we only need the upper bound to show that T is in O(N²), but we're usually looking for bounds as tight as we can get. T is actually in Θ(N²).
NOTE ALSO: there's nothing special about using halves above -- any constant proportion will do. These are the general rules in play:
Lower bound: If you do at least Ω(N) work at least Ω(N) times, you are doing Ω(N²) work. Ω(N)*Ω(N) = Ω(N²)
Upper bound: If you do at most O(N) work at most O(N) times, you are doing O(N²) work. O(N)*O(N) = O(N²)
And since we have both, we can use: Ω(N²) ∩ O(N²) = Θ(N²)
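As a quick empirical check of the sandwich argument, here is a Java sketch (again assuming the loop shape from the first answer) that counts T and verifies N²/4 <= T <= N² for a range of sizes:

    // Verify the lower and upper bounds on T for the assumed loop shape.
    class BoundsCheck {
        public static void main(String[] args) {
            for (int n = 2; n <= 300; n++) {  // "about half" is rough below n = 2
                long t = 0;
                for (int i = 1; i <= n; i++) {
                    for (int j = i; j < n; j++) {
                        t++;  // count every inner iteration
                    }
                }
                long lower = (long) n * n / 4;   // N^2/4 (floored)
                long upper = (long) n * n;       // N^2
                if (t < lower || t > upper) {
                    System.out.println("bound violated at n=" + n);
                    return;
                }
            }
            System.out.println("N^2/4 <= T <= N^2 held for every tested N");
        }
    }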