So the answer to the question "What is the difference between Θ(n) and O(n)?"
states that "Basically when we say an algorithm is of O(n), it's also O(n^2), O(n^1000000), O(2^n), ... but a Θ(n) algorithm is not Θ(n^2)."
I understand Big O to represent an upper bound or worst case; with that in mind, I don't understand how O(n) is also O(n^2) and the other cases worse than O(n).
Perhaps I have some fundamental misunderstanding. Please help me understand this, as I have been struggling with it for a while.
Thanks.
It's helpful to think of what big-O means: if a function is O(n), then c*n, where c is some positive constant, is an upper bound. If c*n is an upper bound, it's clear that for positive integers n, c*n^2 is also an upper bound, as are c*n^3, c*n^4, c*n^1000, and so on.
The graph in the original answer shows the growth of several functions; each is an upper bound of the functions "to the right" of it, i.e., it grows faster even for small n.
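The graph isn't reproduced here, but a small Python sketch (my own illustration; the particular functions and values of n are arbitrary choices) can tabulate the same comparison:

    # Tabulate the growth of a few common bounding functions (illustrative only).
    def growth_table(ns=(1, 2, 4, 8, 16, 32)):
        funcs = [("n",   lambda n: n),
                 ("n^2", lambda n: n ** 2),
                 ("n^3", lambda n: n ** 3),
                 ("2^n", lambda n: 2 ** n)]
        print("".join(name.rjust(12) for name, _ in funcs))
        for n in ns:
            print("".join(str(f(n)).rjust(12) for _, f in funcs))

    growth_table()

Each column eventually dominates the columns to its left, which is the asymptotic sense in which the faster-growing functions can serve as upper bounds (possibly after scaling by a constant).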
Suppose the running time of your algorithm is T(n) = 3n + 6 (an arbitrary polynomial of degree 1, chosen as an example).
It's true that T(n) = O(n), because 3n + 6 < 4n for all n > 6 (to use the definition of big-O notation). It's also true that T(n) = O(n^2), because 3n + 6 < n^2 for all n > 5 (to use the definition again).
It's also true that T(n) = Θ(n) because, in addition to the proof that it is O(n), it is true that 3n + 6 > n for all n > 1. However, you cannot have 3n + 6 > c*n^2 for any constant c > 0 and arbitrarily large n. (Proof sketch: c*n^2 - 3n - 6 -> infinity as n -> infinity, so it is eventually positive.)
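As a quick numeric sanity check of those inequalities (a sketch over a finite range of n, not a proof):

    def T(n):
        return 3 * n + 6

    # The constants and thresholds used in the argument above.
    assert all(T(n) <= 4 * n for n in range(7, 10_000))   # witnesses T(n) = O(n)
    assert all(T(n) <= n * n for n in range(5, 10_000))   # witnesses T(n) = O(n^2)
    assert all(T(n) >= n     for n in range(0, 10_000))   # witnesses T(n) = Omega(n)

    # But no constant c > 0 keeps c*n^2 below T(n) forever; even a tiny c wins eventually.
    c = 0.001
    print(next(n for n in range(1, 10**7) if c * n * n > T(n)))  # prints the finite crossover point

The last line is why T(n) is not Ω(n^2): whatever constant c you pick, c*n^2 overtakes 3n + 6 at some finite n, so T(n) is Θ(n) but not Θ(n^2).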
"I understand Big O to represent an upper bound or worst case; with that, I don't understand how O(n) is also O(n^2) and the other cases worse than O(n)."
Intuitively, an "upper bound of x
" means that something will always be less than or equal to x. If something is less than or equal to x
, it is also less than or equal to x^2
and x^1000
, for large enough values of x
. So x^2
and x^1000
can also be upper bounds.
This is what Big-oh represents: upper bounds.
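Written out as a worked chain of inequalities (a sketch, assuming a running time T(n) that is already bounded by c*n from some threshold n_0 onward):

    T(n) \;\le\; c\,n \;\le\; c\,n^2 \;\le\; c\,n^{1000} \qquad \text{for all } n \ge \max(n_0, 1)

so the same constant c witnesses T(n) = O(n), T(n) = O(n^2), and T(n) = O(n^1000).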
When we say that f(n) = O(g(n)), we mean only that there exists a constant c such that, for all sufficiently large n, f(n) <= c*g(n). Note that if f(n) = O(g(n)), we can always choose a function h(n) bigger than g(n), and since g(n) is eventually no greater than h(n), we have f(n) <= c*g(n) <= c*h(n), so f(n) = O(h(n)) as well.
Note that a big-O bound need not be tight. The Θ bound is the intersection of O(g(n)) and Ω(g(n)), where Ω gives a lower bound (it's like O, the upper bound, but bounds from below instead). If f(n) is bounded below by g(n), and h(n) is bigger than g(n), it does not follow that f(n) is (necessarily) bounded below by h(n).
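For reference, here is a standard statement of the definitions being used, with the quantifier order made explicit (the usual textbook formulation, not a quote from the answer):

    f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 :\ f(n) \le c\,g(n) \ \text{for all } n \ge n_0
    f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 :\ f(n) \ge c\,g(n) \ \text{for all } n \ge n_0
    f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))

Under these definitions, 3n + 6 = Θ(n) but 3n + 6 ≠ Θ(n^2), because the Ω(n^2) half fails.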
Source: https://stackoverflow.com/questions/32312987/how-can-an-algorithm-that-is-on-also-be-on2-on1000000-o2n