Question
So, clearly, log(n) is O(n). But what about (log(n))^2? What about sqrt(n) versus log(n) -- which bounds which?
There's a family of comparisons like this:
n^a versus (log(n))^b
I run into these comparisons a lot, and I've never come up with a good way to solve them. Hints for tactics for solving the general case?
Thanks,
Ian
EDIT: I'm not talking about the computational complexity of calculating the values of these functions. I'm talking about the functions themselves. E.g., f(n) = n is an upper bound on g(n) = log(n) because g(n) <= c*f(n) for c = 1 and all n >= n0 = 1.
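The big-O witness above can be checked empirically with a quick sketch (a finite spot-check, not a proof):

```python
import math

# Spot-check the witness from the question:
# g(n) = log(n) <= c * f(n) = n, with c = 1, for all n >= n0 = 1.
c, n0 = 1, 1
assert all(math.log(n) <= c * n for n in range(n0, 10_000))
```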
Answer 1:
log(n)^a is always O(n^b), for any positive constants a, b.
Are you looking for a proof? All such problems can be reduced to seeing that log(n) is O(n), by the following trick:
log(n)^a = O(n^b) is equivalent to: log(n) = O(n^{b/a}), since raising to the 1/a power is an increasing function. This is equivalent to log(m^{a/b}) = O(m), by setting m = n^{b/a}. This is equivalent to log(m) = O(m), since log(m^{a/b}) = (a/b)*log(m).
You can prove that log(n) = O(n) by induction, focusing on the case where n is a power of 2.
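A numerical illustration of the claim (for a couple of arbitrary exponents a, b > 0, chosen here only as examples): the ratio log(n)^a / n^b shrinks toward 0 as n grows, which is what log(n)^a = O(n^b) predicts.

```python
import math

# Illustrative exponents; any positive a, b behave the same way.
a, b = 3, 0.5
# The ratio log(n)^a / n^b should decrease as n grows.
ratios = [math.log(n) ** a / n ** b for n in (10**2, 10**6, 10**12)]
assert ratios[0] > ratios[1] > ratios[2]
```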
Answer 2:
I run into these comparisons a lot (...) Hints for tactics for solving the general case?
Since you ask about the general case and run into such questions a lot, here is what I recommend:
Use the limit test for big-O notation: if the limit of f(n)/g(n) as n approaches +inf exists and is finite, then f(n) = O(g(n)).
You can compute such limits with a Computer Algebra System, for example the open-source Maxima; see the Maxima documentation on limits.
For more detailed info and an example, check out THIS answer.
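The same limit test can be sketched in Python with SymPy (a Python CAS, standing in here for the Maxima workflow the answer suggests):

```python
from sympy import symbols, log, sqrt, limit, oo

n = symbols('n', positive=True)

# If limit f(n)/g(n) as n -> oo is finite, then f(n) = O(g(n)).
assert limit(log(n) / sqrt(n), n, oo) == 0       # log(n) = O(sqrt(n))
assert limit(log(n)**2 / sqrt(n), n, oo) == 0    # even log(n)^2 = O(sqrt(n))
assert limit(sqrt(n) / log(n), n, oo) == oo      # sqrt(n) is not O(log(n))
```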
Answer 3:
log n -- O(log n)
sqrt n -- O(sqrt n)
n^2 -- O(n^2)
(log n)^2 -- O((log n)^2)
n^a versus (log(n))^b
You need either the bases or the powers to be the same. So use your math to rewrite n^a as log(n)^(whatever exponent it takes to get that base), or as (whatever base it takes to get that power)^b. There is no general case.
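One way to sketch the "same base" rewrite: express both sides as exp(...) and compare the exponents. Since n^a = exp(a*log(n)) and log(n)^b = exp(b*log(log(n))), and log(n) outgrows log(log(n)) times any constant, n^a eventually dominates (the exponents a, b and the sample n below are arbitrary illustrations):

```python
import math

# Rewrite both functions with the common base e and compare exponents:
#   n^a       = exp(a * log(n))
#   log(n)^b  = exp(b * log(log(n)))
a, b = 0.5, 2          # arbitrary illustrative exponents
n = 10 ** 50           # one large sample point
lhs_exponent = a * math.log(n)             # exponent of n^a
rhs_exponent = b * math.log(math.log(n))   # exponent of log(n)^b
assert lhs_exponent > rhs_exponent         # so n^a > log(n)^b here
```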
Source: https://stackoverflow.com/questions/7882915/asymptotic-complexity-of-logarithms-and-powers