Asymptotic complexity for typical expressions


Question


The increasing order of the following functions in terms of asymptotic complexity is (the picture from the original post is not reproduced here; judging from the answers below, the functions are f1(n) = (n^0.9999)(log n), f2(n) = n, f3(n) = 1.00001^n, f4(n) = n^2):

(A) f1(n); f4(n); f2(n); f3(n)

(B) f1(n); f2(n); f3(n); f4(n)

(C) f2(n); f1(n); f4(n); f3(n)

(D) f1(n); f2(n); f4(n); f3(n)

a) The time complexity order for this easy question was given as (n^0.99)*(log n) < n. How? log may be a slow-growing function, but it still grows faster than a constant.

b) Consider function f1: suppose instead it were f1(n) = (n^1.0001)(log n). What would the answer be then?

Whenever an expression involves a product of a logarithmic and a polynomial factor, does the logarithmic factor ever outweigh the polynomial one?

c) How does one decide in cases such as:

1) (n^2)(log n) vs. (n^1.5): which has higher time complexity?
2) (n^1.5)(log n) vs. (n^2): which has higher time complexity?


Answer 1:


If we consider constants C_1 and C_2 such that C_1 < C_2, then we can say the following with certainty:

(n^C_2)*log(n) grows faster than (n^C_1)

This is because (n^C_1) grows slower than (n^C_2) (obviously).

Also, for values of n larger than 2 (with log in base 2), log(n) is greater
than 1.

In fact, log(n) is asymptotically greater than any constant C,
because log(n) -> inf as n -> inf.

If (n^C_2) is asymptotically greater than (n^C_1) AND log(n) is asymptotically
greater than 1, then we can certainly say that
(n^2)*log(n) has greater complexity than (n^1.5)

We think of log(n) as a "slowly growing" function, but it still eventually exceeds any constant, which is the key here.
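To make part (c) concrete, here is a quick numerical sanity check (a sketch of my own, not from the original answer; the sample values of n are arbitrary). The first ratio, (n^2 log n) / n^1.5 = sqrt(n)*log(n), grows without bound, while the second, (n^1.5 log n) / n^2 = log(n)/sqrt(n), shrinks toward 0:

import math

# Ratio 1 grows without bound: (n^2)log(n) dominates n^1.5.
# Ratio 2 shrinks toward zero: n^2 dominates (n^1.5)log(n).
for n in [10**2, 10**4, 10**6, 10**8]:
    r1 = (n**2 * math.log(n)) / n**1.5
    r2 = (n**1.5 * math.log(n)) / n**2
    print("n=%9d: ratio1 = %12.1f   ratio2 = %.6f" % (n, r1, r2))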


coder101 asked an interesting question in the comments, essentially,

is n^e = Ω((n^c)*log_d(n))?
where e = c + ϵ for arbitrarily small ϵ

Let's do some algebra.

n^e = (n^c)*(n^ϵ)
so the question boils down to
is n^ϵ = Ω(log_d(n))
or is it the other way around, namely:
is log_d(n) = Ω(n^ϵ)

In order to do this, let us find the value of ϵ that satisfies n^ϵ > log_d(n).

n^ϵ > log_d(n)
ϵ*ln(n) > ln(log_d(n))
ϵ > ln(log_d(n)) / ln(n)

Because we know for a fact that

c * ln(n) > ln(ln(n))        (1)
for any constant c > 0, once n is large enough

We can say that, for an arbitrarily small ϵ, there exists an n large enough to 
satisfy ϵ > ln(log_d(n)) / ln(n)

because, by (1), ln(log_d(n)) / ln(n) -> 0 as n -> infinity.
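One way to verify (1), and hence this limit, is a standard L'Hopital computation (my addition, in the notation used above): differentiating numerator and denominator,

lim ln(ln(n)) / ln(n) = lim (1/(n*ln(n))) / (1/n) = lim 1/ln(n) = 0
(all limits as n -> infinity)

and since ln(log_d(n)) = ln(ln(n)) - ln(ln(d)) differs from ln(ln(n)) only by a constant, the same limit holds for ln(log_d(n)) / ln(n).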

With this knowledge, we can say that

n^ϵ = Ω(log_d(n))
for arbitrarily small ϵ

which means that
n^(c + ϵ) = Ω((n^c)*log_d(n))
for arbitrarily small ϵ.

In layperson's terms:

n^1.1 > n * ln(n)
for all sufficiently large n

also

n^1.001 > n * ln(n)
for all n beyond a much, much bigger threshold

and even

n^1.0000000000000001 > n * ln(n)
for all n beyond a truly enormous threshold.
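To see how quickly those thresholds blow up, here is a small sketch (my own illustration; it works with t = ln(n), so the huge crossover points stay representable as floats). It finds roughly where n^ϵ overtakes ln(n), which is exactly where n^(1+ϵ) overtakes n * ln(n):

import math

# Taking logs and substituting t = ln(n), the inequality n^eps > ln(n)
# becomes eps*t > ln(t). Find roughly the smallest such t; the crossover
# is then near n ~ e^t, reported below as a power of ten.
def crossover(eps):
    t = 2.0
    while eps * t <= math.log(t):
        t *= 1.001  # small multiplicative steps; a rough root suffices
    return t

for eps in [0.1, 0.01, 0.001]:
    t = crossover(eps)
    print("eps=%g: n^eps > ln(n) once n ~ 10^%.0f" % (eps, t / math.log(10)))

For ϵ = 0.001 this reports a crossover around n ~ 10^3960, which is the "truly enormous" n above.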



Answer 2:


Replacing f1 = (n^0.9999)(log n) by f1 = (n^1.0001)(log n) would yield answer (C): n, (n^1.0001)(log n), n^2, 1.00001^n.

The reasoning is as follows:

- (n^1.0001)(log n) has higher complexity than n: obvious.

- n^2 is higher than (n^1.0001)(log n) because the polynomial part asymptotically dominates the logarithmic part, so the higher-degree polynomial n^2 wins.

- 1.00001^n dominates n^2 because 1.00001^n grows exponentially while n^2 grows polynomially, and exponential growth asymptotically wins (see the sketch after this list).
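The full ordering can also be checked numerically. A small sketch (my own, with arbitrary sample points) compares ln(f(n)) for each candidate, so that 1.00001^n never has to be evaluated directly and cannot overflow:

import math

# Compare ln(f(n)) for each function; the ordering of the logarithms
# is the ordering of the functions themselves.
funcs = {
    "n": lambda n: math.log(n),
    "(n^1.0001)(log n)": lambda n: 1.0001 * math.log(n) + math.log(math.log(n)),
    "n^2": lambda n: 2 * math.log(n),
    "1.00001^n": lambda n: n * math.log(1.00001),
}

for n in [10**4, 10**6, 10**8]:
    order = sorted(funcs, key=lambda name: funcs[name](n))
    print("n=10^%d: %s" % (round(math.log10(n)), " < ".join(order)))

Note that the asymptotic ordering only emerges around n ~ 10^7: below that, 1.00001^n is still the smallest of the four.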

BTW, 1.00001^n looks a little similar to a family sometimes called "sub-exponential" growth, usually denoted (1+ε)^n (strictly speaking, it is still exponential). However small ε is, this growth still dominates any polynomial growth.



Source: https://stackoverflow.com/questions/30811753/asymptotic-complexity-for-typical-expressions
