I have started learning algorithms. I understand how to find a Θ-bound for a 'regular' recurrence like T(n) = T(f(n)) + g(n), but I am lost with this recurrence (problem 1-2e):
T(n) = T(√n) + Θ(lg lg n)
How do I choose a method for finding the Θ-bound? And what kind of recurrence is this, anyway? I just do not quite understand the notation-inside-a-recurrence thing.
One trick that might be useful is to transform n into something else, say, n = 2^k. If we do this, you can rewrite the above as

T(2^k) = T(2^(k/2)) + Θ(log log 2^k)
       = T(2^(k/2)) + Θ(log k)
Now this looks like a recurrence that we might actually be able to solve, since we can expand this out as
T(2^k) = T(2^(k/2)) + log k = T(2^(k/4)) + log (k/2) + log k
If we expand this out i times, we get

T(2^k) = T(2^(k/2^i)) + log k + log (k/2) + log (k/4) + ... + log (k/2^(i-1))
This recurrence terminates when 2^(k/2^i) ≤ 2 (say, at which point we reach a base case), which happens when

2^(k/2^i) = 2
k / 2^i = 1
k = 2^i
i = lg k
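A quick sanity check on that iteration count: repeatedly taking square roots of n = 2^k should take exactly lg k = lg lg n steps to drive n down to 2. A minimal sketch in Python (the function name is mine):

```python
import math

def sqrt_iterations(n):
    """Repeatedly take square roots until n drops to 2 or below; return the step count."""
    count = 0
    while n > 2:
        n = math.sqrt(n)
        count += 1
    return count

for k in (8, 16, 32, 64):        # n = 2^k, so we expect lg k iterations
    n = 2.0 ** k
    print(k, sqrt_iterations(n), math.log2(k))
```

For n = 2^8 this gives 3 iterations (256 → 16 → 4 → 2), which is exactly lg 8.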
In other words, if we write n = 2^k, then the net result is

T(n) = lg k + lg (k/2) + lg (k/4) + lg (k/8) + ... + 1
     = lg k + (lg k − 1) + (lg k − 2) + (lg k − 3) + ... + 1
     = Θ((lg k)^2)
And since we know that n = 2^k, this means that k = Θ(log n), so substituting this in we get that T(n) = Θ((log log n)^2).

The key trick here was rewriting n as 2^k. The rest is standard technique.
So does this make sense? Well, if you think about it, log log n is, among other things, the number of bits required to write out log n. At each iteration, you take the square root of the number, which halves the number of bits in its representation, and therefore decreases by one the number of bits required to write out that bit count. Consequently, the first iteration contributes log log n, the second (log log n) − 1, the third (log log n) − 2, etc. Overall, this summation is Θ((log log n)^2), which matches the intuition.
Hope this helps!
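To back the argument above with numbers, here is a small sketch that evaluates the recurrence directly (with an assumed base case T(n) = 1 for n ≤ 2) and compares it against (lg lg n)^2:

```python
import math

def T(n):
    """T(n) = T(sqrt(n)) + lg lg n, with an assumed base case T(n) = 1 for n <= 2."""
    if n <= 2:
        return 1.0
    return T(math.sqrt(n)) + math.log2(math.log2(n))

for k in (64, 256, 1000):        # n = 2^k, so lg lg n = lg k
    n = 2.0 ** k
    ratio = T(n) / math.log2(math.log2(n)) ** 2
    print(f"n = 2^{k}: T(n) / (lg lg n)^2 = {ratio:.3f}")
```

The ratio hovers near a constant (drifting toward 1/2, since lg k + (lg k − 1) + ... + 1 is about (lg k)^2 / 2), which is consistent with the Θ((lg lg n)^2) bound.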
Here is how to solve it using math. I will be using ln ln(n) instead of Θ(ln ln(n)); this is mostly to reduce the length of the formulae, and you can do absolutely the same with big-Θ. So:

T(n) = T(n^(1/2)) + ln ln n

Which means that, unrolling the recurrence k times:

T(n) = T(n^(1/2^k)) + Σ_{i=0}^{k−1} ln ln(n^(1/2^i))

Now, to transform this big summation, notice that

ln ln(n^(1/2^i)) = ln((1/2^i) · ln n) = ln ln n − i · ln 2

So the whole ln ln(n) sum can be transformed as:

Σ_{i=0}^{k−1} (ln ln n − i · ln 2) = k · ln ln n − ln 2 · k(k − 1)/2

And our only problem is to find some connection between n and k, which can be easily derived from the last T(...) term. To do this we have to find a reasonable bound condition for that term. This can be done by trying a couple of small integers like 0, 1, 2. With 2 you have:

n^(1/2^k) = 2   ⟹   ln n = 2^k · ln 2   ⟹   k = lg(ln n / ln 2) = Θ(log log n)

Substituting this k into our previous equation you will see that the biggest term is:

k · ln ln n = Θ((log log n)^2)

and therefore the complexity is:

T(n) = Θ((log log n)^2)
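The expansion this answer describes can be checked numerically: summing ln ln(n^(1/2^i)) directly should match the closed form k · ln ln n − ln 2 · k(k − 1)/2, with k ≈ lg lg n coming from the bound condition n^(1/2^k) = 2. A sketch (the value n = 1e30 is an arbitrary example):

```python
import math

n = 1e30
# bound condition n**(1/2**k) = 2  =>  k = lg(ln n / ln 2)
k = round(math.log2(math.log(n) / math.log(2)))

# direct summation of the unrolled terms ln ln(n**(1/2**i))
direct = sum(math.log(math.log(n ** (1.0 / 2**i))) for i in range(k))

# closed form: k * lnln(n) - ln2 * k(k-1)/2; the dominant
# term k * lnln(n) is Theta((log log n)^2)
closed = k * math.log(math.log(n)) - math.log(2) * k * (k - 1) / 2

print(f"direct = {direct:.6f}, closed = {closed:.6f}")
```

The two values agree up to floating-point error, since ln ln(n^(1/2^i)) = ln ln n − i · ln 2 term by term.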
P.S. you can see a solution to a similar recurrence here
Source: https://stackoverflow.com/questions/11149168/solve-recurrence-tn-tn1-2-%ce%98lg-lg-n