How can I calculate the time complexity of a recursive algorithm?
int pow1(int x, int n) {
    if (n == 0) {
        return 1;                        /* base case: x^0 = 1 */
    }
    else {
        return x * pow1(x, n - 1);       /* recurse with the exponent reduced by 1 */
    }
}
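The answers below also analyse a second function, pow2, which is not shown in the question as posted. A hypothetical version consistent with the recurrence derived below (halve the exponent, square the base) could look like this:

int pow2(int x, int n) {
    if (n == 0) {
        return 1;                              /* base case: x^0 = 1 */
    }
    else if (n % 2 == 0) {
        return pow2(x * x, n / 2);             /* even exponent: x^n = (x^2)^(n/2) */
    }
    else {
        return x * pow2(x * x, (n - 1) / 2);   /* odd exponent: x^n = x * (x^2)^((n-1)/2) */
    }
}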
So I'm guessing you're raising x to the power n. pow1 takes O(n).
You never change the value of x, but you subtract 1 from n each time until it reaches 0 (and then you just return 1). This means you will make n recursive calls.
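For example, tracing pow1(2, 4):

pow1(2, 4)
= 2 * pow1(2, 3)
= 2 * 2 * pow1(2, 2)
= 2 * 2 * 2 * pow1(2, 1)
= 2 * 2 * 2 * 2 * pow1(2, 0)
= 2 * 2 * 2 * 2 * 1 = 16

That's n = 4 recursive calls, each doing a constant amount of work.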
Analyzing recursive functions (or even evaluating them) is a nontrivial task. A good introduction (in my opinion) can be found in Don Knuth's Concrete Mathematics.
However, let's analyse these examples now:
We define a function that gives us the time needed by a function. Let's say that t(n) denotes the time needed by pow(x,n), i.e. a function of n.
Then we can conclude that t(0) = c, because if we call pow(x,0), we have to check whether (n == 0), and then return 1, which can be done in constant time (hence the constant c).
Now we consider the other case: n > 0. Here we obtain t(n) = d + t(n-1). That's because we again have to check whether n == 0, compute pow(x, n-1) (hence t(n-1)), and multiply the result by x. Checking and multiplying can be done in constant time (constant d); the recursive calculation of pow needs t(n-1).
Now we can "expand" the term t(n)
:
t(n) =
d + t(n-1) =
d + (d + t(n-2)) =
d + d + t(n-2) =
d + d + d + t(n-3) =
... =
d + d + d + ... + t(1) =
d + d + d + ... + c
So, how long does it take until we reach t(0)? Since we start at t(n) and subtract 1 in each step, it takes n steps to reach t(n-n) = t(0). That, on the other hand, means that we pick up the constant d a total of n times, and t(0) evaluates to c.
So we obtain:
t(n) = d + d + ... + d + c
     = n * d + c
So we get that t(n) = n * d + c, which is in O(n).
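If you want to check this empirically, a small instrumented version of pow1 (a sketch with a hypothetical call counter, not part of the original code) makes the linear growth visible:

#include <stdio.h>

static long calls = 0;                    /* counts invocations of pow1_counted */

int pow1_counted(int x, int n) {
    calls++;                              /* the constant-time work done per call */
    if (n == 0)
        return 1;
    return x * pow1_counted(x, n - 1);
}

int main(void) {
    for (int n = 1; n <= 64; n *= 2) {
        calls = 0;
        pow1_counted(1, n);               /* base 1 avoids overflow; we only care about the count */
        printf("n = %2d -> %ld calls\n", n, calls);   /* prints n + 1: one call per decrement plus the base case */
    }
    return 0;
}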
pow2 can be analysed using the master theorem, since we can assume that the time functions of algorithms are monotonically increasing. Now we have the time t(n) needed for the computation of pow2(x,n):
t(0) = c   (constant time needed for the computation of pow2(x,0))
For n > 0 we get:
       / t((n-1)/2) + d   if n is odd   (d is constant cost)
t(n) = <
       \ t(n/2) + d       if n is even  (d is constant cost)
The above can be "simplified" to:
t(n) = t(floor(n/2)) + d <= t(n/2) + d   (since t is monotonically increasing)
So we obtain t(n) <= t(n/2) + d, which can be solved using the master theorem to give t(n) = O(log n) (see the section "Application to Popular Algorithms" in the Wikipedia article on the master theorem, example "Binary Search").
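Spelled out, with the recurrence written in the standard form t(n) = a*t(n/b) + f(n):

a = 1, b = 2, f(n) = d = Θ(1) = Θ(n^(log_2 1)) = Θ(n^0)

so the second ("equal") case of the master theorem applies, and

t(n) = Θ(n^(log_2 1) * log n) = Θ(log n).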
Let's just start with pow1, because that's the simplest one.
You have a function where a single run is done in O(1). (Condition checking, returning, and multiplication are constant time.)
What you have left is then your recursion. What you need to do is analyze how often the function would end up calling itself. In pow1, it'll happen N times. N*O(1)=O(N).
For pow2, it's the same principle - a single run of the function runs in O(1). However, this time you're halving N every time. That means it will run log2(N) times - effectively once per bit. log2(N)*O(1)=O(log(N)).
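For instance, assuming pow2 halves n (rounding down) on each call, as the recurrence above says, for n = 16 the exponent shrinks as

16 -> 8 -> 4 -> 2 -> 1 -> 0

i.e. 5 recursive calls, one per bit of 16 (10000 in binary), which is log2(16) + 1.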
Something which might help you is to exploit the fact that recursion can always be expressed as iteration (not always very simply, but it's possible). We can express pow1 as:
result = 1;
while (n != 0)
{
    result = result * x;   /* multiply the accumulator by the base x */
    n = n - 1;
}
Now you have an iterative algorithm instead, and you might find it easier to analyze it that way.
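In the same spirit, pow2 can be written iteratively as binary exponentiation (a sketch of one common formulation; the asker's pow2 may differ in details, but the halving structure is the same):

result = 1;
while (n != 0)
{
    if (n % 2 == 1)
        result = result * x;   /* odd exponent: fold one factor of x into the result */
    x = x * x;                 /* square the base */
    n = n / 2;                 /* halve the exponent (integer division) */
}

The loop body runs roughly log2(n) + 1 times, once per bit of n, which again gives O(log n).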
The complexity of both functions, ignoring recursion, is O(1).
For the first algorithm, pow1(x, n), the complexity is O(n) because the depth of recursion grows linearly with n.
For the second, the complexity is O(log n): here we recurse approximately log2(n) times, and dropping the constant base 2 we get log n.
It can be a bit complex, but I think the usual way is to use Master's theorem.