327
votes

I have a computer science midterm tomorrow and I need help determining the complexity of these recursive functions. I know how to solve simple cases, but I am still trying to learn how to solve these harder cases. These were just a few of the example problems that I could not figure out. Any help would be much appreciated and would greatly help in my studies. Thank you!

int recursiveFun1(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun1(n-1);
}

int recursiveFun2(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun2(n-5);
}

int recursiveFun3(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun3(n/5);
}

void recursiveFun4(int n, int m, int o)
{
    if (n <= 0)
    {
        printf("%d, %d\n",m, o);
    }
    else
    {
        recursiveFun4(n-1, m+1, o);
        recursiveFun4(n-1, m, o+1);
    }
}

int recursiveFun5(int n)
{
    for (int i = 0; i < n; i += 2) {
        // do something
    }

    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n-5);
}
If you don't want to go through the analysis every time, there is a black-box technique called the Master method, but it assumes that all recursive calls split the input into subproblems of equal size. – Vivek Krishna
To describe 5: the levels of the recursion do roughly n/2, (n-5)/2, (n-10)/2, ... work, and the height of the tree is about n/5. Summing the levels gives on the order of (n/5) * (n/2), so the bound on the input n is O(n^2), in the worst, best and average case. – Anthony Moon Beam Toorie

6 Answers

441
votes

The time complexity, in Big O notation, for each function:


int recursiveFun1(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun1(n-1);
}

This function is called recursively n times before reaching the base case, so it's O(n), often called linear.
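
If you want to verify this empirically, here is a minimal sketch (the global call counter and the driver in main are my additions, not part of the original code) that counts how many times the function is invoked:

#include <stdio.h>

static long calls = 0;  /* hypothetical instrumentation, not in the original code */

int recursiveFun1(int n)
{
    calls++;
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun1(n-1);
}

int main(void)
{
    for (int n = 10; n <= 1000; n *= 10) {
        calls = 0;
        recursiveFun1(n);
        printf("n = %4d -> %ld calls\n", n, calls);  /* prints n + 1: linear growth */
    }
    return 0;
}

Dropping the same counter into recursiveFun2 shows roughly n/5 calls instead, which is still linear.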


int recursiveFun2(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun2(n-5);
}

This function subtracts 5 from n before each recursive call, so it is actually called on the order of n/5 times before reaching the base case; since O(n/5) = O(n), it is still O(n).


int recursiveFun3(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun3(n/5);
}

This function divides n by 5 before every recursive call, so it is called log(n) base 5 times; that's O(log(n)), often called logarithmic. (Big O notation and complexity analysis most often use base 2, but the base doesn't matter, since logarithms of different bases differ only by a constant factor.)
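
Again, as a rough empirical check (my own driver, not from the question; on most compilers you'll need to link with -lm for log()), you can compare the number of calls with log(n)/log(5):

#include <math.h>
#include <stdio.h>

static long calls = 0;  /* added only for measurement */

int recursiveFun3(int n)
{
    calls++;
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun3(n/5);
}

int main(void)
{
    for (int n = 10; n <= 1000000; n *= 10) {
        calls = 0;
        recursiveFun3(n);
        printf("n = %7d -> %2ld calls, log5(n) ~ %.1f\n", n, calls, log(n) / log(5.0));
    }
    return 0;
}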


void recursiveFun4(int n, int m, int o)
{
    if (n <= 0)
    {
        printf("%d, %d\n",m, o);
    }
    else
    {
        recursiveFun4(n-1, m+1, o);
        recursiveFun4(n-1, m, o+1);
    }
}

Here, it's O(2^n), or exponential, since each call makes two further calls until n has been decremented to 0, producing a binary call tree of depth n.
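
A quick way to see the doubling (the call counter below is my addition, and the printf from the original base case is omitted to keep the output short):

#include <stdio.h>

static long calls = 0;  /* counts every invocation; added only for illustration */

void recursiveFun4(int n, int m, int o)
{
    calls++;
    if (n <= 0)
    {
        /* the original printf("%d, %d\n", m, o) is omitted to keep the output short */
    }
    else
    {
        recursiveFun4(n - 1, m + 1, o);
        recursiveFun4(n - 1, m, o + 1);
    }
}

int main(void)
{
    for (int n = 1; n <= 20; n++)
    {
        calls = 0;
        recursiveFun4(n, 0, 0);
        printf("n = %2d -> %ld calls\n", n, calls);
    }
    return 0;
}

The count printed is 2^(n+1) - 1, which doubles every time n increases by 1.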



int recursiveFun5(int n)
{
    for (int i = 0; i < n; i += 2) {
        // do something
    }

    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n-5);
}

And here the for loop does about n/2 iterations, since we increase the counter by 2, and the recursion makes about n/5 calls. Because the loop runs once per recursive call, the total work is roughly

(n/5) * (n/2) = n^2/10,

and since Big O describes the asymptotic upper bound and keeps only the dominant term while dropping constant factors, this is O(n^2).
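
To see the quadratic growth directly, here is a small sketch (the iteration counter is my addition) that counts every loop iteration across all recursive calls:

#include <stdio.h>

static long iterations = 0;  /* total loop iterations across all calls; added for illustration */

int recursiveFun5(int n)
{
    for (int i = 0; i < n; i += 2) {
        iterations++;        /* stands in for the "do something" work */
    }

    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n-5);
}

int main(void)
{
    for (int n = 100; n <= 10000; n *= 10) {
        iterations = 0;
        recursiveFun5(n);
        printf("n = %5d -> %ld iterations\n", n, iterations);  /* grows ~100x when n grows 10x */
    }
    return 0;
}

The measured total is closer to n^2/20, because the loop shrinks on each successive call, but the constant factor does not change the O(n^2) bound.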


Good luck on your midterms ;)

138
votes

For the case where n <= 0, T(n) = O(1). Therefore, the time complexity is determined by what happens when n > 0.

We will consider the case n > 0 in the parts below.

1.

T(n) = a + T(n - 1)

where a is some constant.

By induction:

T(n) = n * a + T(0) = n * a + b = O(n)

where a and b are some constants.

2.

T(n) = a + T(n - 5)

where a is some constant.

By induction:

T(n) = ceil(n / 5) * a + T(k) = ceil(n / 5) * a + b = O(n)

where a and b are some constants and k <= 0.

3.

T(n) = a + T(n / 5)

where a is some constant.

By induction:

T(n) = a * log5(n) + T(0) = a * log5(n) + b = O(log n)

where a and b are some constants.

4.

T(n) = a + 2 * T(n - 1)

where a is some constant.

By induction:

T(n) = a + 2a + 4a + ... + 2^(n-1) * a + T(0) * 2^n 
     = a * 2^n - a + b * 2^n
     = (a + b) * 2^n - a
     = O(2 ^ n)

where a and b are some constants.

5.

T(n) = n / 2 + T(n - 5)

where the n / 2 term is the cost of the for loop (up to a constant factor).

Rewrite n = 5q + r, where q and r are integers and r is one of 0, 1, 2, 3, 4.

T(5q + r) = (5q + r) / 2 + T(5 * (q - 1) + r)

We have q = (n - r) / 5, and since r < 5, we can consider it a constant, so q = O(n)

By induction:

T(n) = T(5q + r)
     = (5q + r) / 2 + (5 * (q - 1) + r) / 2 + ... + r / 2 +  T(r)
     = 5 / 2 * (q + (q - 1) + ... + 1) +  1 / 2 * (q + 1) * r + T(r)
     = 5 / 4 * (q + 1) * q + 1 / 2 * (q + 1) * r + T(r)
     = 5 / 4 * q^2 + 5 / 4 * q + 1 / 2 * q * r + 1 / 2 * r + T(r)

Since r <= 4, we can find some constant b such that b >= T(r).

T(n) = T(5q + r)
     = 5 / 4 * q^2 + (5 / 4 + 1 / 2 * r) * q + 1 / 2 * r + b
     = 5 / 4 * O(n ^ 2) + (5 / 4 + 1 / 2 * r) * O(n) + 1 / 2 * r + b
     = O(n ^ 2)
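
If you want to double-check the algebra numerically, here is a small sketch (entirely my own, with b = 1 chosen arbitrarily for the base case) that evaluates the recurrence directly and compares it with the leading 5/4 * q^2 term:

#include <stdio.h>

/* Evaluate T(n) = n/2 + T(n - 5) directly, with T(n) = 1 for n <= 0 (i.e. b = 1). */
double T(int n)
{
    if (n <= 0)
        return 1.0;
    return n / 2.0 + T(n - 5);
}

int main(void)
{
    for (int n = 100; n <= 100000; n *= 10) {
        long q = n / 5;                         /* n = 5q + r */
        double leading = 1.25 * (double)q * q;  /* the 5/4 * q^2 term */
        printf("n = %6d  T(n) = %12.1f  5/4*q^2 = %12.1f  ratio = %.3f\n",
               n, T(n), leading, T(n) / leading);
    }
    return 0;
}

The ratio approaches 1 as n grows, so the 5/4 * q^2 ~ n^2/20 term dominates, consistent with the O(n^2) bound.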
35
votes

One of the best ways I have found to approximate the complexity of a recursive algorithm is to draw the recursion tree. Once you have the recursion tree:

Complexity = length of tree from root node to leaf node * number of leaf nodes
  1. The first function will have a length of n and a single leaf node, so the complexity will be n * 1 = n
  2. The second function will have a length of n/5 and again a single leaf node, so the complexity will be n/5 * 1 = n/5, which is approximated to n

  3. For the third function, since n is divided by 5 on every recursive call, the length of the recursion tree will be log(n) (base 5), and the number of leaf nodes is again 1, so the complexity will be log(n) (base 5) * 1 = log(n) (base 5)

  4. For the fourth function, since every node has two child nodes, the number of leaf nodes will be equal to (2^n) and the length of the recursion tree will be n, so the complexity will be (2^n) * n. But since n is insignificant compared to (2^n), it can be ignored and the complexity can simply be said to be (2^n).

  5. For the fifth function, there are two elements introducing complexity: the complexity introduced by the recursive nature of the function and the complexity introduced by the for loop in each call. Doing the above calculation, the complexity introduced by the recursion is ~ n and the complexity due to the for loop is ~ n, so the total complexity will be n * n.

Note: This is a quick and dirty way of calculating complexity (nothing official!). Would love to hear feedback on this. Thanks.
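
As a quick check of the heuristic, here is an instrumented copy of the fourth function (the depth and leaf counters are my own additions, and the original printf is left out to keep the output short) that measures the depth and the number of leaves and multiplies them:

#include <stdio.h>

static long leaves = 0;     /* calls that hit the base case (leaf nodes) */
static int  depth = 0;      /* current depth of the call tree */
static int  max_depth = 0;  /* length from the root to the deepest leaf */

void recursiveFun4(int n, int m, int o)
{
    depth++;
    if (depth > max_depth)
        max_depth = depth;

    if (n <= 0) {
        leaves++;            /* base case reached: this call is a leaf */
    } else {
        recursiveFun4(n - 1, m + 1, o);
        recursiveFun4(n - 1, m, o + 1);
    }
    depth--;
}

int main(void)
{
    int n = 10;
    recursiveFun4(n, 0, 0);
    printf("depth = %d, leaves = %ld, depth * leaves = %ld\n",
           max_depth, leaves, (long)max_depth * leaves);
    return 0;
}

For n = 10 this reports a depth of 11 and 1024 leaves, i.e. roughly n * 2^n; dropping the insignificant n factor gives the (2^n) stated above.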

14
votes

We can prove it mathematically, which is something I was missing in the above answers.

It can dramatically help you understand how to calculate the complexity of any recursive method. I recommend reading it from top to bottom to fully understand how to do it:

  1. T(n) = T(n-1) + 1. This means that the time it takes for the method to finish equals the time of the same method with n-1, which is T(n-1), plus 1 for the general operations performed outside of T(n-1). Now we find T(n-1) the same way: T(n-1) = T(n-1-1) + 1. It looks like we can now form a pattern. We place the right side of T(n-1) = ... instead of T(n-1) inside T(n) = ..., which gives us: T(n) = T(n-1-1) + 1 + 1, which is T(n) = T(n-2) + 2, or in other words we need to find our missing k: T(n) = T(n-k) + k. The next step is to take n-k and claim that n-k = 1, because at the end of the recursion it takes exactly O(1) when n <= 0. From this simple equation we now know that k = n - 1. Placing k in our final expression T(n) = T(n-k) + k gives us: T(n) = 1 + n - 1, which is exactly n, or O(n).
  2. This is the same as 1. You can test it yourself and see that you get O(n).
  3. T(n) = T(n/5) + 1. As before, the time for this method to finish equals the time of the same method with n/5, which is why it is bounded by T(n/5). Let's find T(n/5) as in 1: T(n/5) = T(n/5/5) + 1, which is T(n/5) = T(n/5^2) + 1. Placing T(n/5) inside T(n) repeatedly gives the final form: T(n) = T(n/5^k) + k. Again as before, n/5^k = 1, which is n = 5^k; this is exactly asking what power of 5 gives us n, and the answer is k = log5(n) (log base 5). Placing our findings in T(n) = T(n/5^k) + k gives: T(n) = 1 + log5(n), which is O(log n).
  4. T(n) = 2T(n-1) + 1. What we have here is basically the same as before, but this time we invoke the method recursively twice, thus we multiply it by 2. Let's find T(n-1) = 2T(n-1-1) + 1, which is T(n-1) = 2T(n-2) + 1. Now place our finding: T(n) = 2(2T(n-2) + 1) + 1, which is T(n) = 2^2 * T(n-2) + 2 + 1, and in general T(n) = 2^k * T(n-k) + 2^k - 1. Let's find k by claiming that n-k = 1, which gives k = n - 1. Placing k: T(n) = 2^(n-1) * T(1) + 2^(n-1) - 1, which is roughly O(2^n) (see the short check after this list).
  5. T(n) = T(n-5) + n + 1. It's almost the same as 2, but now we add n because of the for loop. Let's find T(n-5) = T(n-5-5) + (n-5) + 1; bounding n-5 by n for simplicity, this is at most T(n - 2*5) + n + 1. Placing it: T(n) <= T(n - 2*5) + 2n + 2, and for general k: T(n) <= T(n - 5k) + kn + k. Again, n - 5k = 1, which is n = 5k + 1, so roughly k = n/5. This gives us: T(n) <= T(1) + n^2/5 + n/5, which is O(n^2).
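
As a quick numeric check of item 4 (the loop below is my own sketch, with T(1) = 1 chosen for simplicity): unrolling T(n) = 2T(n-1) + 1 gives exactly 2^n - 1, which is O(2^n).

#include <stdio.h>

int main(void)
{
    /* Unroll T(n) = 2*T(n-1) + 1 starting from T(1) = 1, as in item 4. */
    unsigned long long t = 1;  /* T(1) */
    for (int n = 2; n <= 20; n++) {
        t = 2 * t + 1;
        printf("T(%2d) = %8llu   2^%-2d = %8llu\n", n, t, n, 1ULL << n);
    }
    return 0;
}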

I now recommend reading the rest of the answers, which will give you an even better perspective. Good luck winning those big O's :)

5
votes

The key here is to visualise the call tree. Once you have done that, the complexity is:

nodes of the call tree * complexity of other code in the function

the latter term can be computed the same way we do for a normal iterative function.

In turn, the total number of nodes of a complete call tree is computed as

 # of nodes = (C^L - 1) / (C - 1)    when C > 1
 # of nodes = L                      when C = 1

where C is the number of children of each node and L is the number of levels of the tree (root included).

It is easy to visualise the tree. Start from the first call (the root node), then draw as many children as there are recursive calls in the function. It is also useful to write the parameter passed to each sub-call as the "value" of the node.
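
For example, here is a minimal helper (my own sketch, not part of the answer) that evaluates the node-count formula above:

#include <stdio.h>

/* Number of nodes of a complete call tree with C children per node
   and L levels (root included), per the formula above. */
long tree_nodes(long C, int L)
{
    if (C == 1)
        return L;
    long p = 1;
    for (int i = 0; i < L; i++)
        p *= C;                      /* C^L */
    return (p - 1) / (C - 1);
}

int main(void)
{
    /* e.g. a binary call tree like recursiveFun4's: C = 2 */
    for (int L = 1; L <= 10; L++)
        printf("C = 2, L = %2d -> %ld nodes\n", L, tree_nodes(2, L));
    return 0;
}

With C = 2 the node count roughly doubles with every extra level, which is where the O(2^n) in case 4 below comes from.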

So, in the examples above:

  1. The call tree here is C = 1, L = n+1. The complexity of the rest of the function is O(1). Therefore the total complexity is L * O(1) = (n+1) * O(1) = O(n)
n     level 1
n-1   level 2
n-2   level 3
n-3   level 4
... ~ n levels -> L = n
  2. The call tree here is C = 1, L = n/5. The complexity of the rest of the function is O(1). Therefore the total complexity is L * O(1) = (n/5) * O(1) = O(n)
n
n-5
n-10
n-15
... ~ n/5 levels -> L = n/5
  3. The call tree here is C = 1, L = log5(n). The complexity of the rest of the function is O(1). Therefore the total complexity is L * O(1) = log5(n) * O(1) = O(log(n))
n
n/5
n/5^2
n/5^3
... ~ log5(n) levels -> L = log5(n)
  4. The call tree here is C = 2, L = n. The complexity of the rest of the function is O(1). This time we use the full formula for the number of nodes in the call tree because C > 1. Therefore the total complexity is (C^L - 1)/(C - 1) * O(1) = (2^n - 1) * O(1) = O(2^n).
               n                   level 1
      n-1             n-1          level 2
  n-2     n-2     n-2     n-2      ...
n-3 n-3 n-3 n-3 n-3 n-3 n-3 n-3    ...     
              ...                ~ n levels -> L = n
  5. The call tree here is C = 1, L = n/5. The complexity of the rest of the function is O(n). Therefore the total complexity is L * O(n) = (n/5) * O(n) = O(n^2)
n
n-5
n-10
n-15
... ~ n/5 levels -> L = n/5
1
votes

I see that for the accepted answer (recursiveFun5), some folks are having issues with the explanation, so I'll try to clarify to the best of my knowledge.

  1. The for loop runs n/2 times because at each iteration we increase i (the counter) by 2. Say n = 10: the for loop runs 10/2 = 5 times, i.e. when i is 0, 2, 4, 6 and 8.

  2. In the same regard, n is reduced by 5 on every recursive call, so the recursion runs about n/5 times. Again assume n = 10: the recursive call runs 10/5 = 2 times, i.e. when n is 10 and 5, and then it hits the base case and terminates.

  3. Calculating the total run time: the for loop runs n/2 times for every call of the recursive function. Since the recursive function runs n/5 times (see 2 above), the for loop runs (n/2) * (n/5) = (n^2)/10 times in total, which translates to an overall Big O runtime of O(n^2), ignoring the constant (1/10). (See the trace below for a concrete n = 10 example.)
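
To make point 3 concrete, here is a small trace (the per-call counter and the printf are my additions) showing how many loop iterations each recursive call performs for n = 10:

#include <stdio.h>

int recursiveFun5(int n)
{
    int iterations = 0;
    for (int i = 0; i < n; i += 2) {
        iterations++;        /* stands in for the "do something" work */
    }
    printf("call with n = %2d -> %d loop iterations\n", n, iterations);

    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n-5);
}

int main(void)
{
    recursiveFun5(10);
    return 0;
}

This prints 5, 3 and 0 iterations for the calls with n = 10, 5 and 0; for larger n the total across all calls grows like (n/2) * (n/5), i.e. quadratically.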