Let's look at both the original and the modified function. In the original function, the following work is done:
- A constant amount of work for the base case check.
- O(n) work to count up to the number.
- The work required for a recursive call on an input half the size (a function with this shape is sketched below).
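To make this concrete, here's a hypothetical Python sketch of a function with exactly this cost profile (the original code isn't shown here, so the name and details are made up):

```python
def count_and_recurse(n):
    # Base case check: constant work.
    if n <= 1:
        return 1

    # O(n) work: a loop that counts up to n.
    count = 0
    for _ in range(n):
        count += 1

    # One recursive call on an input half the size.
    return count + count_and_recurse(n // 2)
```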
We can express this as a recurrence relation:
- T(1) = 1
- T(n) = n + T(n / 2)
Let's see what this looks like. We can start expanding this out by noting that
T(n) = n + T(n / 2)
= n + (n / 2 + T(n / 4))
= n + n/2 + T(n / 4)
= n + n/2 + (n / 4 + T(n / 8))
= n + n/2 + n/4 + T(n / 8)
We can start to see a pattern here. After applying the recurrence k times, we get
T(n) = n + n/2 + n/4 + ... + n/2^(k-1) + T(n / 2^k)
Eventually, this stops when n / 2^k = 1. When this happens, we have
T(n) = n + n/2 + n/4 + n/8 + ... + 1
What does this evaluate to? This sum is at most 2n, because the infinite geometric series n + n/2 + n/4 + n/8 + ... converges to 2n; summing only the terms down to 1 gives exactly 2n - 1 when n is a power of two. Consequently, this first function is O(n). We could also have arrived at this conclusion by using the Master Theorem (the per-call work n dominates, so T(n) = Θ(n)), but it's interesting to see this approach as well.
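If you want to sanity-check the closed form, you can evaluate the recurrence directly; this is just an illustrative script, not part of the original code:

```python
def T(n):
    # The recurrence: T(1) = 1, T(n) = n + T(n / 2).
    if n == 1:
        return 1
    return n + T(n // 2)

# For powers of two, T(n) comes out to exactly 2n - 1.
for k in range(5):
    n = 2 ** k
    print(n, T(n), 2 * n - 1)
```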
So now let's look at the new function. This one does:
- A constant amount of work for the base case check.
- The work required for a recursive call on an input half the size (again sketched below).
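Again, a hypothetical sketch of a function with this shape (the name is made up for illustration):

```python
def recurse_only(n):
    # Base case check: constant work.
    if n <= 1:
        return 1

    # Constant work plus one recursive call on an input half the size.
    return 1 + recurse_only(n // 2)
```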
We can write this recurrence as
- T(1) = 1
- T(n) = 1 + T(n / 2)
Using the same trick as before, let's expand out T(n):
T(n) = 1 + T(n / 2)
= 1 + 1 + T(n / 4)
= 1 + 1 + 1 + T(n / 8)
More generally, after expanding out k times, we get
T(n) = k + T(n / 2^k)
This stops when n / 2^k = 1, which happens when k = log2 n (that is, lg n). In this case, we get that
T(n) = lg n + T(1) = lg n + 1
So T(n) = O(lg n) in this case.
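As before, you can check this numerically; an illustrative script, using math.log2 to compute lg n:

```python
import math

def T(n):
    # The recurrence: T(1) = 1, T(n) = 1 + T(n / 2).
    if n == 1:
        return 1
    return 1 + T(n // 2)

# For powers of two, T(n) is exactly lg n + 1.
for k in range(5):
    n = 2 ** k
    print(n, T(n), math.log2(n) + 1)
```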
Hope this helps!