
Recursion Basics

Time Complexity Analysis using Recurrence Relations

As I stated in the previous article, the formulas shared there don't always work, because the time/space used often varies from one recursive call to another, depending on the input.

But, to our rescue, comes another way: using recurrence relations to analyse the time/space complexity of recursive functions.

Using Recurrence Relations

Continuing the example from the previous article, let's say we have a recursive function that computes pow(x, n). You can refer to the code below for a quick refresher:

Code
double myPowHelper(double x, int n) {
    // Base case: x^0 = 1
    if (!n)
        return 1.0;

    // Recurse on n/2, then square the partial answer
    double partialAns = myPowHelper(x, n / 2);
    partialAns *= partialAns;

    // For odd n, one extra factor of x is needed
    return n % 2 ? partialAns * x : partialAns;
}
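
A quick side note: a complete pow(x, n) also needs to handle negative exponents. Below is a minimal sketch of how the helper above might be wrapped to do that. The wrapper name myPow, the use of long long to avoid overflow when negating INT_MIN, and the main() driver are my additions for illustration, not part of the original code.

Code
#include <iostream>

// Same helper as above, just taking long long so the wrapper
// can safely negate the exponent (handles INT_MIN).
double myPowHelper(double x, long long n) {
    if (!n)
        return 1.0;

    double partialAns = myPowHelper(x, n / 2);
    partialAns *= partialAns;

    return n % 2 ? partialAns * x : partialAns;
}

// Illustrative wrapper: for a negative exponent,
// pow(x, n) = pow(1/x, -n).
double myPow(double x, int n) {
    long long N = n;
    return N >= 0 ? myPowHelper(x, N) : myPowHelper(1.0 / x, -N);
}

int main() {
    std::cout << myPow(2.0, 10) << "\n";  // prints 1024
    std::cout << myPow(2.0, -2) << "\n";  // prints 0.25
}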

One thing to notice: although the function has 2 parameters, i.e. x & n, the time and space complexity depend only on n, because n is the parameter driving the recursive calls; x stays the same in every call.

Finding out the time complexity

Now, we initially don't know what the overall time taken will be; what we do know is that it'll be some function of N. So, let's assume the time complexity is T(N).

T(N) could be O(N), O(N²), O(N log N), O(log N) or whatever, we don't know as of now. We've just given a name to it, i.e. T(N).

Now, if we look at the code in the myPowHelper(x, n) function, it's basically doing a few O(1) operations, and then making a single recursive call to myPowHelper(x, n/2), right?

Can I say that T(N) = T(N/2) + O(1)? Because:

  1. The overall time taken will simply be the time taken by the recursive call myPowHelper(x, n/2), plus some more constant time.
  2. If the time taken by myPowHelper(x, n) is T(N), then the time taken by myPowHelper(x, n/2) can be written as T(N/2).

Now, our target is to find the value of T(N), but this pesky T(N/2) keeps showing up in the equation. No problem, let's substitute the value of T(N/2). The idea is that if T(N) = T(N/2) + O(1), then by the same logic, T(N/2) = T(N/4) + O(1), then T(N/4) = T(N/8) + O(1), and so on and so forth.

Basically:

T(N) = T(N/2) + O(1)
T(N/2) = T(N/4) + O(1)
T(N/4) = T(N/8) + O(1)
......
......
T(1) = T(0) + O(1)

If we add up all the equations written above, almost all the intermediate terms like T(N/2), T(N/4), T(N/8), etc. cancel out from both sides, and all we're left with is:

T(N) = T(0) + O(1) + O(1) + O(1) + ..... + O(1)

Now, 2 things here:

  1. T(0) will be O(1) only, as it is the base case.
  2. The number of times O(1) appears will be O(log₂N). (Think why?)

Putting both of the things above together, T(N) = O(log N).
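
If you want to sanity-check this result empirically, here's a small sketch (the global callCount counter and the driver loop are my own, purely for illustration) that counts how many times the helper is called for various values of n:

Code
#include <iostream>

long long callCount = 0;  // counts recursive calls (illustrative)

double myPowHelper(double x, long long n) {
    ++callCount;  // one unit of O(1) work per call
    if (!n)
        return 1.0;

    double partialAns = myPowHelper(x, n / 2);
    partialAns *= partialAns;

    return n % 2 ? partialAns * x : partialAns;
}

int main() {
    for (long long n : {1LL, 16LL, 1024LL, 1000000LL}) {
        callCount = 0;
        myPowHelper(2.0, n);
        std::cout << "n = " << n << " -> " << callCount << " calls\n";
    }
}

The counts come out to roughly log₂(N) + 2, which matches T(N) = O(log N).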

Try the following

  1. T(N) = T(N - 1) + O(1)
  2. T(N) = 2*T(N/2) + O(1) [good one]

Answers

  1. T(N) = O(N) (the O(1) term piles up once per level, N levels in total)
  2. T(N) = O(N), and surprisingly not O(N log N); see the sketch below
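
If the answers surprise you, here's a minimal sketch of two hypothetical functions (countdown and countCalls are illustrative names of my own) whose running times follow exactly these recurrences; counting the recursive calls makes the answers concrete:

Code
#include <iostream>

// 1. T(N) = T(N - 1) + O(1): one recursive call per level,
//    N levels deep, so the O(1) work adds up N times -> O(N).
long long countdown(long long n) {
    if (n == 0)
        return 0;
    return 1 + countdown(n - 1);
}

// 2. T(N) = 2*T(N/2) + O(1): the call tree has about N leaves
//    and about 2N - 1 nodes in total, so T(N) = O(N),
//    not O(N log N) as one might first guess.
long long countCalls(long long n) {
    if (n <= 1)
        return 1;
    long long half = n / 2;
    return 1 + countCalls(half) + countCalls(n - half);
}

int main() {
    std::cout << countdown(10) << "\n";    // 10 levels of O(1) work
    std::cout << countCalls(1024) << "\n"; // 2047 = 2*1024 - 1 calls
}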

Conclusion

With the handy trick shown above (the substitution method), the time/space complexity of recursive functions can be found.

So, with this article, we come to the end of Recursion Basics. We'll dive deeper into recursion in later parts, but next up, we'll start with sorting techniques.