Recursion: Time & Space Complexity Analysis - 1
To understand the time & space complexity analysis, let's first go back and do as our home minister would say: understand the chronology first.
What I mean is to recall/understand in what order, and how, the recursive calls are made. This will help us understand the difference between time & space, and maybe make us value time more.
Laying the foundation
Imagine an example where we have an array and we need to search for an element in it. What we're basically doing is dividing the array indices into two equal halves (or almost equal halves, if the array length is odd), and making recursive calls on both the left & right halves to try to find the element.
bool findElement(int arr[], int l, int r, int target) {
    if (l > r) // no elements in the range
        return false;
    // if only one element, simply check whether that
    // element is equal to the target
    if (l == r)
        return arr[l] == target;
    int mid = (l + r) / 2;
    return findElement(arr, l, mid, target) || findElement(arr, mid + 1, r, target);
}
Now, let's say that we have arr = [4, 1, 4, 10, 2, 8, 10] and the target is 10. If we make the function call to find it in the array, this is how the recursion tree will look:
Diving into Space & Time
Understanding the difference
One fundamental difference that I'd like to point out before moving forward is that time, once gone, doesn't come back. But if you delete that large "video" that has been taking up 5 GB, the space does come back for you to be able to use it. #LifeLessons
Coming to recursion, what I mean to say is whatever time is used up by any of the recursive calls, adds up when calculating the overall time complexity.
But when it comes to space, that's not always the case. Only the space taken by the recursive calls simultaneously piled up in the system adds up. E.g. in the above case, the space taken by the recursive call [l=0, r=1] gets freed as soon as it returns its answer to [l=0, r=3]. So the space taken by [l=0, r=1] and [l=2, r=3] doesn't get added up, because the recursive call [l=0, r=1] had already freed the required space even before the recursive call [l=2, r=3] was made.
Understanding the crux
If we think from first principles, we'll be able to identify that the overall time taken by a recursive function is simply equal to the time taken by all the recursive calls combined.
In the case of space, however, we look across all the root-to-leaf paths of the recursion tree: whichever path takes the most space in total determines the space complexity.
In layman's terms, it is generally said that:
- Time Complexity = num_recursive_calls * num_operations_per_call
- Space Complexity = depth_of_recursion_tree * space_per_recursive_call
Having said that, it's not always easy to analyze things with this quick formula, because the space/time taken by a recursive call is often not constant and changes according to the input passed to the call.
We'll see examples of this soon while writing recursive functions for merge sort and quick sort in the coming articles.
Time Complexity for the above example
If we look carefully, we aren't doing much in the recursive code in addition to making the left & right recursive calls. So essentially, we're taking O(1) time per recursive call. Now, if we calculate the number of recursive calls required to be made in the worst case, we'll get our overall time complexity.
To analyze the total number of recursive calls, think about the number of nodes in the last level of the recursion tree (in the worst case). Don't you think it can go up to n nodes, in the case where the target is nowhere to be found in the array?
Now, what about the second last level? Think carefully and you'll see that it can go up to n/2 nodes in the worst case.
Similarly, n/4 nodes for the third last level. I hope you get the drill by now. If we try to calculate the total number of terms, it will approximately be equal to:
num_recursive_calls = N + N/2 + N/4 + N/8 + ..... + 2 + 1
Solving the above Geometric Progression gives a sum of roughly 2N (more precisely, 2N - 1 calls). Combining that with the formula we learned above leads to a time complexity of O(N).
Space Complexity for the above example
As you must have realized while thinking about the time complexity, the depth of the recursion tree is approximately log₂N, i.e. O(LogN). And the extra space taken per recursive call is constant, since we aren't creating any array per call — just the parameters and a single int (mid).
Following the formula above, the space complexity will come out to be O(LogN).
Related Problems
Here are some more problems related to this -
- Getting a hang of Recursion
- Introduction to Recursion - Understand Recursion by Examples
- Most Asked Recursion Practice Problems with Solutions
- Fast Exponentiation Algorithm: Efficient Power Calculation Method
- Power of Three: Understanding and Solving the Problem Efficiently
- Recursion: Time & Space Complexity Analysis - 2
- Time Complexity Analysis Using Recurrence Relations