Time & Space Complexity Analysis
To understand time & space complexity analysis, let's first go back and, as our home minister would say, understand the chronology first.
What I mean is to recall in what order, and how, the recursive calls are made. This will help us understand the difference between time & space, and maybe make us value time more.
Laying the foundation
Imagine an example where we have an array and we need to search for an element in it. What we're basically doing is dividing the array indices into 2 equal halves (or almost equal halves if the array length is odd), and making recursive calls on the left & right halves to try to find the element. (Note that, thanks to short-circuiting, the right half is only explored if the left half doesn't find the target.)
bool findElement(int arr[], int l, int r, int target) {
    if (l > r) // no elements in the range
        return false;
    // if only one element, simply check if that
    // element is equal to the target
    if (l == r)
        return arr[l] == target;
    int mid = (l + r) / 2;
    return findElement(arr, l, mid, target) || findElement(arr, mid + 1, r, target);
}
Now, let's say that we have arr = [4, 1, 4, 10, 2, 8, 10] and the target is 10. If we make the function call to find it in the array, this is what the recursion tree will look like:
Diving into Space & Time
Understanding the difference
One fundamental difference that I'd like to point out before moving forward is that time, once gone, doesn't come back. But if you delete that large "video" that has been taking up 5 GB, the space does come back for you to be able to use it. #LifeLessons
Coming back to recursion, what I mean to say is that the time used up by every recursive call adds up when calculating the overall time complexity.
But when it comes to space, that's not always the case. Only the space taken by the recursive calls piled up on the call stack at the same time gets added up. For example, in the above case, the space taken by the recursive call [l=0, r=1] gets freed up as soon as it returns its answer to [l=0, r=3]. So the space taken by [l=0, r=1] and [l=2, r=3] doesn't add up, because [l=0, r=1] had already freed its space before the call [l=2, r=3] was even made.
Understanding the crux
If we think from first principles, we'll see that the overall time taken by a recursive function is simply the sum of the time taken by all the recursive calls combined.
In the case of space, however, we look at the root-to-leaf paths of the recursion tree: the path that takes the most space in total is what determines the space complexity.
In layman's terms, it is generally said that:
- Time Complexity = num_recursive_calls * num_operations_per_call
- Space Complexity = depth_of_recursion_tree * space_per_recursive_call
Having said that, it's not always easy to analyse things with this quick formula, because the space/time taken by a recursive call is often not constant and changes according to the input passed to the call.
We'll see examples of this soon while writing recursive functions for merge sort and quick sort in the coming articles.
Time Complexity for the above example
If we look carefully, we aren't really doing much in the recursive code in addition to making the left & right recursive calls. So essentially, we're taking O(1) time per recursive call. Now, if we calculate the number of recursive calls required to be made in the worst case, we'll get our overall time complexity.
To analyse the total number of recursive calls, think about the number of nodes in the last level of the recursion tree (in the worst case). Don't you think it can go up to n nodes, in the case where the target is nowhere to be found in the array?
Now, what about the second-last level? Think carefully and you'll see that it can go up to n/2 nodes in the worst case.
Similarly, n/4 nodes for the third-last level. I hope you get the drill by now. If we add up the calls across all the levels, the total will approximately be equal to:
num_recursive_calls = N + N/2 + N/4 + N/8 + ... + 2 + 1
Solving the above Geometric Progression gives roughly 2N - 1 recursive calls in total. Combining that with the formula we learnt above (O(1) operations per call) leads to a time complexity of O(N).
Space Complexity for the above example
As you must have realised while thinking about the time complexity, the depth of the recursion tree will be approximately LogN [to the base 2], and the extra space taken per recursive call is constant, since we aren't creating any arrays, just a couple of primitive variables like mid.
Following the formula above, the space complexity will come out to be O(LogN).