The first function:

    def f1(n):
        if n == 1:
            return 1
        return f1(f1(n-1))
The second function:

    def f2(n):
        if n == 1:
            return 1
        return 1 + f2(f2(n-1))
Now, I can see why both functions have space complexity O(n), since the recursion depth is equal to n.
But for the time complexity, I can't set up the calculation the way I would for ordinary recursion. Say that instead of f1(f1(n-1)) we had f1(n-1) in the first function; then T(n) = 1 + T(n-1) = 2 + T(n-2) = ... = n, so O(n). I can intuitively see that the result might stay the same for f1, since every call returns 1, so I might end up with roughly 2n calls, which is O(n), but I have no idea how to handle it formally.
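To at least back that guess empirically, I tried instrumenting f1 with a global call counter (the calls variable below is just scaffolding I added for this experiment, not part of the original function):

    calls = 0

    def f1(n):
        global calls
        calls += 1          # count every invocation of f1
        if n == 1:
            return 1
        return f1(f1(n-1))  # inner call returns 1, so the outer call hits the base case

    for n in range(1, 8):
        calls = 0
        f1(n)
        print(n, calls)     # call counts: 1, 3, 5, 7, ... i.e. 2n - 1

The count comes out to 2n - 1, which fits the O(n) guess, but I still don't know how to justify it formally.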
For f2 I couldn't figure out how to reach the time complexity at all; intuition failed me here, and I would really appreciate any help with how to analyze these recursive calls.
The final answers are:

- f1: time complexity O(n), space complexity O(n).
- f2: time complexity O(2^n), space complexity O(n).
Answer
As I think you realize, in f1 you're performing two operations:

1. Call f1(n - 1).
2. Call f1 again with the result of the first operation.
You can see that calling f1 with anything (or at least anything greater than zero) returns 1. So in step 2 you're just calling f1 again with 1, which returns immediately in constant time. That gives T(n) = T(n-1) + O(1), which is O(n).
On to f2. Let's examine the behavior of this function.

- When n = 1, we return 1.
- When n = 2, we return 1 + f2(f2(1)), which is 1 + f2(1). We know that f2(1) returns 1, so f2(2) returns 2.
- When n = 3, we return 1 + f2(2), so 3.
- Etc.
So it looks like f2(n) just returns n.
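A quick way to sanity-check that claim (repeating the definition from the question so the snippet runs on its own):

    def f2(n):
        if n == 1:
            return 1
        return 1 + f2(f2(n-1))

    # f2 behaves like the identity on every small input we try
    for n in range(1, 10):
        assert f2(n) == n

More formally, it follows by induction: if f2(n-1) = n-1, then f2(n) = 1 + f2(f2(n-1)) = 1 + f2(n-1) = n.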
Every time we call f2 we're doing the same things we did in f1: call the function with n - 1, then call it again with whatever that first call returned. But since f2(n - 1) returns n - 1, the second call is effectively f2(n - 1) again. In other words, we're calling f2 twice with n - 1. Each increment of n doubles the number of calls, giving T(n) = 2T(n-1) + O(1), so O(2^n).
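If you want to see the doubling concretely, the same call-counting instrumentation from the question applied to f2 (again, the calls counter is just scaffolding, not part of the function) shows the count growing as 2^n - 1:

    calls = 0

    def f2(n):
        global calls
        calls += 1               # count every invocation of f2
        if n == 1:
            return 1
        return 1 + f2(f2(n-1))   # f2(n-1) returns n - 1, so both calls receive n - 1

    for n in range(1, 8):
        calls = 0
        f2(n)
        print(n, calls)          # call counts: 1, 3, 7, 15, ... i.e. 2**n - 1

Each step doubles the previous count and adds one, matching T(n) = 2T(n-1) + 1 and hence O(2^n), even though the recursion depth (and therefore the space) stays at O(n).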