But in fact you ignored the boundary cases.
Let's use an example to show what's wrong with your approach.
Take T(4), using the method you mentioned:
T(4) = T(3) + T(2)
     = T(2) + T(1) + T(1) + T(0) // in this step the number of T terms does double
     = T(1) + T(0) + T(1) + T(1) + T(0) // but in this step T(1) and T(0) cannot be split further, so the number of T terms does not double — that is why your method fails
In addition, in big O notation, the running time is about O(1.618^n)
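To make the unbalanced tree visible, here is a small sketch (my own illustration, not code from any answer in this thread) that tallies the recursion-tree nodes of the naive Fibonacci at each depth:

```python
# Illustration only: walk the recursion tree of the naive Fibonacci
# T(n) = T(n-1) + T(n-2) and count how many nodes sit at each depth.
from collections import Counter

def count_levels(n, depth=0, levels=None):
    if levels is None:
        levels = Counter()
    levels[depth] += 1          # this call is one node of the tree
    if n >= 2:                  # T(0) and T(1) are leaves: no further split
        count_levels(n - 1, depth + 1, levels)
        count_levels(n - 2, depth + 1, levels)
    return levels

levels = count_levels(4)
print([levels[d] for d in sorted(levels)])  # prints [1, 2, 4, 2]
```

The last level has only 2 nodes, not 8: once T(1) and T(0) appear they cannot be split, so the per-level count stops doubling.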
- ghost89413 November 05, 2013
It's correct that the program runs in exponential time. But you ignored the fact that the recursion tree is not balanced.
If you compute f(n), the recursion tree has f(n-1) on the left and f(n-2) on the right. However, the heights of these two subtrees are not equal, and that is why your answer is wrong.
The correct approach is to define the running time of f(n) as T(n).
Then we have T(n) = T(n-1) + T(n-2),
with T(n) = 1 for n < 2.
So Flux's answer is more accurate than yours. In fact, with those base cases the running time is a shifted Fibonacci number (T(n) = F(n+1)).
That is still an exponential function, but it is much smaller than 2^n as n gets large.
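As a quick numerical check (a throwaway sketch of my own, not anyone's posted solution), one can evaluate the recurrence T(n) = T(n-1) + T(n-2) with T(n) = 1 for n < 2 and compare it against 2^n:

```python
# Illustration only: iterate the recurrence T(n) = T(n-1) + T(n-2),
# T(0) = T(1) = 1, and compare it with 2^n.
def T(n):
    a, b = 1, 1                 # T(0), T(1)
    for _ in range(n - 1):
        a, b = b, a + b
    return b if n else a

for n in (10, 20, 30):
    print(n, T(n), 2 ** n, T(n) / 2 ** n)
```

The ratio T(n)/2^n shrinks toward zero as n grows, which is exactly what "much smaller than 2^n" means here.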
Yeah, I agree your big O notation is right.
- ghost89413 November 06, 2013
But we know that the solution of the recurrence T(n) = T(n-1) + T(n-2)
is T(n) = Θ(φ^n), where φ = (1 + √5)/2 ≈ 1.618 is the golden ratio.
And I think your big O notation is too loose.
In fact 2^n grows much, much faster than φ^n — the two functions are not in the same order of growth.
Granted, there's no problem with saying T(n) = O(f(n)) for any f(n) growing at least as fast as φ^n. But in algorithm analysis, what we want is a tight bound.
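A small sketch of my own (not from the thread) showing the two bounds differ in order of growth: the ratio 2^n / φ^n = (2/φ)^n itself diverges, since 2/φ ≈ 1.236 > 1.

```python
# Illustration only: 2^n / phi^n = (2/phi)^n grows without bound,
# so O(2^n) is a strictly looser bound than Theta(phi^n).
phi = (1 + 5 ** 0.5) / 2        # golden ratio, about 1.618
for n in (10, 50, 100):
    print(n, (2 / phi) ** n)
```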
In addition, your analysis is doubtful, even though your conclusion is mathematically correct.