Microsoft Interview Question
Software Engineer / Developer
Team: Autopilot
Country: United States
Interview Type: Phone Interview
As ghost89413 noted above, there is a recurrence for the running time:
T(n) = T(n-1) + T(n-2)
For n < 2, T(n) = 1
Which is Fib(n).
Another intuitive approach is to notice that there is no memoisation or caching of function results, so the final result is a sum of 1s (if you "flatten" the recursion tree). Fib(n) consists of Fib(n) ones, so the complexity is Fib(n).
The complexity of a naive recursive Fibonacci is indeed O(2^n).
T(n<=1) = O(1) // Understood
T(n) = T(n-1) + T(n-2) + O(1)
Assume T(n-1) = O(2^(n-1)); therefore
T(n) = T(n-1) + T(n-2) + O(1), which is equal to
T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1) = O(2^n)
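Spelling out the unrolling behind this bound, with c standing for the constant hidden in O(1) (and using the looser inequality T(n-2) <= T(n-1)):

```
T(n) <= 2 T(n-1) + c
     <= 4 T(n-2) + 3c
     <= ...
     <= 2^k T(n-k) + (2^k - 1) c
     <= 2^n T(0) + (2^n - 1) c = O(2^n)
```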
Here is another approach ...
T(n) = T(n-1) + T(n-2) = T(n-2) + T(n-3) + T(n-3) + T(n-4) =
= T(n-3) + T(n-4) + T(n-4) + T(n-5) + T(n-4) + T(n-5) + T(n-5) + T(n-6) = ...
at each step every call to T spawns two more calls, which gives an eventual asymptotic upper bound of T(n) = 2 * 2 * ... * 2 = O(2^n)
But you ignored the boundary, in fact.
Let's use an example to show what's wrong with your approach.
For example T(4), use the method you mentioned
T(4) = T(3) + T(2) =
T(2) + T(1) + T(1) + T(0) // in this step the number of T terms does double, of course
= T(1) + T(0) + T(1) + T(1) + T(0) // but in this step you cannot split T(1) and T(0) into two parts each, so the number of T terms does not double; that is why your method fails
First of all, we have the boundary T(n < 2) = O(1) (ignore T(n <= 1) = O(1)).
Second, here I am calculating big-O notation, not the exact number of iterations.
So if in some case we have one step less or more, we will still consider it O(2^n).
Like in your test case it is 6 - 1 = 5; still its complexity will be O(2^n).
Yea, I agree your big O notation is right.
But we know the solution of recursion equation T(n) = T(n-1) + T(n-2)
is T(n) = \Theta(\phi^n)
And I think your big-O bound is too loose.
In fact 2^n grows much, much faster than \phi^n; the two functions are not in the same order of growth.
Anyway, there's no problem with saying T(n) = O(f(n)) for any f(n) growing faster than \phi^n. But in algorithm analysis what we want is a tight bound.
In addition, your analysis is doubtful, although your conclusion is correct mathematically.
Where is the proof that "T(n) = T(n-1) + T(n-2)" is a recurrence for the running time of the naïve fib implementation?
Let's start there. Just because the math formula looks like a recurrence doesn't necessarily mean we can change f to T and be done with it.
This is the right answer.
@urik: Pretty simple answer.
If T(n) is the time taken, then T(n) = T(n-1) + T(n-2), with T(1), T(2) some non-zero constants.
This gives T(n) = Theta(phi^n), phi being the golden ratio.
OK, I trust that you can prove it.
But what you just said has some subtle yet significant holes... Not important for CC, though.
@urik.
Instead of just stating that there are holes, please just state them. I am curious what subtle and significant holes you found and would like to improve my understanding, for, if the holes are significant, my statements could be wrong. Thanks.
Note: the statement that the runtime is Fibonacci itself is not accurate.
More accurate is Fibonacci-like, i.e. it satisfies the same recurrence but might have different exact values. Asymptotically they are the same: Theta(phi^n).
The proof is standard.
The recurrence T(n) = T(n-1) + T(n-2) satisfies T(n) = Ax^n + By^n where x and y are roots of x^2 = x + 1, A and B are constants depending on T(1) and T(2).
x = phi and y = -1/phi. Thus T(n) = Theta(x^n) (as the y^n term is negligible).
@urik:
To be completely accurate, the runtime recurrence is
T(n) = T(n-1) + T(n-2) + O(1)
The solution to this is still Theta(phi^n).
To see this
in the recurrence H(n) =H(n-1) +H(n-2) + C, substitute G(n) = H(n) + C, we get
G(n) - C = G(n-1) -C + G(n-2) -C + C i.e. G(n) = G(n-1) + G(n-2), and thus G(n) = Theta(phi^n) and so H(n) = Theta(phi^n).
This can be used to show that T(n) = O(phi^n) and Omega(phi^n).
I would say this hole is neither subtle nor significant.
More intuitively: adding a constant-time overhead will not change the asymptotic complexity of a recursive method...
Haha. You just popped up in this thread (well, the other characters went quiet and now an Anonymous handle is trying to prove a point).
Good troll.
Building turtles on turtles to make something more accurate does not justify the accuracy of the initial reasoning. I'm stranded without computer access, so for now I'll just respond to your very last statement just above...
"More intuitively..." What do you mean here?
Assume n is a power of 2, for convenience.
F(n) = F(n/2) for n>1
F(1) = C
=> ?
Vs.
F(n) = F(n/2) + C_0
F(1) = C
=> ?
Hmmmm, it seems intuition has failed. We can go back and change "what we meant" a few times though, right?
@anon
Just read some of your recent replies carefully and upvoted one (the one with the best iteration of the mathematics, with G(n)).
Do note you are improving the reasoning, not simply "explaining things that were obvious at first".
And who has been mass-downvoting ghost, ajeet, yoda, anon, etc. in this thread? A good debate is ongoing; stop downvoting for no reason.
@Brainless:
Yes, the O(1) term might be relevant in some cases. Intuition != proof, which is why I even provided one in the first place. Besides, the statement was more about the constant in O(1).
To apply the intuition and earlier statements with the O(1) term, try this unary version of fibonacci:
void fib(int n) {
    if (n == 1) { printf("1"); return; }
    if (n == 2) { printf("11"); return; }
    fib(n - 1);
    fib(n - 2);
}
Anyway, I found it obvious, you didn't. Happens. I will probably be wrong more often, so be it.
Lol, so wait... we end with you giving me advice?
Your fancy and "obvious" conjecture (which came after two mathematical-symbol-chugging replies by you) was easily proven wrong in my last reply. And I just picked the very last thing you said.
You gave up too easily, man...
So my main points scattered throughout this thread were
1) Ajeet and yoda were correct, and their upper bound is the easiest to prove.
2) Using the fib math recurrence (whether you keep the base cases from the math definition directly or use C or O(1) matters little... this is what you call a "Fibonacci-like" recurrence, as one of the intermediate improvements of your argument) DOES NOT automatically become the running-time recurrence for the naïve math-translated-to-programming recursive algorithm.
(In fact, the "math-recurrence-like" running-time recurrence only gives the running times of the base cases, under normal assumptions.)
This SHOULD be obvious (not the bits you claim to be obvious).
Changing f to T (whether you also change the base case to C or not is of little relevance here, as the base cases will be the only costs counted) gives neither a tight bound nor an upper bound on the running time. Should be obvious.
So, in general, it will give an Omega lower bound on the total running time only, as it only gathers cost from the base-case invocations.
But in the case of fib, the recursion tree spreads out exponentially (you can see this in one of many ways), in a way that there are Theta(phi^n) base-case invocations themselves.
So simply counting the runtimes of the base-case code gives us an Omega(phi^n) runtime for naive fib(n). This is good enough for CC and interviews.
(Incidentally, to count all costs you can prove that the recursion tree has at most O(number of base-case nodes) internal nodes, as ivan "kind of" did to prove the big-O 2^n bound.
He did it (or meant to) by showing that a balanced binary recursion tree has basically the same number of internal nodes as base-case nodes.)
(Warning: everything above needs adjustment if the base-case and internal function bodies aren't both constant running time.)
None of this is as obvious as just changing f to T.
Some typos above, including:
*num-internal-nodes is O(num-base-case-nodes)
Having no computer access sucks.
@BU.
There is no "change f to T" going on.
It is based on the single recursive calls to f(n-1) and f(n-2).
If we were computing the recurrence F(n) = 1000*F(n-1) + F(n-2) the runtime recurrence would still be T(n) = T(n-1) + T(n-2) + O(1).
Don't be such an idiot.
^^^^ Fact that you said that nonsense proves that you are responsible for many of the handles/names that posted earlier. Multiple personality syndrome?
Idiot, we're replying back and forth to "Time complexity of Fibonacci function evaluation is Fibonacci function itself."
You keep talking about slightly varying, unrelated things to troll on. F*ck off.
@Urik. Fucking idiot, that statement is correct. If F_n is the nth Fibonacci number, then Theta(phi^n) = Theta(F_n), and so that statement is completely accurate.
Go back to your textbook and try to understand something this time.
Your inability to get a better bound than O(2^n), while the tight bound is obvious to many, is just irking you, isn't it?
Deal with it, dodo.
By the way, your attempts to save face were hilarious. Perhaps you can have a career in comedy (it is quite clear computer science/mathematics is not for you).
I don't understand Urik's/Trollcity's objections, TBH. The proof given by anonymous looks perfect to me.
What fallacy do you see Urik/Trollcity? Can you please explain?
Anonymous has no objections to his own final version of the proof. The final proof looks good, except that it calls some steps obvious; the "obvious" part was proven wrong with an example.
Already upvoted by me now. Stop talking to yourself over and over. I know it is one person always talking to himself in most of the postings these days.
-Guest DS
Algos. Utterly ignorantos. Fuck off. You are disturbing my conversation with myself.
Hehe.
The proof by Anonymous beats the urik, ajeet, flux, and ghost proofs. Good proof, Anon.
As mentioned earlier, the complexity is exponential. The reason is that the implementation is memoryless, so you keep evaluating the same terms over and over.
Let's assume the cost of a comparison or an addition is 1 unit.
Then if T(n) is the computation cost, we have
T(n) = T(n-1) + T(n-2) + 2 = 2 T(n-2) + T(n-3) + 4 > 2 T(n-2) > 4 T(n-4) > 8 T(n-6) > ... > 2^k T(n - 2k) = Omega(2^(n/2)).
Look at the function and you can see why this is right. For each number, you recursively call the function two more times.
So the first call turns into 2 additional calls, then each of those calls spawns another 2 calls, and so on:
1 -> 2 -> 4 -> 8 -> ...
So the number of calls roughly doubles at every level of the recursion, giving, as stated above, O(2^n).
It's correct that the program runs in exponential time. But in fact you ignore that the recursion tree is not balanced.
If you compute f(n), the recursion tree has f(n-1) on the left and f(n-2) on the right. However, the heights of these two subtrees are not equal, and that is why your answer is wrong.
The correct approach is to define the running time of f(n) as T(n)
Then we have T(n) = T(n-1) + T(n-2)
For n < 2, T(n) = 1
So Flux's answer is more accurate than yours. In fact, the running time is a Lucas number.
The Lucas numbers are also an exponential sequence, but much smaller than 2^n when n gets large.
The number of function calls for the n-th Fibonacci number is
1 + 2 + 2^2 + 2^3 + ... + 2^(n-2), which is actually 2^(n-1) - 1;
the asymptotic behavior of this expression as n -> infinity
is obviously O(2^n).
Two assumptions are implicit in your proof.
First, you are proving the growth of the Fibonacci sequence itself.
Second, you proved an upper bound on said sequence.
But the upper bound is fairly loose, so it should also be a valid upper bound for the naïve fib implementation itself.
I have not seen a proof that the programmed fib function (naïve recursion) is big-O of phi^n in any of the other replies.
It would be quite a neat proof if someone could provide it.
Actually, your proof, with some clean-up, will prove the 2^n upper bound on the recursive fib program directly (or you can use it to prove that upper bound on the fib sequence and modify it to cover the recursive program).
So this reiterates the earlier fellows' correct answer of big-O 2^n.
One can prove a lower bound of phi^n also.
Still open is a tight Theta bound.
No proof of big-O phi^n in this thread yet.
Time complexity of Fibonacci function evaluation is Fibonacci function itself.
- Flux November 05, 2013