Interview Question
Country: India
Agreed... and if you make the input large enough it should reveal the asymptotic complexity despite shared resources.
And if you're really interested, put it on a dedicated test box and make sure nothing else is running.
So for recursive algorithms (such as binary search) you can apply the Master Method (Master Theorem).
T(n) = aT(n/b) + O(n^d)
The constants a, b, and d in this recurrence determine which of three running-time cases applies.
Case 1: a = b^d, runtime: O(n^d log n)
Case 2: a < b^d, runtime: O(n^d)
Case 3: a > b^d, runtime: O(n^(log_b a)) (that is, n to the power log base b of a)
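The three cases can be checked mechanically. A small sketch (the helper name `master_theorem` is mine, not a standard library function):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) by comparing a with b**d."""
    if a == b ** d:
        # Case 1: work is balanced across recursion levels
        return f"O(n^{d} log n)" if d > 0 else "O(log n)"
    elif a < b ** d:
        # Case 2: the work at the root dominates
        return f"O(n^{d})"
    else:
        # Case 3: the work at the leaves dominates
        return f"O(n^{math.log(a, b):.2f})"
```

For example, `master_theorem(1, 2, 0)` (binary search) returns `O(log n)`, and `master_theorem(8, 2, 2)` (naive recursive matrix multiplication) falls into Case 3.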
Constant 'a' is the number of recursive calls in the function. For binary search there is a single recursive call, so this is a 1.
Constant 'b' is the rate at which the recursive call reduces the main problem into sub-problems. In binary search, it splits the array in half, so this would be a 2.
Constant 'd' is the exponent of the work done outside the recursive calls, i.e., the O(n^d) term. For binary search this is 0, since each call does only a constant-time index comparison: O(n^0) = O(1).
So for the specific case of binary search, you can see 1 = 2^0, which is 1 = 1. This puts us in Case 1, so binary search runs in O(n^0 log n), which simplifies to O(log n).
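To make the constants concrete, here is a recursive binary search sketch with the Master Theorem parameters annotated:

```python
def binary_search(arr, target, lo=0, hi=None):
    """T(n) = 1*T(n/2) + O(1): a = 1, b = 2, d = 0 => O(log n)."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                      # base case: empty range
        return -1
    mid = (lo + hi) // 2             # constant work per call (d = 0)
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        # exactly one recursive call (a = 1), on half the range (b = 2)
        return binary_search(arr, target, mid + 1, hi)
    else:
        return binary_search(arr, target, lo, mid - 1)
```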
Note that the Master Theorem only applies to recurrences of this divide-and-conquer form. You can see the profound impact of performance improvements when analyzing the efficiency of algorithms such as Strassen's matrix multiplication algorithm (O(n^2.81)) vs. the standard O(n^3) algorithm. The gains of getting sub-cubic are substantial, particularly in matrix-intensive applications.
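A quick back-of-the-envelope check of how much sub-cubic buys you (nominal operation counts only, ignoring constant factors, which in practice favor the standard algorithm at small n):

```python
# Ratio of standard O(n^3) to Strassen's O(n^2.81) operation counts
# for a 4096 x 4096 matrix: n^3 / n^2.81 = n^0.19
n = 4096
ratio = n ** 3 / n ** 2.81
print(f"nominal speedup at n={n}: {ratio:.1f}x")
```

The ratio grows with n, which is why the improvement matters most for large matrices.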
I can't think of a better way of measuring the running time than running the program multiple times. Pick a large N and also measure at 2N; don't bother with tiny inputs. Run the test many times for both N and 2N and take the best-case running time of each; with enough runs, the best times stabilize. Then you can estimate the complexity by comparing the best running times at N and 2N, e.g.,
if T(2N) ~= T(N) => O(1)
if T(2N) ~= 2T(N) => O(n)
The other alternatives I can think of:
1. Look at the source, if it is an open source library.
2. For binary versions, profilers may help, if the library is built with profile/debug flags enabled.
3. In Java, you can decompile the byte-code back into source, then analyze the source.
- oOZz July 09, 2013
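The doubling heuristic above can be sketched as follows (the function names are mine; the ratio thresholds are loose because wall-clock timings are noisy):

```python
import time

def best_time(fn, arg, runs=20):
    """Run fn(arg) several times and keep the best (minimum) wall-clock time."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        fn(arg)
        best = min(best, time.perf_counter() - start)
    return best

def guess_complexity(fn, n):
    """Compare best times at n and 2n: a ratio near 1 suggests O(1) or
    O(log n), near 2 suggests O(n), near 4 suggests O(n^2)."""
    ratio = best_time(fn, 2 * n) / best_time(fn, n)
    if ratio < 1.4:
        return "O(1) or O(log n)"
    elif ratio < 2.8:
        return "O(n)"
    return "O(n^2) or worse"
```

Taking the minimum rather than the mean discards runs that were slowed by other processes, which matches the "pick the best case" advice above.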