Directi Interview Question
Country: India
Hold on...how would you implement goodrandom()? Isn't that a function that returns 1 with 1/2 probability and 0 otherwise? So your answer to the question presupposes that you already have something that answers the question. If you had goodrandom(), there would be no reason to use anything else.
That doesn't make sense. Say biasedRandom() returns 0 with probability 0.9. Your method would immediately have a 0.9 prob of returning 0
good soln..
to clear the confusion..
he has made the probabilities of the two mixed outcomes (one 1 and one 0) equal, so he can map one to 0 and the other to 1
p = 0.4 (probability of a 1)
probability of 11 -> 0.4 * 0.4
probability of 00 -> 0.6 * 0.6
probability of 10 -> 0.4 * 0.6
probability of 01 -> 0.6 * 0.4
only 10 and 01 are used, and they are equally likely
call BiasedRandom twice; if the two results differ (their XOR is 1), return the first result, otherwise try again
4 possible outcomes: 00, 01, 10, 11
function fairRandom()
bool r1 = BiasedRandom()
bool r2 = BiasedRandom()
if (r1 XOR r2 == 1)
if (r1 == 0)
return 0;
else
return 1;
else
return fairRandom();
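A minimal runnable sketch of this idea. The class name, the biasedRandom() stand-in, and the bias P = 0.4 are all assumptions made here for illustration; the interview only supplies an abstract biased source:

```java
import java.util.Random;

public class FairFromBiased {
    private static final Random RNG = new Random();
    private static final double P = 0.4; // assumed bias: probability of returning 1

    // Stand-in for the interview's biased source.
    static int biasedRandom() {
        return RNG.nextDouble() < P ? 1 : 0;
    }

    // Draw two bits; 01 and 10 are equally likely (each P*(1-P)),
    // so when they differ, returning the first bit is fair.
    // When they agree, discard both and try again.
    static int fairRandom() {
        while (true) {
            int r1 = biasedRandom();
            int r2 = biasedRandom();
            if (r1 != r2) {
                return r1;
            }
        }
    }

    public static void main(String[] args) {
        int ones = 0, trials = 100000;
        for (int i = 0; i < trials; i++) {
            ones += fairRandom();
        }
        System.out.println((double) ones / trials); // should be close to 0.5
    }
}
```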
Absolutely not. Consider p = .7, 1-p = .3. The probability that exactly one of these events will happen is .42, not .5
This seems to be correct.
00 and 11 are ruled out by the XOR condition.
The probabilities of getting 01 (to return 0) and 10 (to return 1) are the same, so conditioned on the XOR passing, each outcome has probability 0.5.
Doesn't seem wrong to me. Seems logically equivalent to two other solutions on this page.
The insight here is that if some event has probability P, if we run that event twice, there's an equal chance that it happens the first time and doesn't happen the second time as there is that it happens the second time and doesn't happen the first time. Then there's some chance that it happens both or neither times, and in those cases, we just try again, drawing two new random numbers.

It's worth noting that expected runtime is O(1) because each loop has a constant chance of yielding a value, namely 2*P*(1-P). The expected runtime is therefore 1/(2*P*(1-P)) loops. Note that the less biased the source, the faster this algorithm. At P = .99 or P = .01 we need over 50 tries, but at P = .5 we only need 2. At P = 1 or P = 0 this algorithm will never finish, which makes sense because generating random numbers requires entropy and at P = 1 there is none.
public int fairCoin()
{
while (true)
{
boolean roll1 = biasedRandom(), roll2 = biasedRandom();
if (roll1 && !roll2) { return 1; }
if (!roll1 && roll2) { return 0; }
// both rolls equal: discard them and loop again
}
}
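The expected-loop-count claim above can be spot-checked numerically. The formula 1/(2*P*(1-P)) comes straight from the comment; the class and method names here are placeholders:

```java
public class ExpectedLoops {
    // Each loop succeeds with probability 2*P*(1-P) (outcomes 01 or 10),
    // so the expected number of loops is the reciprocal of that.
    static double expectedLoops(double p) {
        return 1.0 / (2.0 * p * (1.0 - p));
    }

    public static void main(String[] args) {
        System.out.println(expectedLoops(0.5));  // 2.0: an unbiased source needs only 2 loops
        System.out.println(expectedLoops(0.99)); // ~50.5: a heavily biased source needs over 50
        System.out.println(expectedLoops(0.01)); // ~50.5: symmetric in P and 1-P
    }
}
```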
This is wrong in your soln, probability of getting 1 is p*(1-p).
Probability of getting 0 is (1-p*(1-p))*p*(1-p) --- you can get 0 only if first condition fails and probability of that happening is 1- p*(1-p)
It's not wrong at all. The probability of getting 1 *on the first loop* is p*(1-p), yes. The probability of getting 0 *on the first loop* is also p*(1-p). Brant's argument doesn't apply because I don't generate new random numbers between the two lines of code that have returns. Consider the following thought experiment: say I had a rand3() method that returned an integer uniformly at random between 0 and 2 inclusive. If I then wrote code like

int x = rand3();
if (x == 0) { return 2; }
if (x == 1) { return 4; }
if (x == 2) { return 6; }

would you then argue that 4 or 6 are any less likely to be returned than 2? Of course not, since the conditions are disjoint.
The solution is very simple. Let me illustrate with an example. Let p = 3/7 (prob[0]), so (1-p) = 4/7. Now I will have this function:

int truelyRandom(double p){
// let us take p = 3/7
int sum0 = 0, sum1 = 0;
// Idea: call biasedRandom in the reverse ratio. p(0) is 3/7 and p(1) is 4/7,
// so I call biasedRandom 7 times in total: 4 of the calls accumulate into sum0
// and 3 into sum1. Then compare sum0 and sum1: if sum0 is greater, return 0,
// else return 1.
for (int i = 1; i <= 4; i++) {
sum0 = sum0 + biasedRandom();
}
for (int i = 1; i <= 3; i++) {
sum1 = sum1 + biasedRandom();
}
if (sum0 > sum1)
return 0;
else
return 1;
}
The solution is as follows:
1. Create one method revBiased() which returns the opposite of biasedRandom().
2. Now create one function goodRandom(): call biased and revBiased; if they are equal, return that value; if not, call goodRandom() recursively. The probability of both being 1 is P(1-P), the same as the probability of both being 0, hence 0 and 1 each come out with probability 0.5.
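A hedged sketch of these two steps. The class name and the biased() stand-in (with an example bias of 0.7) are assumptions; only the revBiased/goodRandom structure comes from the description above:

```java
import java.util.Random;

public class GoodRandom {
    private static final Random RNG = new Random();
    private static final double P = 0.7; // assumed: probability biased() returns 1

    // Stand-in for the given biased source.
    static int biased() {
        return RNG.nextDouble() < P ? 1 : 0;
    }

    // Step 1: return the opposite of an independent biased draw,
    // so this yields 1 with probability 1 - P.
    static int revBiased() {
        return 1 - biased();
    }

    // Step 2: if the two draws agree, return the shared value; otherwise retry.
    // P(both 1) = P*(1-P) = P(both 0), so 0 and 1 are equally likely.
    static int goodRandom() {
        int a = biased();
        int b = revBiased();
        if (a == b) {
            return a;
        }
        return goodRandom(); // recursive retry, as in the description
    }
}
```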
- loveCoding June 22, 2012