Assuming we have an implementation of RANDOM(0,1), how can we
implement RANDOM(a,b)? That is, given a function that returns 0
or 1, each with probability 1/2, how can we implement a function
that returns integers from a to b (b > a), each with probability
1/n, where n = (b - a + 1)?
My solution:
1. We can run RANDOM(0,1) b - a times and sum up all the values
generated. That is what I thought, but someone told me that
it is not correct because the sum follows a binomial distribution,
so the outcomes do not have equally likely chances.
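To see why the sum is not uniform, here is a small check (my own Python, not from the original post) that enumerates every sequence of b - a coin flips for a = 0, b = 2 and counts how often each sum occurs:

```python
from itertools import product
from collections import Counter

# Sum of (b - a) fair coin flips for a = 0, b = 2: enumerate all 2^(b-a)
# equally likely flip sequences and count how often each sum appears.
a, b = 0, 2
counts = Counter(sum(bits) for bits in product((0, 1), repeat=b - a))

# The middle value 1 occurs twice as often as 0 or 2:
# P(0) = 1/4, P(1) = 1/2, P(2) = 1/4 -- binomial, not uniform.
print(dict(counts))
```

So even in this tiny case the sum is biased toward the middle values, confirming the objection.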
2. On some site:
RANDOM(a,b)
    range = b - a
    d = the number of binary bits range has
    do
        for i = 1 to d:
            (i-th bit of integer res) = RANDOM(0,1)
    while (res > range)
    return (a + res)
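For what it's worth, here is a runnable Python sketch of that pseudocode (my own translation; `random.randint(0, 1)` stands in for the assumed RANDOM(0,1) primitive):

```python
import random

def random_ab(a, b):
    """Uniform integer in [a, b] built from fair coin flips (rejection sampling)."""
    rng = b - a                  # we need a uniform value in [0, rng]
    d = rng.bit_length()         # number of binary bits rng has
    while True:
        res = 0
        for _ in range(d):       # assemble a d-bit number, one flip per bit
            res = (res << 1) | random.randint(0, 1)
        if res <= rng:           # accept only in-range values, else retry
            return a + res
```

Each accepted d-bit number is equally likely, so the returned values are uniform over [a, b].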
Each iteration of the while loop takes O(log(b - a)) time, and the
expected number of iterations of the while loop is constant, so it
takes O(log(b - a)) time overall. How can he say that the expected
number of iterations is constant? I'm weak with probability. Please
enlighten me.
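A numeric sanity check of that claim (my own, under the assumption that each attempt draws res uniformly from [0, 2^d - 1] and accepts when res <= range): the acceptance probability is p = (range + 1) / 2^d, and since range >= 2^(d-1) when range has d bits, p > 1/2. The number of attempts is then geometric with mean 1/p < 2, a constant.

```python
# Expected number of do/while iterations for a given range = b - a.
# res is uniform over [0, 2^d - 1]; an attempt succeeds when res <= range,
# so the attempt count is geometric with mean 1 / p.
def expected_iterations(rng):
    d = rng.bit_length()
    p = (rng + 1) / 2**d     # acceptance probability of one attempt
    return 1 / p             # mean of a geometric distribution

# The worst case stays below 2 no matter how large the range gets.
worst = max(expected_iterations(r) for r in range(1, 2000))
print(worst)
```

The worst case occurs just above a power of two (e.g. range = 1024 gives p = 1025/2048, about 1.998 expected iterations), but it never reaches 2.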
--
You received this message because you are subscribed to the Google Groups
Algorithm Geeks group.