Why does this simple shuffle algorithm produce biased results? What is a simple reason?

旧时难觅i 2020-11-27 03:17

It seems that this simple shuffle algorithm will produce biased results:

# suppose $arr is filled with 1 to 52

for ($i = 0; $i < 52; $i++) {
  $j = rand(0, 51);

  // swap $arr[$i] and $arr[$j]
  $tmp = $arr[$j];
  $arr[$j] = $arr[$i];
  $arr[$i] = $tmp;
}

12 Answers
  • 2020-11-27 03:24

    An illustrative approach might be this:

    1) Consider only 3 cards.

    2) For the algorithm to give evenly distributed results, the chance of "1" ending up at a[0] must be 1/3, the chance of "2" ending up at a[1] must be 1/3 too, and so forth.

    3) So if we look at the second algorithm (the correct, Fisher-Yates-style one, which swaps each position only with a position at or after it):

    Probability that "1" ends up at a[0]: this happens when the first random number generated is 0, i.e. 1 case out of (0,1,2), so it is 1 out of 3 = 1/3.

    Probability that "2" ends up at a[1]: it must not get swapped to a[0] the first time (2/3), and not get swapped to a[2] the second time (1/2): 2/3 * 1/2 = 1/3.

    Probability that "3" ends up at a[2]: it must not get swapped to a[0] the first time (2/3), and not get swapped to a[1] the second time (1/2): 2/3 * 1/2 = 1/3.

    They are all exactly 1/3, so we see no bias here.

    4) If we try to calculate the probability of "1" ending up at a[0] in the first algorithm, the calculation will be a bit longer, but as the illustration in lassevk's answer shows, it is 9/27 = 1/3, while "2" ending up at a[1] has a chance of 8/27, and "3" ending up at a[2] has a chance of 9/27 = 1/3.

    As a result, the probability of "2" ending up at a[1] is not 1/3, and therefore the algorithm produces a noticeably skewed result (about a 3.7% error, since 9/27 - 8/27 = 1/27, as opposed to some negligible error such as 3/10000000000000 = 0.00000000003%).

    5) The proof in Joel Coehoorn's answer actually shows that some cases will be over-represented. I think the explanation of why it is n^n is this: at each iteration, there are n possibilities for the random number, so after n iterations there are n^n possible cases, which is 27 for n = 3. This number is not an integer multiple of the number of permutations (n! = 3! = 6) in the case of n = 3, so some results are over-represented. They are over-represented in the sense that instead of showing up 4 times, a permutation shows up 5 times, so if you shuffle the cards millions of times from the initial order of 1 to 52, an over-represented case will show up something like 5 million times as opposed to 4 million times, which is quite a big difference.

    6) So the over-representation is shown, but why does the over-representation happen?

    7) An ultimate test for the algorithm to be correct is that every number has a 1/n probability of ending up in any slot; the brute-force enumeration sketched below checks exactly this for n = 3.
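
    A brute-force way to check points 4) and 7) for n = 3 (my own illustration, not part of the original answer, written in the PHP style of the question) is to enumerate all 27 equally likely sequences of random picks and count where each card lands:

    // Enumerate all 3^3 = 27 equally likely sequences of random picks for the
    // naive shuffle of 3 cards, and count where each card ends up.
    $counts = array();   // $counts[$card][$position] = number of the 27 outcomes

    for ($seq = 0; $seq < 27; $seq++) {
        // the base-3 digits of $seq play the role of the three rand(0, 2) calls
        $picks = array($seq % 3, intdiv($seq, 3) % 3, intdiv($seq, 9) % 3);

        $arr = array(1, 2, 3);
        for ($i = 0; $i < 3; $i++) {          // the naive shuffle from the question
            $j = $picks[$i];
            $tmp = $arr[$j];
            $arr[$j] = $arr[$i];
            $arr[$i] = $tmp;
        }

        foreach ($arr as $pos => $card) {
            $counts[$card][$pos] = ($counts[$card][$pos] ?? 0) + 1;
        }
    }

    // Prints, among other things, 9/27 for card 1 at a[0], 8/27 for card 2 at a[1],
    // and 9/27 for card 3 at a[2], matching points 4) and 7) above.
    ksort($counts);
    foreach ($counts as $card => $byPos) {
        ksort($byPos);
        foreach ($byPos as $pos => $c) {
            echo "card $card at a[$pos]: $c/27\n";
        }
    }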

  • 2020-11-27 03:26

    The best explanation I've seen for this effect was from Jeff Atwood on his CodingHorror blog (The Danger of Naïveté).

    Using this code to simulate a 3-card random shuffle...

    for (int i = 0; i < cards.Length; i++)
    {
        // n is drawn from the full deck every time, not just from the
        // positions at or after i as Fisher-Yates would do
        int n = rand.Next(cards.Length);
        Swap(ref cards[i], ref cards[n]);
    }
    

    ...you get this distribution.

    [Figure: distribution of 3-card shuffle outcomes]

    The shuffle code (above) results in 3^3 (27) possible execution paths, all equally likely. But the mathematics tells us that there are really only 3!, or 6, possible orderings of a 3-card deck. So some of the orderings are over-represented.

    You would need to use a Fisher-Yates shuffle to properly (randomly) shuffle a deck of cards.
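
    As a sketch of what that could look like in the PHP of the original question (assuming $arr holds the 52 cards at indices 0 to 51; this is an illustration, not code from the answer):

    // Fisher-Yates: walk from the end and swap position $i with a random
    // position $j <= $i, so each of the 52! orderings is equally likely
    // (assuming rand() is uniform).
    for ($i = 51; $i > 0; $i--) {
        $j = rand(0, $i);        // note: rand(0, $i), not rand(0, 51)

        $tmp = $arr[$i];
        $arr[$i] = $arr[$j];
        $arr[$j] = $tmp;
    }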

  • 2020-11-27 03:26

    The simple answer is that there are 52^52 possible ways for this algorithm to run, but there are only 52! possible arrangements of 52 cards. For the algorithm to be fair, it needs to produce each of these arrangements with equal probability. 52^52 is not an integer multiple of 52!, so some arrangements must be more likely than others.
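
    A tiny sanity check of the divisibility argument (my own sketch; it uses n = 3 because 52^52 and 52! overflow native integers):

    $n = 3;
    $paths = $n ** $n;                                // 3^3 = 27 equally likely runs
    $perms = 1;
    for ($k = 2; $k <= $n; $k++) { $perms *= $k; }    // 3! = 6 possible arrangements

    // 27 % 6 == 3, so the 27 runs cannot be spread evenly over the 6 arrangements.
    echo "$paths % $perms = " . ($paths % $perms) . "\n";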

  • 2020-11-27 03:27

    Here's another intuition: the single shuffle swap can't create symmetry in the probability of occupying a position unless at least 2-way symmetry already exists. Call the three positions A, B, and C. Now let a be the probability of card 2 being in position A, b be the probability of card 2 being in position B, and c be the probability of it being in position C, prior to a swap move. Assume that no two probabilities are the same: a!=b, b!=c, c!=a. Now compute the probabilities a', b', and c' of the card being in these three positions following a swap. Let's say that this swap move consists of position C being swapped with one of the three positions at random. Then:

    a' = a*2/3 + c*1/3
    b' = b*2/3 + c*1/3
    c' = 1/3.
    

    That is, the probability that the card winds up in position A is the probability it was already there times 2/3 (the chance that position A isn't involved in the swap), plus the probability that it was in position C times the 1/3 probability that C swapped with A, and so on. Now, subtracting the first two equations, we get:

    a' - b' = (a - b)*2/3
    

    which means that because we assumed a!=b, then a'!=b' (though the difference will approach 0 over time, given enough swaps). But since a'+b'+c'=1 and c'=1/3, we have a'+b'=2/3; if either a' or b' were equal to 1/3, the other would be too, contradicting a'!=b'. So neither of them can equal c'. Thus if the three probabilities start off all different before a swap, they will also all be different after the swap. And this holds no matter which position is swapped; we just interchange the roles of the variables in the above.

    Now the algorithm's very first move swaps position A (which holds card 1) with one of the three positions chosen at random, possibly A itself. In this case, there was two-way symmetry before the swap, because the probability of card 1 in position B = probability of card 1 in position C = 0. So card 1 can indeed wind up with symmetric probabilities, and it does end up in each of the three positions with equal probability. This remains true for all subsequent swaps. But card 2 winds up in the three positions after the first swap with probability (1/3, 2/3, 0), and likewise card 3 winds up in the three positions with probability (1/3, 0, 2/3). So no matter how many subsequent swaps we do, we will never wind up with card 2 or card 3 having exactly the same probability of occupying all three positions.
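
    To make this concrete, here is a small sketch (my own, not part of the answer) that pushes card 2's exact position probabilities through the three swaps of the naive 3-card shuffle using the transition rule above; it prints (1/3, 2/3, 0) after the first swap and ends at roughly (0.370, 0.296, 0.333), i.e. (10/27, 8/27, 9/27), never the uniform (1/3, 1/3, 1/3):

    $n = 3;
    $p = array(0.0, 1.0, 0.0);          // card 2 starts in position B (index 1)

    for ($k = 0; $k < $n; $k++) {       // iteration $k swaps position $k with a random position
        $new = array();
        for ($m = 0; $m < $n; $m++) {
            if ($m == $k) {
                // position $k ends up holding whichever card the random pick lands on: 1/n
                $new[$m] = 1.0 / $n;
            } else {
                // stays at $m unless the random pick is $m (prob (n-1)/n),
                // or arrives from position $k (prob 1/n)
                $new[$m] = $p[$m] * ($n - 1) / $n + $p[$k] / $n;
            }
        }
        $p = $new;
        printf("after swap at position %d: %.3f %.3f %.3f\n", $k, $p[0], $p[1], $p[2]);
    }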

  • 2020-11-27 03:27

    The clearest way to see that the first algorithm fails is to view it as a Markov chain of n steps on the graph whose n! vertices are all the permutations of n natural numbers. The algorithm hops from one vertex to another with some transition probability; the first algorithm gives a transition probability of 1/n for each hop. There are n^n paths, each with probability 1/n^n. Suppose the final probability of landing on each vertex is 1/n!. To achieve that, there must be m paths ending at each vertex such that m/n^n = 1/n!, i.e. n^n = m * n! for some natural number m, which means n^n is divisible by n!. But that is impossible for n > 2: n! contains the factor n-1, so n-1 would have to divide n^n; yet n and n-1 are coprime, so n-1 can divide n^n only when n-1 = 1, i.e. n = 2. We have a contradiction.

  • 2020-11-27 03:29

    From your comments on the other answers, it seems that you are looking not just for an explanation of why the distribution is not the uniform distribution (for which the divisibility answer is a simple one) but also an "intuitive" explanation of why it is actually far from uniform.

    Here's one way of looking at it. Suppose you start with the initial array [1, 2, ..., n] (where n might be 3, or 52, or whatever) and apply one of the two algorithms. If all permutations are uniformly likely, then the probability that 1 remains in the first position should be 1/n. And indeed, in the second (correct) algorithm, it is 1/n, as 1 stays in its place if and only if it is not swapped the first time, i.e. iff the initial call to rand(0,n-1) returns 0.
    However, in the first (wrong) algorithm, 1 remains untouched only if it is neither swapped the first time nor any other time — i.e., only if the first rand returns 0 and none of the other rands returns 0, the probability of which is (1/n) * (1-1/n)^(n-1) ≈ 1/(ne) ≈ 0.37/n, not 1/n.
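
    As a rough numeric check of that figure (my own sketch, not part of the answer), one can compute both quantities for n = 52:

    $n = 52;
    $naive   = (1 / $n) * pow(1 - 1 / $n, $n - 1);  // P(1 is never touched) in the naive shuffle
    $uniform = 1 / $n;                              // what a fair shuffle gives for P(1 stays at a[0])

    // prints roughly 0.0071 vs 0.0192, i.e. about 0.37/n versus 1/n
    printf("naive: %.4f   uniform: %.4f\n", $naive, $uniform);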

    And that's the "intuitive" explanation: in your first algorithm, earlier items are much more likely to be swapped out of place than later items, so the permutations you get are skewed towards patterns in which the early items are not in their original places.

    (It's a bit more subtle than that, e.g. 1 can get swapped into a later position and still end up getting swapped back into place through a complicated series of swaps, but those probabilities are relatively less significant.)
