Is the Random function (in any programming language) biased?

陌清茗 2021-01-27 19:27

Is the Random function in programming languages biased? After all, the algorithms have to be based on something, and that something could introduce bias. According to this website https:

2 Answers
  • 2021-01-27 19:53

    By definition, functions map a given input to a given output. For a pseudorandom generator, that means it maps a given "seed" to a given sequence of random-looking numbers. For such a generator to even begin to generate "random" numbers, the seed itself has to contain some randomness. There are many possible sources for such a seed (two of them are sketched in code below), including—

    • high resolution timestamps,
    • timings of input devices,
    • thermal noise,
    • atmospheric noise, and
    • combinations of two or more of the sources above.

    Also, in general, the longer the seed is, the greater the variety of "random" sequences a pseudorandom generator can produce.
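
    A minimal sketch in C of two of these seeding strategies (assuming a Unix-like system; clock_gettime and /dev/urandom are POSIX/Unix facilities, and the kernel's entropy pool itself mixes sources like the ones listed above):

        #include <stdio.h>
        #include <stdlib.h>
        #include <time.h>

        int main(void) {
            /* Seed 1: a high-resolution timestamp. Cheap and convenient,
               but guessable -- never rely on this for security. */
            struct timespec ts;
            clock_gettime(CLOCK_REALTIME, &ts);
            srand((unsigned)(ts.tv_sec ^ ts.tv_nsec));
            printf("timestamp-seeded: %d\n", rand());

            /* Seed 2: the OS entropy pool (/dev/urandom on Linux/macOS),
               which the kernel fills from input timings, hardware noise,
               and similar sources. */
            unsigned seed;
            FILE *f = fopen("/dev/urandom", "rb");
            if (f != NULL && fread(&seed, sizeof seed, 1, f) == 1) {
                srand(seed);
                printf("entropy-seeded: %d\n", rand());
            }
            if (f != NULL)
                fclose(f);
            return 0;
        }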


    Different pseudorandom number generators (PRNGs) have different qualities. If a particular PRNG is itself "bad", no seed selection strategy can make it "better". The choice of random number generator (RNG) will depend on what kind of application will use the random numbers, and you didn't really specify what kind of application you have in mind:

    • If the random numbers are intended to further information security in any way (e.g., they serve as random encryption keys, nonces, or passwords), then only a cryptographic RNG will do.
    • If the "random" numbers have to be reproducible at a later time, then a seeded PRNG of high quality has to be used. There are several good and bad choices for such a PRNG. In this sense, the rand function in C uses an unspecified algorithm, which hampers the goal of reproducible "randomness".
  • 2021-01-27 20:01

    Historically, the pseudo-random number functions in most programming languages have been bad. Old algorithms running on deterministic machines produced less than perfect results.

    But things are changing. Most modern microprocessors have hardware entropy instructions (for example, RDRAND on x86), and demanding applications like online banking have driven the development of better algorithms. It depends entirely on the OS, language, and library you have in mind. There are very good options, but you have to know what they are, because the bad options are still around.

    Something like C's rand() is probably the worst. Getting bytes from /dev/random on Linux (or macOS) is very good. Cryptography libraries have good algorithms. It also depends on the application: for cryptography you need very high-quality random numbers, while for something like Monte Carlo integration you need lots of numbers quickly but not necessarily perfect entropy. Something like a fast PRNG seeded from /dev/random would be just fine, as sketched below.
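
    A sketch of that last pattern in C (assuming a Unix-like OS; xorshift64 stands in for any fast, reasonable-quality PRNG): seed once from the entropy pool, then draw millions of samples cheaply to estimate pi by Monte Carlo:

        #include <stdint.h>
        #include <stdio.h>

        static uint64_t s;  /* xorshift64 state; must stay nonzero */

        static uint64_t xorshift64(void) {
            s ^= s << 13;
            s ^= s >> 7;
            s ^= s << 17;
            return s;
        }

        /* Top 53 bits of the output mapped to a double in [0, 1). */
        static double uniform01(void) {
            return (double)(xorshift64() >> 11) / 9007199254740992.0;  /* 2^53 */
        }

        int main(void) {
            /* Seed the fast PRNG once from the OS entropy pool. */
            FILE *f = fopen("/dev/urandom", "rb");
            if (f == NULL || fread(&s, sizeof s, 1, f) != 1 || s == 0)
                s = 0x9E3779B97F4A7C15ULL;  /* arbitrary nonzero fallback */
            if (f != NULL)
                fclose(f);

            /* Estimate pi: the fraction of random points in the unit
               square that land inside the quarter circle, times 4. */
            long hits = 0;
            const long n = 10000000;
            for (long i = 0; i < n; i++) {
                double x = uniform01(), y = uniform01();
                if (x * x + y * y < 1.0)
                    hits++;
            }
            printf("pi ~= %f\n", 4.0 * (double)hits / (double)n);
            return 0;
        }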
