Using random numbers with GPUs

花落未央 · 2021-02-03 12:30

I'm investigating using NVIDIA GPUs for Monte-Carlo simulations. However, I would like to use the GSL random number generators and also a parallel random number generator such as …

7 Answers
  • 2021-02-03 12:40

    I've just found that NAG provides some RNG routines. These libraries are free for academics.

  • 2021-02-03 12:44

    You will have to implement them yourself.

  • 2021-02-03 12:47

    The GSL manual recommends the Mersenne Twister.

    The Mersenne Twister's authors provide a version for NVIDIA GPUs. I looked into porting it to the R package gputools, but found that I needed an excessively large number of draws (millions, I think) before the combination of 'generate on the GPU and make the results available to R' was faster than just drawing in R (using only the CPU).

    It really is a computation / communication tradeoff.
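
    The thread doesn't include code, but a minimal sketch of the 'generate on the GPU, copy back to the host' pattern looks like the following. It uses NVIDIA's cuRAND host API, which is not mentioned in this answer; it serves here purely as one readily available illustration of the pattern.

    ```c
    // Sketch only: generate uniforms on the GPU with cuRAND's host API,
    // then copy them back to the CPU. The cudaMemcpy is the
    // "make available to the host" step whose cost the answer describes.
    #include <stdio.h>
    #include <stdlib.h>
    #include <cuda_runtime.h>
    #include <curand.h>

    int main(void) {
        const size_t n = 1 << 20;   /* number of draws */
        float *d_draws, *h_draws;

        cudaMalloc((void **)&d_draws, n * sizeof(float));
        h_draws = (float *)malloc(n * sizeof(float));

        curandGenerator_t gen;
        curandCreateGenerator(&gen, CURAND_RNG_PSEUDO_DEFAULT); /* XORWOW */
        curandSetPseudoRandomGeneratorSeed(gen, 1234ULL);

        curandGenerateUniform(gen, d_draws, n); /* fast: stays on the device */

        /* The device-to-host copy is the communication side of the
           tradeoff; for small n it can dwarf the generation time. */
        cudaMemcpy(h_draws, d_draws, n * sizeof(float), cudaMemcpyDeviceToHost);

        printf("first draw: %f\n", h_draws[0]);

        curandDestroyGenerator(gen);
        cudaFree(d_draws);
        free(h_draws);
        return 0;
    }
    ```

    The fixed overhead of that final copy is consistent with the answer's observation that small batches of draws don't pay off.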

  • 2021-02-03 12:52

    Massively parallel random number generation, as you need it for GPUs, is a difficult problem and an active research topic. You not only have to pick a good sequential generator (those you can find in the literature) but also something that guarantees the parallel streams are independent. Pairwise independence is not sufficient for a good Monte Carlo simulation. AFAIK there is no good public domain code available.
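
    For illustration only, here is a sketch of one common way to get per-thread streams, using cuRAND's device API (not mentioned in this answer): every thread seeds the same generator but takes a distinct subsequence. That guarantees non-overlapping streams, though not the stronger statistical independence this answer is asking for.

    ```c
    // Sketch only: per-thread streams via cuRAND's device API.
    // Each thread uses the same seed with subsequence = thread id,
    // so the streams are non-overlapping partitions of one XORWOW sequence.
    #include <cuda_runtime.h>
    #include <curand_kernel.h>

    __global__ void init_states(curandState *states, unsigned long long seed) {
        int id = blockIdx.x * blockDim.x + threadIdx.x;
        /* same seed, subsequence = thread id, offset = 0 */
        curand_init(seed, id, 0, &states[id]);
    }

    __global__ void draw_uniforms(curandState *states, float *out, int n) {
        int id = blockIdx.x * blockDim.x + threadIdx.x;
        if (id < n) {
            curandState local = states[id];   /* work on a register copy */
            out[id] = curand_uniform(&local); /* one U(0,1] draw per thread */
            states[id] = local;               /* save state for later kernels */
        }
    }

    int main(void) {
        const int n = 256 * 64;
        curandState *d_states;
        float *d_out;
        cudaMalloc((void **)&d_states, n * sizeof(curandState));
        cudaMalloc((void **)&d_out, n * sizeof(float));
        init_states<<<64, 256>>>(d_states, 42ULL);
        draw_uniforms<<<64, 256>>>(d_states, d_out, n);
        cudaDeviceSynchronize();
        cudaFree(d_states);
        cudaFree(d_out);
        return 0;
    }
    ```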

  • 2021-02-03 12:55

    Use the Mersenne Twister PRNG, as provided in the CUDA SDK.
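
    The answer points at the CUDA SDK sample, which isn't reproduced here; in current toolkits a GPU-adapted Mersenne Twister (MTGP32) is also exposed through cuRAND. A minimal sketch using that route (an alternative, not the SDK sample itself):

    ```c
    // Sketch: the GPU-adapted Mersenne Twister (MTGP32) via cuRAND's
    // host API, filling a device buffer with standard-normal doubles,
    // the kind of draws a Monte-Carlo simulation typically wants.
    #include <cuda_runtime.h>
    #include <curand.h>

    int main(void) {
        const size_t n = 1 << 20;
        double *d_normals;
        cudaMalloc((void **)&d_normals, n * sizeof(double));

        curandGenerator_t gen;
        curandCreateGenerator(&gen, CURAND_RNG_PSEUDO_MTGP32);
        curandSetPseudoRandomGeneratorSeed(gen, 2021ULL);
        curandGenerateNormalDouble(gen, d_normals, n, 0.0, 1.0); /* mean 0, sd 1 */

        curandDestroyGenerator(gen);
        cudaFree(d_normals);
        return 0;
    }
    ```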

  • 2021-02-03 12:58

    Here we use Sobol sequences on the GPUs.
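
    The answer doesn't say which implementation; one readily available route (an assumption, not necessarily what the poster used) is cuRAND's quasi-random Sobol generator:

    ```c
    // Sketch: a 3-dimensional Sobol sequence via cuRAND's quasi-random
    // generator. Points are generated dimension-by-dimension, so the
    // output buffer holds n * dims values.
    #include <cuda_runtime.h>
    #include <curand.h>

    int main(void) {
        const size_t n = 1 << 16;    /* points */
        const unsigned int dims = 3; /* dimensions per point */
        float *d_points;
        cudaMalloc((void **)&d_points, n * dims * sizeof(float));

        curandGenerator_t gen;
        curandCreateGenerator(&gen, CURAND_RNG_QUASI_SOBOL32);
        curandSetQuasiRandomGeneratorDimensions(gen, dims);
        curandGenerateUniform(gen, d_points, n * dims);

        curandDestroyGenerator(gen);
        cudaFree(d_points);
        return 0;
    }
    ```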
