I'm trying to implement weighted random numbers. I'm currently just banging my head against the wall and cannot figure this out.
In my project (Hold'em hand-ran
If your weights change more slowly than they are drawn, the C++11 discrete_distribution is going to be the easiest option:
#include <random>
#include <vector>
#include <ctime>

std::vector<double> weights{90, 56, 4};
std::discrete_distribution<int> dist(std::begin(weights), std::end(weights));
std::mt19937 gen;
gen.seed(time(0)); // if you want different results from different runs

int N = 100000;
std::vector<int> samples(N);
for (auto &i : samples)
    i = dist(gen);
//do something with your samples...
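As a quick sanity check (my own addition, not part of the original snippet), you can tally the draws and compare the observed frequencies with the normalized weights: 90/150 = 0.60, 56/150 ≈ 0.37, and 4/150 ≈ 0.03. This assumes the weights, samples, and N from above plus #include <iostream>:

std::vector<int> counts(weights.size(), 0);
for (int s : samples)
    ++counts[s];                        // each sample is an index into weights
for (std::size_t k = 0; k < counts.size(); ++k)
    std::cout << "outcome " << k << ": "
              << static_cast<double>(counts[k]) / N << '\n';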
Note, however, that the C++11 discrete_distribution
computes all of the cumulative sums on initialization. Usually that is what you want, because it speeds up sampling in exchange for a one-time O(N) cost. But for a rapidly changing distribution it incurs a heavy calculation (and memory) cost. For instance, if the weights represent how many items are left and you remove an item every time you draw one, you will probably want a custom algorithm.
Will's answer (https://stackoverflow.com/a/1761646/837451) avoids this overhead, but it will be slower to draw from than the C++11 version because it cannot use binary search.
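To make the trade-off concrete, here is a rough sketch (my own illustration, not code from either answer) of the linear-scan approach: nothing is precomputed, so the weights can be mutated freely between calls, but each draw costs O(N) rather than O(log N):

#include <cstddef>
#include <random>
#include <vector>

// Draws an index i with probability proportional to weights[i],
// scanning the raw weights so they may change between calls.
int draw_linear(const std::vector<double>& weights, std::mt19937& gen)
{
    double total = 0.0;
    for (double w : weights)
        total += w;
    std::uniform_real_distribution<double> u(0.0, total);
    double r = u(gen);
    for (std::size_t i = 0; i < weights.size(); ++i)
    {
        if (r < weights[i])
            return static_cast<int>(i);
        r -= weights[i];
    }
    return static_cast<int>(weights.size()) - 1; // guard against rounding
}

For the draw-and-remove case above you would simply decrement weights[i] after each call; there is nothing to rebuild.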
To confirm that it does this precomputation, you can look at the relevant lines (/usr/include/c++/5/bits/random.tcc on my Ubuntu 16.04 + GCC 5.3 install):
template<typename _IntType>
  void
  discrete_distribution<_IntType>::param_type::
  _M_initialize()
  {
    if (_M_prob.size() < 2)
      {
        _M_prob.clear();
        return;
      }

    const double __sum = std::accumulate(_M_prob.begin(),
                                         _M_prob.end(), 0.0);
    // Now normalize the probabilites.
    __detail::__normalize(_M_prob.begin(), _M_prob.end(), _M_prob.begin(),
                          __sum);
    // Accumulate partial sums.
    _M_cp.reserve(_M_prob.size());
    std::partial_sum(_M_prob.begin(), _M_prob.end(),
                     std::back_inserter(_M_cp));
    // Make sure the last cumulative probability is one.
    _M_cp[_M_cp.size() - 1] = 1.0;
  }
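In practice this means that whenever the weights change you pay that initialization cost again, since updating an existing discrete_distribution means giving it a new param_type. A minimal sketch, assuming the dist, gen, and weights from the first snippet:

// Rebuilding the parameter set re-runs _M_initialize
// (normalization + partial sums) for the new weights.
weights[2] += 10;
std::discrete_distribution<int>::param_type p(weights.begin(), weights.end());
dist.param(p);          // or draw once with dist(gen, p)
int next = dist(gen);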