Using RNDADDENTROPY to add entropy to /dev/random


I am using a hardware RNG to stock my entropy pool. My struct is a static size and looks like this (my kernel has a slightly different random.h; just copy what you find in yours and increase the array size to whatever you want):

#define BUFSIZE 256
/* WARNING - this struct must match random.h's struct rand_pool_info */
typedef struct {
    int bit_count;               /* number of bits of entropy in data */
    int byte_count;              /* number of bytes of data in array */
    unsigned char buf[BUFSIZE];  /* entropy data (BUFSIZE above, not stdio's BUFSIZ) */
} entropy_t;

Whatever you pass in buf will be hashed in and will stir the entropy pool. If you only ever read from /dev/urandom, it does not matter much what you pass for bit_count: /dev/urandom never blocks, so even an entropy estimate of zero just lets it keep on going.
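For illustration, a minimal sketch of feeding such a buffer to the kernel could look like the code below, assuming the entropy_t struct above; the add_entropy name and the bits_per_byte parameter are just placeholders I chose here, and bits_per_byte corresponds to the per-byte quality estimate discussed next. The ioctl itself requires CAP_SYS_ADMIN (typically root).

#include <fcntl.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/random.h>   /* RNDADDENTROPY */

/* Sketch: push len bytes of data into the kernel entropy pool,
 * crediting bits_per_byte bits of entropy for every byte supplied. */
static int add_entropy(const unsigned char *data, int len, int bits_per_byte)
{
    entropy_t e;
    int fd, ret;

    if (len > (int)sizeof(e.buf))
        len = sizeof(e.buf);

    e.bit_count  = len * bits_per_byte;  /* entropy estimate, in bits */
    e.byte_count = len;
    memcpy(e.buf, data, len);

    fd = open("/dev/random", O_WRONLY);
    if (fd < 0)
        return -1;

    ret = ioctl(fd, RNDADDENTROPY, &e);  /* stirs the pool and credits entropy */
    close(fd);
    return ret;
}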

What bit_count does is push out the point at which /dev/random will block and wait for more entropy from a physical RNG source, so it's okay to guesstimate bit_count. If you guess low, the worst that happens is that /dev/random blocks sooner than it otherwise would have. If you guess high, /dev/random behaves like /dev/urandom for a little longer than it otherwise would have before it blocks.

You can guesstimate based on the "quality" of your entropy source. If it's low, like characters typed by humans, you can credit 1 or 2 bits per byte. If it's high, like values read from a dedicated hardware RNG, you can credit 8 bits per byte.

If your data is perfectly random, then I believe it would be appropriate for entropy_count (the field called bit_count above) to be the number of bits in the buffer you provide. However, many (most?) sources of randomness aren't perfect, so it makes sense to keep the buffer size and the amount of entropy as separate parameters.

In the kernel's own struct rand_pool_info, buf is declared with size zero, which is a standard C idiom. The deal is that when you actually allocate a rand_pool_info, you do malloc(sizeof(struct rand_pool_info) + size_of_desired_buf), and then you refer to the buffer using the buf member. Note: in C99 and later you can declare it as buf[] (a flexible array member) instead of buf[0] to be explicit that buf is "stretchy".
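To make the idiom concrete, here is a sketch against the kernel's struct rand_pool_info; the alloc_pool_info name is just an example chosen for illustration.

#include <stdlib.h>
#include <string.h>
#include <linux/random.h>   /* struct rand_pool_info */

/* Allocate a rand_pool_info with room for len bytes of data after the header. */
static struct rand_pool_info *alloc_pool_info(const unsigned char *data,
                                              size_t len, int entropy_bits)
{
    struct rand_pool_info *p = malloc(sizeof(*p) + len);
    if (!p)
        return NULL;
    p->entropy_count = entropy_bits;   /* the kernel's name for the bit count */
    p->buf_size      = (int)len;       /* bytes of data in buf */
    memcpy(p->buf, data, len);         /* the "stretchy" part past the header */
    return p;
}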

The number of bytes you have in the buffer correlates with the entropy of the data, but the entropy cannot be calculated from the data or its length alone.

Sure, if the data came from a good, unpredictable and uniformly distributed hardware random noise generator, the entropy (in bits) is 8 times the size of the buffer (in bytes).

But if the bits are not uniformly distributed, or are somehow predictable, the entropy is lower.

See https://en.wikipedia.org/wiki/Entropy_(information_theory)
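If you want a rough feel for this, a simple Shannon-entropy estimate over byte frequencies shows the point: a uniform source approaches 8 bits per byte, a biased one gives less. The function below is only an illustration I am adding here, not a proper entropy test, since it measures the byte distribution but says nothing about predictability.

#include <math.h>
#include <stddef.h>

/* Estimate Shannon entropy in bits per byte from the byte-value histogram. */
static double shannon_bits_per_byte(const unsigned char *buf, size_t len)
{
    size_t counts[256] = {0};
    double h = 0.0;

    if (len == 0)
        return 0.0;
    for (size_t i = 0; i < len; i++)
        counts[buf[i]]++;
    for (int v = 0; v < 256; v++) {
        if (counts[v]) {
            double p = (double)counts[v] / (double)len;
            h -= p * log2(p);
        }
    }
    return h;   /* ranges from 0.0 up to 8.0 */
}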

I hope that helps.
