entropy

Is it possible to generate random numbers using physical sensors?

好久不见 · Posted on 2019-12-18 11:02:50
Question: I've heard about people using light sensors, Geiger counters, and other physical sensors to generate random numbers, but I'm skeptical. Is there really a way to generate random numbers from measurements of the physical world (using an Arduino or any other microcontroller)? If so, would these numbers ever be truly random? To clarify: the question is about the feasibility of using microcontroller-gathered data to generate random numbers that could be applied soundly to cryptography, an…
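
A classic first step with a biased physical source is von Neumann debiasing. Below is a minimal sketch, not production code: read_sensor_bit() is a hypothetical stand-in for sampling the least significant bit of an ADC reading, and a real design would still need min-entropy estimation and a cryptographic extractor before any key material is derived.

    def read_sensor_bit():
        # Hypothetical sensor read; on an Arduino this might be analogRead(pin) & 1.
        # Simulated here as a deliberately biased bit source for the demo.
        import random
        return random.random() < 0.7

    def von_neumann_bits(n):
        """Turn biased but independent bits into unbiased bits.

        Pairs are mapped 01 -> 0 and 10 -> 1; 00 and 11 are discarded,
        which removes any fixed bias at the cost of throughput.
        """
        out = []
        while len(out) < n:
            a, b = read_sensor_bit(), read_sensor_bit()
            if a != b:
                out.append(int(a))
        return out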

Is there a built-in KL divergence loss function in TensorFlow?

天大地大妈咪最大 · Posted on 2019-12-17 22:54:55
Question: I have two tensors, prob_a and prob_b, with shape [None, 1000], and I want to compute the KL divergence from prob_a to prob_b. Is there a built-in function for this in TensorFlow? I tried tf.contrib.distributions.kl(prob_a, prob_b), but it gives: NotImplementedError: No KL(dist_a || dist_b) registered for dist_a type Tensor and dist_b type Tensor. If there is no built-in function, what would be a good workaround? Answer 1: Assuming that your input tensors prob_a and prob_b are probability…
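
The error arises because tf.contrib.distributions.kl dispatches on registered Distribution objects, not raw tensors. A minimal workaround sketch, assuming prob_a and prob_b are valid probability distributions along axis 1 (the epsilon clip is a defensive assumption to avoid log(0)):

    import tensorflow as tf

    def kl_divergence(prob_a, prob_b, eps=1e-7):
        # KL(a || b) = sum_i a_i * log(a_i / b_i), computed row-wise.
        a = tf.clip_by_value(prob_a, eps, 1.0)
        b = tf.clip_by_value(prob_b, eps, 1.0)
        return tf.reduce_sum(a * tf.math.log(a / b), axis=1)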

Fastest way to compute entropy in Python

ⅰ亾dé卋堺 · Posted on 2019-12-17 10:19:35
Question: In my project I need to compute the entropy of 0-1 vectors many times. Here's my code:

    def entropy(labels):
        """Computes entropy of 0-1 vector."""
        n_labels = len(labels)
        if n_labels <= 1:
            return 0
        counts = np.bincount(labels)
        probs = counts[np.nonzero(counts)] / n_labels
        n_classes = len(probs)
        if n_classes <= 1:
            return 0
        return -np.sum(probs * np.log(probs)) / np.log(n_classes)

Is there a faster way? Answer 1: @Sanjeet Gupta's answer is good but could be condensed. This question is specifically…
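
For reference, the counting-plus-entropy pattern condenses to a few lines with SciPy; a minimal sketch (note it returns raw entropy, whereas the code above additionally normalizes by log(n_classes)):

    import numpy as np
    from scipy.stats import entropy as scipy_entropy

    def entropy_fast(labels, base=None):
        """Entropy of an integer label vector.

        scipy.stats.entropy normalizes the counts to probabilities itself
        and treats zero counts as contributing nothing (0 * log 0 = 0).
        """
        counts = np.bincount(np.asarray(labels))
        return scipy_entropy(counts, base=base)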

Implementation of the Theil inequality index in Python

拈花ヽ惹草 · Posted on 2019-12-13 13:38:03
Question: I am trying to implement Theil's index (http://en.wikipedia.org/wiki/Theil_index) in Python to measure inequality of revenue in a list. The formula is essentially Shannon's entropy, so it involves a log. My problem is that a few revenues in my list are 0, and log(0) breaks the formula. I believe adding a tiny float to 0 wouldn't work, since log(tinyFloat) = -inf, and that would throw the index off. [EDIT] Here's a snippet (taken from another, much cleaner and freely available,…
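
The standard resolution is the convention 0 · log 0 = 0: zero revenues contribute nothing to the sum, but they still count toward N and the mean. A minimal sketch of the Theil T index under that convention (an illustration, not the asker's snippet):

    import numpy as np

    def theil_t(revenues):
        """Theil T index: (1/N) * sum_i (x_i / mu) * ln(x_i / mu).

        Terms with x_i == 0 vanish in the limit x * log(x) -> 0, so they
        are skipped in the sum but kept in N and in the mean mu.
        """
        x = np.asarray(revenues, dtype=float)
        n, mu = len(x), x.mean()
        pos = x[x > 0]
        return np.sum((pos / mu) * np.log(pos / mu)) / n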

What entropy sources are available on Windows?

孤人 · Posted on 2019-12-13 13:15:35
Question: I want to produce a random cryptographic key on Windows. Where can I obtain entropy? I would like my entropy function to work without a network connection and to be reliable on Windows 2000 and upwards. Even sources that may or may not provide a small amount of entropy could be useful, as all the sources will be pooled. This is my initial list of functions: GetCurrentProcessID, GetCurrentThreadID, GetTickCount, GetLocalTime, QueryPerformanceCounter, GlobalMemoryStatus, GetDiskFreeSpace,…
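
The usual pattern is exactly what the question hints at: pool many weak sources through a cryptographic hash, and prefer the OS CSPRNG (CryptGenRandom on that era of Windows, exposed as os.urandom in Python) whenever it is available. A rough Python analogue of the pooling step, with the Win32 functions mapped to loose equivalents (an illustrative assumption, not a Win32 implementation):

    import hashlib, os, time

    def pooled_seed():
        """Hash several weak, partly independent sources into one 256-bit seed."""
        h = hashlib.sha256()
        h.update(os.getpid().to_bytes(8, "little"))             # ~ GetCurrentProcessID
        h.update(time.perf_counter_ns().to_bytes(8, "little"))  # ~ QueryPerformanceCounter
        h.update(time.time_ns().to_bytes(8, "little"))          # ~ GetLocalTime / GetTickCount
        h.update(os.urandom(32))                                # OS CSPRNG, if available
        return h.digest()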

Joint Entropy of audio files

筅森魡賤 · Posted on 2019-12-13 01:06:55
Question: After trying a heavy cycle-based function for calculating the joint entropy of two sources of information, I found the useful MATLAB function accumarray and tried the following code:

    function e = jointEntropy(fonte1, fonte2)
        i = double(fonte1(:)) + 1;
        j = double(fonte2(:)) + 1;
        subs = [i j];
        f = accumarray(subs, ones(length(fonte1), 1));
        p = f / length(fonte1);
        freq = f ~= 0;
        prob = p(freq);
        e = -sum(prob .* log2(prob));
    end

where fonte1 and fonte2 are the information sources, 1xN…
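
For a cross-check outside MATLAB, joint entropy is just pair counting; a minimal NumPy sketch, assuming the two sequences hold nonnegative integers (the same assumption the accumarray version makes via its +1 index shift):

    import numpy as np

    def joint_entropy(a, b):
        """Joint entropy H(A, B) in bits of two equal-length integer sequences."""
        a, b = np.asarray(a).ravel(), np.asarray(b).ravel()
        # Encode each (a, b) pair as one integer, then count occurrences.
        pairs = a * (b.max() + 1) + b
        p = np.bincount(pairs)
        p = p[p > 0] / len(a)
        return -np.sum(p * np.log2(p))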

Getting Linux to buffer /dev/random

喜夏-厌秋 · Posted on 2019-12-12 17:01:09
Question: I need a reasonable supply of high-quality random data for an application I'm writing. Linux provides the /dev/random file for this purpose, which is ideal; however, because my server is a single-service virtual machine, it has very limited sources of entropy, meaning /dev/random quickly becomes exhausted. I've noticed that if I read from /dev/random, I only get 16 or so random bytes before the device blocks while it waits for more entropy: [duke@poopz ~]# hexdump /dev/random 0000000 f4d3…
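
One user-space workaround is to drain the blocking device from a background thread into a bounded buffer, so reads happen continuously rather than in bursts; a minimal sketch below. (In practice the usual answers are an entropy daemon such as rngd or haveged, or simply reading /dev/urandom, which does not block.)

    import threading, queue

    def start_entropy_pump(buf_bytes=4096, path="/dev/random"):
        """Continuously read a blocking device into a bounded byte queue."""
        q = queue.Queue(maxsize=buf_bytes)

        def pump():
            with open(path, "rb", buffering=0) as dev:
                while True:
                    chunk = dev.read(16)   # blocks while the kernel pool refills
                    for byte in chunk:
                        q.put(byte)        # back-pressure once the buffer is full

        threading.Thread(target=pump, daemon=True).start()
        return q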

How (if at all) does a predictable random number generator get more secure after SHA-1ing its output?

徘徊边缘 · Posted on 2019-12-12 08:29:22
Question: This article states that despite the fact that the Mersenne Twister is an extremely good pseudo-random number generator, it is not cryptographically secure by itself, for a very simple reason: it is possible to determine all future states of the generator from the state it has at any given time, and either 624 32-bit outputs or 19,937 one-bit outputs are sufficient to recover that state. Using a cryptographically secure hash function, such as SHA-1, on the output of the Mersenne…
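
The construction being asked about is easy to write down; a minimal sketch with Python's built-in Mersenne Twister. The hash hides the raw outputs, so the 624-output state-recovery attack no longer applies directly, but hashing adds no entropy: anyone who learns the seed or internal state can still reproduce every "secure" output.

    import hashlib, random

    mt = random.Random(1234)  # CPython's random module is a Mersenne Twister

    def hashed_output():
        """SHA-1 over a large block of MT output instead of the raw stream."""
        raw = mt.getrandbits(19937).to_bytes(2493, "big")  # ~ one state's worth
        return hashlib.sha1(raw).digest()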

Weighted Decision Trees using Entropy

一个人想着一个人 · Posted on 2019-12-12 07:56:55
Question: I'm building a binary classification tree using mutual-information gain as the splitting function. But since the training data is skewed toward a few classes, it is advisable to weight each training example by the inverse class frequency. How do I weight the training data? When calculating the probabilities to estimate the entropy, do I take weighted averages? EDIT: I'd like an expression for entropy with the weights. Answer 1: State-value weighted entropy as a measure of investment risk. http:/…
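
One concrete reading of "weighted averages": when estimating the class probabilities, let each example contribute its weight instead of a count of one. A minimal NumPy sketch, with inverse-class-frequency weights as the question suggests:

    import numpy as np

    def weighted_entropy(labels, weights):
        """Entropy where class probabilities come from summed example weights."""
        wcounts = np.bincount(labels, weights=weights)  # per-class weight totals
        p = wcounts[wcounts > 0] / wcounts.sum()
        return -np.sum(p * np.log2(p))

    labels = np.array([0, 0, 0, 0, 1])
    freq = np.bincount(labels) / len(labels)
    weights = 1.0 / freq[labels]                 # inverse class frequency
    print(weighted_entropy(labels, weights))     # 1.0: classes now look balanced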

MATLAB: Help with entropy estimation of a discretized time series

陌路散爱 · Posted on 2019-12-12 05:03:48
Question: This question is a continuation of a previous one, "Matlab : Plot of entropy vs digitized code length". I want to calculate the entropy of a random variable that is a discretized (0/1) version of a continuous random variable x. The random variable denotes the state of a nonlinear dynamical system called the Tent Map. Iterating the Tent Map yields a time series of length N. The code should exit as soon as the entropy of the discretized time series becomes equal to the entropy of the…
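
For reference, a minimal sketch of the setup described: iterate the tent map, threshold at 0.5 to get a 0/1 series, and compute its empirical entropy. The parameters are illustrative assumptions, and mu is kept just below 2 because iterating the exact slope-2 tent map in binary floating point collapses the orbit to 0 within roughly 55 steps.

    import numpy as np

    def tent_series(x0=0.4, mu=1.99999, n=10_000):
        """Iterate x -> mu * min(x, 1 - x) and threshold at 0.5."""
        x = np.empty(n)
        x[0] = x0
        for k in range(1, n):
            x[k] = mu * min(x[k - 1], 1 - x[k - 1])
        return (x > 0.5).astype(int)

    def binary_entropy(bits):
        """Empirical Shannon entropy (bits/symbol) of a 0/1 sequence."""
        p = np.bincount(bits, minlength=2) / len(bits)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    print(binary_entropy(tent_series()))  # approaches 1 bit for a chaotic orbit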