information-theory

Any theoretical limit to compression?

大憨熊 submitted on 2019-11-30 11:25:26
Imagine that you had all the supercomputers in the world at your disposal for the next 10 years, and your task was to compress 10 full-length movies losslessly, as much as possible. Another criterion is that a normal computer should be able to decompress the result on the fly, without needing much hard-drive space for the decompression software. My question is: how much more compression could you achieve than the best alternatives today? 1%, 5%, 50%? More specifically: is there a theoretical limit to compression, given a fixed dictionary size (if that is what it is called for video compression as well)?
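
A rough way to see where the floor sits for a given source model is to measure the zeroth-order Shannon entropy of the data: under a memoryless byte model, no lossless code can average fewer bits per byte than that. The sketch below (plain Python, with an illustrative file path and 1 MiB chunk size) computes that bound; it says nothing about the true limit for movies, which depends on how much structure a codec can model, and whose ultimate version, Kolmogorov complexity, is not computable at all.

    import collections
    import math
    import sys

    def order0_entropy_bound(path):
        """Order-0 (byte-frequency) Shannon entropy of a file, in bits per byte.

        Under a memoryless model this is the fewest bits an ideal entropy coder
        needs per input byte on average; real video codecs exploit far more
        structure, so treat it as an illustration, not the actual limit.
        """
        counts = collections.Counter()
        total = 0
        with open(path, "rb") as f:
            while chunk := f.read(1 << 20):   # read 1 MiB at a time
                counts.update(chunk)          # bytes iterate as ints 0..255
                total += len(chunk)
        h = -sum((c / total) * math.log2(c / total) for c in counts.values())
        return h, h * total / 8               # (bits per byte, bound in bytes)

    if __name__ == "__main__":
        bits_per_byte, bound = order0_entropy_bound(sys.argv[1])
        print(f"{bits_per_byte:.3f} bits/byte, >= {bound:.0f} bytes under an order-0 model")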

How do I compute the approximate entropy of a bit string?

不问归期 submitted on 2019-11-28 04:20:23
Is there a standard way to do this? Googling -- "approximate entropy" bits -- uncovers multiple academic papers, but I'd just like to find a chunk of pseudocode defining the approximate entropy of a given bit string of arbitrary length. (In case this is easier said than done and it depends on the application: my application involves 16,320 bits of encrypted data (ciphertext), encrypted as a puzzle rather than meant to be impossible to crack. I thought I'd first check the entropy, but couldn't easily find a good definition of it.) So it seemed like a question that ought to be on StackOverflow!
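
One common, simple reading of "entropy of a bit string" is the Shannon entropy of its overlapping k-bit blocks; a minimal sketch is below. Note that this is not the Pincus approximate-entropy (ApEn) statistic from the academic papers, and the block size k is an arbitrary choice here, so treat it as one plausible estimate rather than the standard definition.

    from collections import Counter
    from math import log2

    def block_entropy(bits: str, k: int = 8) -> float:
        """Shannon entropy, in bits per block, of the overlapping k-bit blocks
        of a '0'/'1' string.  Values close to k mean the blocks look uniformly
        distributed; well-encrypted data should land near that ceiling."""
        blocks = [bits[i:i + k] for i in range(len(bits) - k + 1)]
        counts = Counter(blocks)
        n = len(blocks)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    # e.g. the 16,320 ciphertext bits would be passed in as a single '0'/'1' string
    print(block_entropy("0110100110010110" * 20, k=4))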

Optimal way to compute pairwise mutual information using numpy

…衆ロ難τιáo~ submitted on 2019-11-28 03:08:33
For an m x n matrix, what's the optimal (fastest) way to compute the mutual information for all pairs of columns (n x n)? By mutual information, I mean I(X, Y) = H(X) + H(Y) - H(X,Y), where H(X) refers to the Shannon entropy of X. Currently I'm using np.histogram2d and np.histogram to calculate the joint (X,Y) and individual (X or Y) counts. For a given matrix A (e.g. a 250000 x 1000 matrix of floats), I am doing a nested for loop: n = A.shape[1]; for ix in range(n): for jx in range(ix+1, n): matMI[ix, jx] = calc_MI(A[:, ix], A[:, jx]). Surely there must be better/faster ways to do this?
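
For reference, here is one plausible shape of the calc_MI routine the question describes, built on np.histogram2d counts and the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the choice of 10 bins and the small random stand-in matrix are assumptions. The double loop is kept exactly as in the question, since that is the part being asked about (typical speed-ups digitize every column once and reuse the bin indices instead of re-histogramming each pair).

    import numpy as np

    def calc_MI(x, y, bins=10):
        """Mutual information I(X;Y) in bits, estimated from a 2D histogram."""
        c_xy, _, _ = np.histogram2d(x, y, bins=bins)
        p_xy = c_xy / c_xy.sum()
        p_x = p_xy.sum(axis=1)          # marginal of X
        p_y = p_xy.sum(axis=0)          # marginal of Y

        def H(p):                       # Shannon entropy of a probability vector
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        return H(p_x) + H(p_y) - H(p_xy.ravel())

    A = np.random.rand(1000, 20)        # small stand-in for the 250000 x 1000 matrix
    n = A.shape[1]
    matMI = np.zeros((n, n))
    for ix in range(n):                 # fill the upper triangle, as in the question
        for jx in range(ix + 1, n):
            matMI[ix, jx] = calc_MI(A[:, ix], A[:, jx])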

Mutual information and joint entropy of two images - MATLAB

六月ゝ 毕业季﹏ submitted on 2019-11-26 19:40:15
I have two black and white images and I need to calculate the mutual information. Image 1 = X, Image 2 = Y. I know that the mutual information can be defined as MI = entropy(X) + entropy(Y) - JointEntropy(X,Y). MATLAB already has a built-in function to calculate the entropy, but not the joint entropy. So I guess the real question is: how do I calculate the joint entropy of two images? Here is an example of the images I'd like to find the joint entropy of: X = 0 0 0 0 0 0 0 0 1 1 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0, Y = 0 0 0 0 0 0 0 0 0.38 0.82 0.38 0.04 0 0 0.32 0.82 0.68 0.17 0 0 0
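
The question asks for MATLAB, but the joint-entropy recipe is language-neutral: histogram the co-occurring pixel values, normalise the counts into a joint probability table, and sum -p*log2(p) over the non-zero entries. A minimal numpy sketch of that idea follows (the bin count is an assumption; for the 0/1 and small-fraction example images a handful of bins is plenty), and the same steps translate directly to histcounts2 or accumarray in MATLAB.

    import numpy as np

    def joint_entropy(X, Y, bins=16):
        """Joint Shannon entropy H(X,Y) of two same-sized images, in bits."""
        counts, _, _ = np.histogram2d(X.ravel(), Y.ravel(), bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]                    # drop empty bins: 0 * log 0 is taken as 0
        return -np.sum(p * np.log2(p))

    # Mutual information then follows the formula in the question:
    #   MI = entropy(X) + entropy(Y) - joint_entropy(X, Y)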
