Any theoretical limit to compression?

Submitted by 拈花ヽ惹草 on 2019-11-29 17:33:55

Question


Imagine that you had all the supercomputers in the world at your disposal for the next 10 years. Your task is to compress 10 full-length movies losslessly as much as possible. Another criterion is that a normal computer should be able to decompress the result on the fly and should not need to use much of its hard drive to install the decompression software.

My question is: how much more compression could you achieve than the best alternatives today? 1%, 5%, 50%? More specifically: is there a theoretical limit to compression, given a fixed dictionary size (if that term applies to video compression as well)?


Answer 1:


The limits of compression are dictated by the randomness of the source, i.e. its entropy. Welcome to the study of information theory! See data compression.
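As a rough illustration of that claim (my own sketch, not part of the original answer), the empirical Shannon entropy of a byte stream gives a floor on the average number of bits per symbol that any lossless code matched to that distribution can achieve. The sample string below is just illustrative data.

    # Minimal sketch: empirical entropy as a lower bound on lossless coding.
    import math
    from collections import Counter

    def entropy_bits_per_byte(data: bytes) -> float:
        """Empirical Shannon entropy of a byte stream, in bits per byte."""
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    sample = b"aaaaabbbccd"  # hypothetical, highly repetitive source
    h = entropy_bits_per_byte(sample)
    print(f"entropy: {h:.3f} bits/byte -> roughly {h * len(sample):.1f} bits needed losslessly")

A truly random source has entropy close to 8 bits per byte, which is why already-compressed or encrypted data barely shrinks further.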




Answer 2:


There is a theoretical limit: I suggest reading this article on information theory and the pigeonhole principle. It sums up the issue in a very easy-to-understand way.
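The counting argument behind the pigeonhole principle can be shown in a few lines (a sketch of my own, not taken from the linked article): there are 2^n binary strings of length n, but only 2^n - 1 strings that are strictly shorter, so no lossless (injective) compressor can shrink every input.

    # Counting sketch of the pigeonhole argument against universal compression.
    n = 16
    inputs = 2 ** n                                   # strings of exactly n bits
    shorter_outputs = sum(2 ** k for k in range(n))   # strings of length 0..n-1, equals 2**n - 1
    print(f"{inputs} inputs of length {n}, but only {shorter_outputs} shorter outputs")
    assert shorter_outputs < inputs                   # some input cannot map to a shorter output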




Answer 3:


If you had a fixed catalogue of all the movies you were ever going to compress, you could just send an id for the movie and have the "decompression" step look the data up by that index. So compression could be to a fixed size of log2(N) bits, where N is the number of movies.
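A toy sketch of that idea (the catalogue names are hypothetical; it assumes both sides already share the same fixed list, which is where all the real information lives):

    # "Compression" to a ceil(log2(N))-bit index into a shared, fixed catalogue.
    import math

    catalogue = ["movie_a.mkv", "movie_b.mkv", "movie_c.mkv", "movie_d.mkv"]  # shared by both sides
    bits_needed = math.ceil(math.log2(len(catalogue)))

    def compress(title: str) -> int:
        return catalogue.index(title)      # the index is the entire "compressed" file

    def decompress(index: int) -> str:
        return catalogue[index]            # look the data back up in the shared catalogue

    idx = compress("movie_c.mkv")
    print(f"sent {bits_needed} bits, decoded: {decompress(idx)}")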

I suspect the practical lower bound is rather higher than this.

Do you really mean lossless? I thought most of today's video compression is lossy.



Source: https://stackoverflow.com/questions/4340610/any-theoretical-limit-to-compression
