LZ4 library decompressed data upper bound size estimation

Asked by 夕颜 on 2021-01-04 23:05

I'm using the LZ4 library, and when decompressing data with

    int LZ4_decompress_safe (const char* source, char* dest, int compressedSize, int maxDecompressedSize)

how can I estimate an upper bound for the decompressed size, i.e. a safe value to pass as maxDecompressedSize?
2 Answers
  • 2021-01-04 23:44

    The maximum compression ratio of LZ4 is 255, so a guaranteed over-estimate of the decompressed size is 255 times the input size.

    That's obviously too much to be really useful, hence the reason why there is no "reverse LZ4_compressBound()" function available.

    I'm afraid there is no other way than to save, or know, the uncompressed size. The LZ4 "raw" compression format doesn't define a way to save such information, because the optimal choice is application-specific. For example, some applications know in advance that no block can be > 16 KB, so they can use maxDecompressedSize = 16 KB when calling LZ4_decompress_safe().

    Now, if you are looking for an envelope format that takes on this responsibility, you can either create your own custom one or use the LZ4 Framing format: http://fastcompression.blogspot.fr/2013/04/lz4-streaming-format-final.html (also present as LZ4_Framing_Format.html within the source package). Alas, the library that generates and reads this format is currently in beta stage (https://github.com/Cyan4973/lz4/tree/frame).

  • 2021-01-04 23:45

    Just for reference, n bytes of LZ4 compressed data can represent at most 24 + 255(n - 10) uncompressed bytes, which is achieved by a run of that many bytes. n must be at least ten to make a valid stream that includes a literal, a match, and then five literals at the end per the specification. So a decompression bound function could be something like (n << 8) - n - 2526.

    The maximum compression ratio is then: 255 - 2526 / n, which asymptotically approaches 255 for arbitrarily large n.
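    That bound can be sketched directly in C; the function name is hypothetical, and the formula is the one given above, with the shift form (n << 8) - n - 2526 equal to 255n - 2526 = 24 + 255(n - 10):

```c
/* Worst-case decompressed size for n bytes of raw LZ4 block input,
 * per the bound 24 + 255*(n - 10). Streams shorter than 10 bytes
 * cannot be valid, so the bound is 0 there. */
long long lz4_decompress_bound(long long n) {
    if (n < 10) return 0;
    return (n << 8) - n - 2526;   /* == 255*n - 2526 */
}
```

    For example, a 1 MB compressed block could expand to at most roughly 255 MB, which shows why allocating the guaranteed bound is rarely practical.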
