Comparison of lz4 vs lz4_hc vs blosc vs snappy vs fastlz

粉色の甜心 2021-02-01 21:13

I have a large file, about 500 MB, that I need to compress within a minute with the best possible compression ratio. I have found these algorithms to be suitable for my use case.
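
A quick way to decide under a hard 60-second budget is to time the candidates on the actual data. Below is a rough benchmark sketch in Python, assuming the third-party `lz4` and `python-snappy` packages (`pip install lz4 python-snappy`); `input.bin` is a hypothetical stand-in for the 500 MB file, and stdlib zlib is included as a baseline.

```python
import time
import zlib

import lz4.frame   # pip install lz4
import snappy      # pip install python-snappy

def bench(name, compress, decompress, data):
    t0 = time.perf_counter()
    blob = compress(data)
    t1 = time.perf_counter()
    out = decompress(blob)
    t2 = time.perf_counter()
    assert out == data  # round-trip check
    print(f"{name:8s} ratio={len(data) / len(blob):5.2f} "
          f"compress={t1 - t0:6.2f}s decompress={t2 - t1:6.2f}s")

with open("input.bin", "rb") as f:   # hypothetical ~500 MB test file
    data = f.read()

bench("lz4",    lz4.frame.compress, lz4.frame.decompress, data)
bench("lz4hc",  # LZ4HC is LZ4's high-compression mode, reached via a higher level
      lambda d: lz4.frame.compress(d, compression_level=lz4.frame.COMPRESSIONLEVEL_MINHC),
      lz4.frame.decompress, data)
bench("snappy", snappy.compress, snappy.decompress, data)
bench("zlib",   lambda d: zlib.compress(d, 6), zlib.decompress, data)
```

Whichever codec finishes compression well under 60 s with the highest ratio on your data wins. 500 MB in a minute only requires ~8 MB/s, so plain lz4 and snappy are far inside the budget on typical data, and lz4hc trades compression time for a better ratio.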

4 answers
  • 2021-02-01 21:57

    This might help you:

    • lz4 vs snappy: http://java-performance.info/performance-general-compression/
    • benchmarks for lz4, snappy, lz4hc, and blosc: https://web.archive.org/web/20170706065303/http://blosc.org:80/synthetic-benchmarks.html (no longer available at http://www.blosc.org/synthetic-benchmarks.html)

  • 2021-02-01 21:59

    Like most questions, the answer usually ends up being: It depends :)

    The other answers gave you good pointers, but another thing to take into account is RAM usage in both compression and decompression stages, as well as decompression speed in MB/s.

    Decompression speed is typically inversely proportional to compression ratio: you may think you have chosen the perfect algorithm to save some bandwidth or disk storage, but whatever consumes that data downstream then has to spend far more time, CPU cycles, and/or RAM to decompress it. And RAM usage might seem inconsequential, but maybe the downstream system is an embedded/low-power one? Maybe RAM is plentiful, but CPU is limited? All of those things need to be taken into account.

    Here's an example of a suite of benchmarks done on various algorithms, taking a lot of these considerations into account:

    https://catchchallenger.first-world.info/wiki/Quick_Benchmark:_Gzip_vs_Bzip2_vs_LZMA_vs_XZ_vs_LZ4_vs_LZO
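
    If you want to put numbers on the RAM point yourself, a minimal sketch (Unix-only, assuming the `lz4` package again) is to run one codec per process and read peak RSS from the stdlib `resource` module alongside decompression throughput:

    ```python
    import resource
    import sys
    import time

    import lz4.frame  # pip install lz4

    # Path to a previously compressed (lz4 frame) file, given on the command line.
    with open(sys.argv[1], "rb") as f:
        blob = f.read()

    t0 = time.perf_counter()
    data = lz4.frame.decompress(blob)
    elapsed = time.perf_counter() - t0

    # Peak RSS of this process; run one codec per process so the numbers
    # are comparable. Note ru_maxrss is KiB on Linux, bytes on macOS.
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"{len(data) / 1e6:.0f} MB in {elapsed:.2f}s "
          f"({len(data) / 1e6 / elapsed:.0f} MB/s), peak RSS {peak}")
    ```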

  • 2021-02-01 22:04

    If you are only aiming for high compression density, you want to look at LZMA and large-window Brotli. These two give the best compression density among the widely available open-source algorithms. Brotli is slower at compression, but ~5x faster at decompression.
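
    A quick density check for that suggestion, assuming the third-party `brotli` package (`pip install brotli`; its standard binding caps the window at lgwin=24, so this is "large-window" only in that limited sense) and stdlib `lzma`; `input.bin` is again a hypothetical test file:

    ```python
    import lzma

    import brotli  # pip install brotli

    with open("input.bin", "rb") as f:  # hypothetical test file
        data = f.read()

    xz = lzma.compress(data, preset=9)                 # maximum LZMA preset
    br = brotli.compress(data, quality=11, lgwin=24)   # max quality/window exposed here

    print(f"lzma   ratio={len(data) / len(xz):.2f}")
    print(f"brotli ratio={len(data) / len(br):.2f}")
    ```

    Note that at these maximum settings neither is likely to compress 500 MB inside the asker's one-minute budget; this answers the density question, not the speed one.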

  • 2021-02-01 22:08

    Yann Collet's lz4, hands down.
