Question
I have a large file of data that I have compressed with zlib using Boost.IOStreams and filtering stream buffers:
boost::iostreams::array_source uncompressedArray( reinterpret_cast< const char* >( &uncompressedData[0] ), uncompressedData.size() );
boost::iostreams::filtering_streambuf< boost::iostreams::output > out;
out.push( *m_compressor );
out.push( boost::iostreams::back_inserter( compressedData ) );
boost::iostreams::copy( uncompressedArray, out );
For speed I am initializing the zlib library with the following:
boost::iostreams::zlib_params params;
params.level = boost::iostreams::zlib::best_speed;
params.mem_level = 9;
m_compressor.reset( new boost::iostreams::zlib_compressor( params, 131072 ) );
m_decompressor.reset( new boost::iostreams::zlib_decompressor( params, 131072 ) );
My decompressor looks like this:
boost::iostreams::array_source compressedArray( reinterpret_cast< const char* >( &compressedData[0] ), compressedData.size() );
boost::iostreams::filtering_streambuf< boost::iostreams::input > m_in;
m_in.push( *m_decompressor );
m_in.push( compressedArray );
boost::iostreams::copy( m_in, boost::iostreams::back_inserter( uncompressedData ) );
My question is: are there any ways I can speed up the inflate (decompress) operation? Right now decompression is taking about 83% of my data access time, and I really need to get this faster. Any suggestions would be greatly appreciated.
Answer 1:
The only way to speed up decompression is to make the compressed data smaller, so it has less to process. That means spending more time compressing, assuming that you're not as concerned about the processing time on that end. So you would select best compression.
Answer 2:
At least on Windows, we measured a 20% improvement by using the zlib C interface directly. Close profiling showed that the boost::iostreams::filtering_streambuf and iostream overhead was the main difference.
Source: https://stackoverflow.com/questions/13807800/boost-io-stream-and-zlib-speed-up