Question
I would like to use MurmurHash3 to uniquely identify large pieces of data. This implementation doesn't appear to offer a way to update the hash incrementally; it seems to compute one separate hash per block of data it is given.
For example, if I were hashing 512 MB of data from disk, I might not want to load it all into memory at once; the same applies if I were hashing an unknown amount of data from the network. How can I use MurmurHash3 on a large amount of data that has to be hashed incrementally? I am looking for something similar to SHA256_Update from OpenSSL.
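
For reference, this is the kind of streaming pattern I have in mind, sketched with OpenSSL's SHA-256 calls (a minimal example; the buffer size is arbitrary and error handling is omitted):

```c
#include <stdio.h>
#include <openssl/sha.h>

/* Hash a file in fixed-size chunks -- the incremental pattern
 * I would like to replicate with MurmurHash3. */
void hash_file_sha256(FILE *fp, unsigned char digest[SHA256_DIGEST_LENGTH])
{
    SHA256_CTX ctx;
    unsigned char buf[8192];
    size_t n;

    SHA256_Init(&ctx);
    while ((n = fread(buf, 1, sizeof buf, fp)) > 0)
        SHA256_Update(&ctx, buf, n);   /* feed each chunk as it arrives */
    SHA256_Final(digest, &ctx);
}
```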
Source: https://stackoverflow.com/questions/11874473/how-to-use-murmurhash3-to-hash-a-block-of-data-incrementally