I have sets of hashes (the first 64 bits of MD5, so they're very uniformly distributed) and I want to be able to check whether a new hash is in a set, and to add it to a set.
I had some trouble picturing your exact problem/need, but it still got me thinking about Git and how it stores SHA-1 references on disk:
Take the hexadecimal string representation of a given hash, say, "abfab0da6f4ebc23cb15e04ff500ed54". Chop off the first two characters of the hash ("ab", in our case) and make them into a directory. Then use the rest ("fab0da6f4ebc23cb15e04ff500ed54") as the file name, and put your data in that file.
This way, you get pretty decent on-disk performance (depending on your FS, naturally) with automatic indexing. Additionally, you get direct access to any known hash just by wedging a directory delimiter after the first two characters ("./ab/fab0da[..]").
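A minimal sketch of that sharding scheme in Python (the `objects` base directory and the helper names are my own, not anything Git-specific):

```python
import os

def hash_to_path(hex_hash, base_dir="objects"):
    """Split a hex digest into a two-char directory plus the
    remainder, the way Git shards its loose object store."""
    return os.path.join(base_dir, hex_hash[:2], hex_hash[2:])

def store(hex_hash, data, base_dir="objects"):
    """Write data under the sharded path, creating the shard dir."""
    path = hash_to_path(hex_hash, base_dir)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)

def contains(hex_hash, base_dir="objects"):
    """Membership test is a single direct path lookup."""
    return os.path.exists(hash_to_path(hex_hash, base_dir))
```

The two-character prefix caps each directory at 256 entries at the top level, which keeps directory scans cheap on most filesystems.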
I'm sorry if I missed the ball entirely, but with any luck, this might give you an idea.
Here's the solution I eventually used:
It's unbelievably faster than SQLite, even though it's low-level Perl code, and Perl really isn't meant for high-performance databases. It won't work with anything less uniformly distributed than MD5; it assumes everything will be extremely uniform to keep the implementation simple.
I tried it with seek()/sysread()/syswrite() at first, and it was very slow; the mmap() version is a lot faster.
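The author's Perl code isn't shown, so here is a rough sketch of the same idea in Python: a fixed-size, mmap-backed open-addressing set of 64-bit hashes with linear probing. The class name, slot count, and the assumption that the all-zero hash never occurs are mine; there is no resizing or deletion.

```python
import mmap
import os
import struct

SLOT = 8              # one 64-bit hash per slot
EMPTY = b"\x00" * 8   # assumes the all-zero hash never occurs

class MmapHashSet:
    """Fixed-capacity set of 8-byte keys in a memory-mapped file.
    Relies on the keys being uniformly distributed (e.g. MD5 bits)."""

    def __init__(self, path, n_slots=1 << 20):
        self.n = n_slots
        size = n_slots * SLOT
        fresh = not os.path.exists(path) or os.path.getsize(path) != size
        self.f = open(path, "w+b" if fresh else "r+b")
        if fresh:
            self.f.truncate(size)  # sparse file of zero (empty) slots
        self.m = mmap.mmap(self.f.fileno(), size)

    def _probe(self, key):
        # Start at the position the key itself dictates, then probe
        # linearly; uniform keys keep probe chains short.
        i = struct.unpack(">Q", key)[0] % self.n
        while True:
            yield i * SLOT
            i = (i + 1) % self.n

    def add(self, key):
        """Insert key; return True if it was new. Loops if full."""
        for off in self._probe(key):
            cur = self.m[off:off + SLOT]
            if cur == key:
                return False
            if cur == EMPTY:
                self.m[off:off + SLOT] = key
                return True

    def __contains__(self, key):
        for off in self._probe(key):
            cur = self.m[off:off + SLOT]
            if cur == key:
                return True
            if cur == EMPTY:
                return False
```

Because the file is mapped, lookups touch only the page holding the probed slot, which is what makes this so much faster than the seek/read/write variant.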
Sounds like a job for Berkeley DB.
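Berkeley DB bindings vary by language; as a rough stand-in with the same hash-on-disk interface, Python's standard `dbm` module (which selects whatever disk key/value backend is available) looks like this. The file name and the empty-value convention are my own choices:

```python
import dbm

# dbm.open picks an available disk hash backend; the interface is
# the same simple key/value access a Berkeley DB hash table gives.
def demo(path):
    with dbm.open(path, "c") as db:       # "c" = create if missing
        key = bytes.fromhex("abfab0da6f4ebc23")  # first 64 bits of an MD5
        db[key] = b""                     # presence is all we need
        return key in db
```

The database handles its own on-disk bucketing, so membership tests stay close to one disk access per lookup.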
Other disk-based hashing algos/data structures include linear hashing and extensible hashing.
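To make the second of those concrete, here is an in-memory sketch of extensible hashing: a directory of 2**d pointers into buckets, where only the overflowing bucket splits (and the directory doubles only when that bucket's depth equals the global depth). The bucket size and class name are illustrative, and a real version would keep buckets on disk.

```python
class ExtendibleHash:
    """Sketch of extensible hashing keyed on the low bits of an
    integer key; assumes reasonably uniform keys."""
    BUCKET_SIZE = 4

    def __init__(self):
        self.global_depth = 0
        self.directory = [{"depth": 0, "keys": []}]

    def _index(self, key):
        return key & ((1 << self.global_depth) - 1)

    def add(self, key):
        b = self.directory[self._index(key)]
        if key in b["keys"]:
            return
        if len(b["keys"]) < self.BUCKET_SIZE:
            b["keys"].append(key)
            return
        # Bucket overflow: split it, doubling the directory only
        # when the bucket is already at the global depth.
        if b["depth"] == self.global_depth:
            self.directory += self.directory
            self.global_depth += 1
        b["depth"] += 1
        new = {"depth": b["depth"], "keys": []}
        bit = 1 << (b["depth"] - 1)
        for i, p in enumerate(self.directory):
            if p is b and i & bit:
                self.directory[i] = new
        # Redistribute the old keys (plus the new one) between the
        # two buckets; may recurse if they still all collide.
        old_keys, b["keys"] = b["keys"], []
        for k in old_keys + [key]:
            self.add(k)

    def __contains__(self, key):
        return key in self.directory[self._index(key)]["keys"]
```

Linear hashing differs in that it splits buckets in a fixed round-robin order instead of on demand, so it needs no directory at all.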
Since a hash table requires random access, I doubt any database will give you decent performance. Your best bet might be to increase the disk cache (more RAM) and to get hard disks with a very high random-access speed (maybe solid-state disks).
Two algorithms come to my mind at first: