Question
I have been using Python's shelve module for some time. It feels like a very good solution for an on-disk version of a Python dictionary. My question is: if it is serialized on disk and opening it doesn't load the whole shelved object into memory, what is the read complexity associated with it?
Let's say I have an object containing a few million key/value pairs stored in a shelve db. I usually do something like:
import shelve

key = "1234"
d = shelve.open("path/to/shelve/db")  # open the existing shelf on disk
output = d[key]                       # look up a single key
d.close()
And almost always the output comes back in constant time (as if the object were loaded into memory like a regular Python dictionary). It baffles me how this is possible when the object is serialized on disk yet still gives O(1) read complexity.
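For what it's worth, a rough way to check this observation (reusing the hypothetical path and key from above) would be to time repeated single-key lookups and see whether the average grows as the shelf grows. A minimal sketch:

import shelve
import timeit

d = shelve.open("path/to/shelve/db")  # same hypothetical path as above

# Time 1,000 lookups of a single key; if reads scaled with the number of
# stored entries, this average would grow noticeably on a larger shelf.
elapsed = timeit.timeit(lambda: d["1234"], number=1000)
print(f"average lookup time: {elapsed / 1000:.6f} s")

d.close()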
I would appreciate any explanation of how shelve operates or what goes on in its backend (I have not been able to find a good explanation in the docs).
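(For context on the backend question: shelve delegates on-disk storage to one of the standard-library dbm modules. The following sketch, again using the hypothetical path from above, shows one way to check which backend actually created a given shelf file.)

import dbm

# dbm.whichdb() probes the file(s) on disk (including common suffixes such
# as .db, .pag, .dir) and reports which dbm backend produced them.
print(dbm.whichdb("path/to/shelve/db"))  # e.g. 'dbm.gnu', 'dbm.ndbm', or 'dbm.dumb'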
Source: https://stackoverflow.com/questions/57739653/read-complexity-for-python-shelve