Question
I'm using HDFStore with pandas / pytables.
After removing a table or object, the hdf5 file size remains unchanged. It seems this space is reused later when additional objects are added to the store, but it can be an issue if a large amount of space is wasted.
I have not found any command in the pandas or pytables APIs that could be used to reclaim this hdf5 disk space.
Do you know of any mechanism to improve data management in hdf5 files?
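For reference, a minimal sketch that reproduces the behaviour described above (the file name `store.h5` and the key `big_table` are just placeholders):

import os

import numpy as np
import pandas as pd

path = "store.h5"
df = pd.DataFrame(np.random.randn(1_000_000, 5), columns=list("abcde"))

# Write a sizeable table, then remove it again.
with pd.HDFStore(path, mode="w") as store:
    store.put("big_table", df, format="table")
print("after put:   ", os.path.getsize(path), "bytes")

with pd.HDFStore(path) as store:
    store.remove("big_table")
print("after remove:", os.path.getsize(path), "bytes")  # roughly the same size on disk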
Answer 1:
See here: you need to ptrepack it, which rewrites the file.
ptrepack --chunkshape=auto --propindexes --complevel=9 --complib=blosc in.h5 out.h5
as an example (this will also compress the file).
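If you prefer to trigger the repack from Python, a minimal sketch along these lines should work; it assumes the same hypothetical `store.h5` as above and simply wraps the ptrepack command shown earlier, then swaps the repacked copy back in place:

import os
import subprocess

src, dst = "store.h5", "store_repacked.h5"

# Rewrite the file with ptrepack, using the flags from the command above.
subprocess.run(
    ["ptrepack", "--chunkshape=auto", "--propindexes",
     "--complevel=9", "--complib=blosc", src, dst],
    check=True,
)

os.replace(dst, src)  # keep the original file name; the repacked file reclaims the freed space
print("repacked size:", os.path.getsize(src), "bytes")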
Source: https://stackoverflow.com/questions/21090243/release-hdf5-disk-memory-after-table-or-node-removal-with-pytables-or-pandas