Concatenate two big pandas.HDFStore HDF5 files

Submitted by 自闭症网瘾萝莉.ら on 2019-12-01 14:52:48

Question


This question is related to "Concatenate a large number of HDF5 files".

I have several huge HDF5 files (~20GB compressed) that do not fit in RAM. Each of them stores several pandas.DataFrames of identical format, with indexes that do not overlap.

I'd like to concatenate them into a single HDF5 file with all DataFrames properly concatenated. One way to do this is to read each file chunk by chunk and append the chunks to a single output file, but that would take quite a lot of time.
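For reference, the chunk-by-chunk approach I mean might look something like this (a minimal sketch; it assumes the frames were written in `table` format, and the function name and `chunksize` value are illustrative):

```python
import pandas as pd

def concat_hdf_stores(src_paths, out_path, chunksize=500_000):
    """Append every table from each source store into one output store,
    reading in chunks so no full DataFrame has to fit in RAM."""
    with pd.HDFStore(out_path, mode="w", complevel=9, complib="blosc") as out:
        for path in src_paths:
            with pd.HDFStore(path, mode="r") as src:
                for key in src.keys():
                    # select() with chunksize returns an iterator over
                    # row ranges instead of loading the whole table.
                    for chunk in src.select(key, chunksize=chunksize):
                        out.append(key, chunk)
```

This works, but iterates over every row of every file in Python, which is what I'd like to avoid.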

Are there any special tools or methods to do this without iterating through files?


Answer 1:


See the docs for the odo project (formerly into). Note that if you use the into library, the argument order is reversed (that was the motivation for renaming the project: to avoid confusion!).

You can basically do:

from odo import odo
odo('hdfstore://path_store_1::table_name',
    'hdfstore://path_store_new_name::table_name')

Repeating this operation for each source store appends to the right-hand store.

odo handles the chunked reads and writes for you automatically.



Source: https://stackoverflow.com/questions/28918851/concatenate-two-big-pandas-hdfstore-hdf5-files
