Fastest way to store large files in Python

Asked by 佛祖请我去吃肉 on 2021-02-04 09:45

I recently asked a question regarding how to save large Python objects to file. I had previously run into problems converting massive Python dictionaries into strings and writing them to file.
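
A rough sketch of that slow approach (the serializer here is an assumption; the question does not say which one was used):

    import json

    hugeData = {'key': {'x': 1, 'y': 2}}  # stand-in for a massive dictionary

    # build one giant string in memory, then write it out in a single call
    with open('data.json', 'w') as f:
        f.write(json.dumps(hugeData))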

5 Answers
  •  逝去的感伤
    2021-02-04 10:26

    You can compress the data with bzip2:

    from __future__ import with_statement  # only needed on Python 2.5
    import bz2, json, contextlib

    hugeData = {'key': {'x': 1, 'y': 2}}
    # contextlib.closing lets the with statement close the BZ2File on older Pythons
    with contextlib.closing(bz2.BZ2File('data.json.bz2', 'wb')) as f:
      json.dump(hugeData, f)
    

    Load it like this:

    from __future__ import with_statement  # only needed on Python 2.5
    import bz2, json, contextlib

    with contextlib.closing(bz2.BZ2File('data.json.bz2', 'rb')) as f:
      hugeData = json.load(f)
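
    On Python 3 the same idea works, but json needs a text-mode file object. A minimal sketch (assuming Python 3.3+, where bz2.open accepts text modes):

    import bz2, json

    hugeData = {'key': {'x': 1, 'y': 2}}

    # 'wt' opens the compressed file in text mode, so json can write str directly
    with bz2.open('data.json.bz2', 'wt', encoding='utf-8') as f:
      json.dump(hugeData, f)

    # 'rt' reads it back as text
    with bz2.open('data.json.bz2', 'rt', encoding='utf-8') as f:
      hugeData = json.load(f)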
    

    You can also compress the data with zlib or gzip, which expose pretty much the same interface. However, zlib and gzip typically achieve lower compression ratios than bzip2 (or lzma).
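
    For example, a gzip sketch with the same structure (again assuming Python 3.3+, where gzip.open also accepts text modes):

    import gzip, json

    hugeData = {'key': {'x': 1, 'y': 2}}

    # identical pattern; only the compression module and file extension change
    with gzip.open('data.json.gz', 'wt', encoding='utf-8') as f:
      json.dump(hugeData, f)

    with gzip.open('data.json.gz', 'rt', encoding='utf-8') as f:
      hugeData = json.load(f)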
