Big file compression with Python

Submitted by 醉酒当歌 on 2019-12-18 11:03:23

Question


I want to compress big text files with Python (I am talking about files larger than 20 GB). I am by no means an expert, so I gathered the information I could find, and the following seems to work:

import bz2

with open('bigInputfile.txt', 'rb') as input:
    with bz2.BZ2File('bigInputfile.txt.bz2', 'wb', compresslevel=9) as output:
        while True:
            # Read the input in 900000-byte chunks and feed each one to the compressor.
            block = input.read(900000)
            if not block:
                break
            output.write(block)

input.close()
output.close()

I am wondering whether this syntax is correct and whether there is a way to optimize it. I have the impression that I am missing something here.

Many thanks.


Answer 1:


Your script seems correct, but can be abbreviated:

import bz2
from shutil import copyfileobj

with open('bigInputfile.txt', 'rb') as input:
    with bz2.BZ2File('bigInputfile.txt.bz2', 'wb', compresslevel=9) as output:
        copyfileobj(input, output)
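
If you want to keep control over the chunk size, copyfileobj also accepts an optional buffer length; the sketch below simply reuses the 900000-byte figure from the question, which is an arbitrary choice rather than a requirement:

import bz2
from shutil import copyfileobj

with open('bigInputfile.txt', 'rb') as input:
    with bz2.BZ2File('bigInputfile.txt.bz2', 'wb', compresslevel=9) as output:
        # Copy in 900000-byte chunks; copyfileobj picks its own default buffer size otherwise.
        copyfileobj(input, output, length=900000)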



Answer 2:


Why are you calling the .close() methods? They are not needed, since you are using the with statement.
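
For completeness, here is a minimal sketch of the original loop with the indentation fixed and the redundant close() calls dropped, relying on the with blocks to close both files:

import bz2

with open('bigInputfile.txt', 'rb') as input:
    with bz2.BZ2File('bigInputfile.txt.bz2', 'wb', compresslevel=9) as output:
        while True:
            block = input.read(900000)
            if not block:
                break
            output.write(block)
# No explicit .close() calls: leaving the with blocks closes both files.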



Source: https://stackoverflow.com/questions/9518705/big-file-compression-with-python
