Lazy Method for Reading Big File in Python?

谎友^ 2020-11-21 06:40

I have a very big file (4GB), and when I try to read it my computer hangs. So I want to read it piece by piece, and after processing each piece, store the processed piece into another file and read the next piece.

12 Answers
  •  误落风尘
    2020-11-21 07:07

    Refer to Python's official documentation: https://docs.python.org/3/library/functions.html#iter

    Maybe this approach is more Pythonic:

    from functools import partial

    # A file object returned by open() is an iterator whose read()
    # method accepts the number of bytes to read per call;
    # iter(callable, sentinel) calls it until the sentinel is returned.
    with open('mydata.db', 'rb') as f_in:

        part_read = partial(f_in.read, 1024*1024)   # read 1 MiB per call
        iterator = iter(part_read, b'')             # b'' signals EOF

        for index, block in enumerate(iterator, start=1):
            block = process_block(block)    # process your block data

            with open(f'{index}.txt', 'wb') as f_out:
                f_out.write(block)
    
