I have a very big file (4 GB), and when I try to read it all at once my computer hangs. So I want to read it piece by piece, and after processing each piece, store the processed piece into another file and read the next piece.
Refer to Python's official documentation for iter(): https://docs.python.org/3/library/functions.html#iter
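For context, the two-argument form iter(callable, sentinel) calls the callable repeatedly and stops as soon as it returns the sentinel value. A minimal, self-contained illustration (using io.BytesIO as a stand-in for a real file, not part of the original answer):

    import io

    buf = io.BytesIO(b'abcdefgh')
    # read two bytes at a time until read() returns b'' (the EOF sentinel)
    for chunk in iter(lambda: buf.read(2), b''):
        print(chunk)  # b'ab', b'cd', b'ef', b'gh'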
Maybe this method is more Pythonic:
    from functools import partial

    # A file object returned by open() is an iterator whose read()
    # method accepts the number of bytes to read per call.
    with open('mydata.db', 'rb') as f_in:
        part_read = partial(f_in.read, 1024 * 1024)  # read 1 MiB at a time
        iterator = iter(part_read, b'')  # stop once read() returns b'' (EOF)
        for index, block in enumerate(iterator, start=1):
            block = process_block(block)  # process your block data
            with open(f'{index}.txt', 'wb') as f_out:
                f_out.write(block)

Note that the file must be opened in binary mode ('rb') for the b'' sentinel to match at end of file, and the output is written in 'wb' on the assumption that process_block returns bytes.
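On Python 3.8+, an assignment expression gives an equivalent loop without functools.partial or the sentinel form of iter(). A sketch under the same assumptions (binary data, a user-supplied process_block that returns bytes):

    with open('mydata.db', 'rb') as f_in:
        index = 0
        # the walrus operator assigns and tests each chunk in one step;
        # read() returns b'' at EOF, which is falsy, ending the loop
        while block := f_in.read(1024 * 1024):
            index += 1
            with open(f'{index}.txt', 'wb') as f_out:
                f_out.write(process_block(block))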