Lazy Method for Reading Big File in Python?

谎友^ 2020-11-21 06:40

I have a very big file (4 GB), and when I try to read it my computer hangs. So I want to read it piece by piece, and after processing each piece, store the processed piece into an

12 Answers
  •  伪装坚强ぢ
    2020-11-21 07:16

    In Python 3.8+ you can use an assignment expression (the "walrus" operator `:=`) with `.read()` in a while loop:

    with open("somefile.txt") as f:
        while chunk := f.read(8192):
            do_something(chunk)
    

    Of course, you can use any chunk size you want; you don't have to use 8192 (2**13) bytes. Unless your file's size happens to be an exact multiple of your chunk size, the last chunk will be smaller than the others.
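    On versions before Python 3.8 (no walrus operator), the same chunked read can be written with the two-argument form of the built-in `iter()`, which calls `f.read(8192)` repeatedly until it returns the end-of-file sentinel `""`. This is a sketch of that idiom; the file name and the list-collecting step stand in for your own file and processing:

    from functools import partial

    # Create a small demo file so the sketch is self-contained
    # ("somefile.txt" is just a placeholder name).
    with open("somefile.txt", "w") as f:
        f.write("x" * 20000)

    chunks = []
    with open("somefile.txt") as f:
        # iter(callable, sentinel): keep calling f.read(8192) until it
        # returns "" (end of file), yielding one chunk per call.
        for chunk in iter(partial(f.read, 8192), ""):
            chunks.append(chunk)  # stand-in for do_something(chunk)

    print([len(c) for c in chunks])  # → [8192, 8192, 3616]

    For binary files, open with mode `"rb"` and use `b""` as the sentinel instead.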
