Lazy Method for Reading Big File in Python?

谎友^ 2020-11-21 06:40

I have a very big file (4 GB), and when I try to read it my computer hangs. So I want to read it piece by piece, and after processing each piece, store the processed piece into another file before reading the next piece.

12 Answers
  •  旧巷少年郎
    2020-11-21 07:04

    f = ...  # file-like object, i.e. anything supporting read(size) and
             # returning the empty string '' when there is nothing left to read
             # (if the file is opened in binary mode, use b'' as the sentinel below)

    def chunked(file, chunk_size):
        # iter() with a sentinel keeps calling the lambda until it returns ''
        return iter(lambda: file.read(chunk_size), '')

    for data in chunked(f, 65536):
        pass  # process the data here

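
    The two-argument form iter(callable, sentinel) calls the callable repeatedly and stops as soon as it returns the sentinel, so only one chunk (here 64 KiB) is ever held in memory at a time.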

    UPDATE: The approach is best explained in https://stackoverflow.com/a/4566523/38592
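
    For comparison, a plain generator makes the laziness explicit; below is a minimal sketch in the style of the linked answer (the file name and helper name are illustrative):

    def read_in_chunks(file_object, chunk_size=65536):
        # Lazily yield successive chunks until read() returns an empty string.
        while True:
            data = file_object.read(chunk_size)
            if not data:
                break
            yield data

    with open('really_big_file.dat') as f:  # hypothetical file name
        for piece in read_in_chunks(f):
            pass  # process each piece here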
