Lazy Method for Reading Big File in Python?

谎友^ 2020-11-21 06:40

I have a very big file (4 GB), and when I try to read it my computer hangs. So I want to read it piece by piece and, after processing each piece, store the processed piece into another file and move on to the next piece.

12 Answers
  •  有刺的猬
    2020-11-21 06:56

    I'm in a somewhat similar situation. It's not clear whether you know the chunk size in bytes; I usually don't, but the number of records (lines) required per chunk is known:

    def get_line():
        # Lazily yield one line at a time, so the whole file is never loaded into memory.
        with open('4gb_file') as file:
            for line in file:
                yield line

    lines_required = 100
    gen = get_line()
    # NOTE: zip pulls one extra line from gen before the range is exhausted,
    # so a line is silently dropped between consecutive chunks.
    chunk = [line for line, _ in zip(gen, range(lines_required))]
    

    Update: thanks nosklo. Here's what I meant: the zip approach above almost works, but it loses a line 'between' chunks, because zip pulls one extra line from the generator before the range runs out.

    chunk = [next(gen) for _ in range(lines_required)]
    

    This does the trick without losing any lines, but it doesn't look very nice.
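
    A minimal sketch of the same idea using itertools.islice, which takes up to a fixed number of lines from the file object without discarding an extra one; the name read_in_line_chunks and the process() call are illustrative assumptions, not part of the original answer.

    import itertools

    def read_in_line_chunks(path, lines_per_chunk=100):
        # Lazily group the file's lines into lists of at most
        # `lines_per_chunk` lines, reading only one chunk at a time.
        with open(path) as file:
            while True:
                chunk = list(itertools.islice(file, lines_per_chunk))
                if not chunk:
                    break
                yield chunk

    # Illustrative usage (process() is a placeholder for the caller's own work):
    # for chunk in read_in_line_chunks('4gb_file'):
    #     process(chunk)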
