How do I read a large csv file with pandas?

隐瞒了意图╮ 2020-11-21 07:12

I am trying to read a large csv file (approx. 6 GB) in pandas and I am getting a memory error:

MemoryError                               Traceback (most recent call last)
15 Answers
  •  春和景丽
    2020-11-21 07:25

    Before using the chunksize option, if you want to verify the processing function that you plan to run inside the chunking for-loop (as mentioned by @unutbu), you can simply use the nrows option.

    small_df = pd.read_csv(filename, nrows=100)
    

    Once you are sure the processing block is ready, you can put it inside the chunking for-loop and run it over the entire dataframe.
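    Putting the two steps together, a minimal sketch might look like this. The `process` function and the CSV contents are hypothetical placeholders; with a real 6 GB file you would pass the filename to `pd.read_csv` and use a much larger `chunksize`.

    ```python
    import io
    import pandas as pd

    # A small in-memory CSV stands in for the real 6 GB file.
    csv_data = "a,b\n1,x\n,y\n3,z\n4,w\n"

    # Step 1: prototype the processing logic on a small sample via nrows.
    sample = pd.read_csv(io.StringIO(csv_data), nrows=2)

    # Hypothetical per-chunk processing: drop rows with a missing "a".
    def process(df):
        return df.dropna(subset=["a"])

    # Step 2: stream the file in chunks and combine the processed pieces,
    # so the full file is never held in memory at once.
    pieces = [process(chunk)
              for chunk in pd.read_csv(io.StringIO(csv_data), chunksize=2)]
    result = pd.concat(pieces, ignore_index=True)
    ```

    Because only one chunk (plus the processed pieces) lives in memory at a time, peak usage is bounded by the chunk size rather than the file size.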
