How do I read a large csv file with pandas?

backend · open · 15 answers · 1883 views
隐瞒了意图╮ asked 2020-11-21 07:12

I am trying to read a large CSV file (approx. 6 GB) with pandas and I am getting a memory error:

MemoryError                               Traceback (most recent call last)


        
15 answers
  •  遥遥无期
     answered 2020-11-21 07:16

    I proceeded like this:

    # Read the file in chunks of 1,000,000 rows instead of all at once
    chunks = pd.read_table('aphro.csv', chunksize=1000000, sep=';',
                           names=['lat', 'long', 'rf', 'date', 'slno'],
                           index_col='slno', header=None, parse_dates=['date'])

    # Aggregate each chunk and concatenate the partial results
    %time df = pd.concat(chunk.groupby(['lat', 'long', chunk['date'].map(lambda x: x.year)])['rf'].agg(['sum']) for chunk in chunks)
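
Here is a minimal, self-contained sketch of the same chunk-then-aggregate pattern, using an in-memory `StringIO` as a stand-in for a large `aphro.csv`-style file (the sample data and the tiny `chunksize` are illustrative assumptions, not the asker's real values):

```python
import io

import pandas as pd

# Tiny in-memory stand-in for a large semicolon-separated file (assumption)
csv_data = io.StringIO(
    "10.0;70.0;1.5;2001-01-02;1\n"
    "10.0;70.0;2.5;2001-06-02;2\n"
    "10.0;70.0;4.0;2002-01-02;3\n"
)

# chunksize=2 keeps only two rows in memory at a time; a real 6 GB file
# would use something on the order of chunksize=1000000
chunks = pd.read_csv(
    csv_data,
    sep=";",
    header=None,
    names=["lat", "long", "rf", "date", "slno"],
    index_col="slno",
    parse_dates=["date"],
    chunksize=2,
)

# Aggregate each chunk separately, then concatenate the partial results
df = pd.concat(
    chunk.groupby(["lat", "long", chunk["date"].map(lambda x: x.year)])["rf"].agg(["sum"])
    for chunk in chunks
)
print(df)
```

One caveat: a group key whose rows are split across two chunks shows up twice in the concatenated result, so a final pass such as `df.groupby(level=[0, 1, 2]).sum()` is needed to merge those partial sums.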
    
