I am trying to load a CSV file (around 250 MB) as a DataFrame with pandas. In my first try I used the typical `read_csv` command, but I receive a MemoryError. I have tried the approach…
I suggest that you install the 64-bit version of WinPython. A 32-bit Python process is limited to roughly 2 GB of addressable memory, which pandas can easily exhaust while parsing a 250 MB file; with the 64-bit version you should be able to load it without problems.
I'm late, but the actual problem with the posted code is that using `pd.concat([chunk for chunk in x])` effectively cancels any benefit of chunking, because it concatenates all those chunks back into one big DataFrame anyway. It can even temporarily require roughly twice the memory, since the individual chunks and the concatenated result coexist during the `concat`.
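To actually benefit from chunking, reduce or filter each chunk as it is read and keep only the small results, instead of reassembling the full DataFrame. Here is a minimal sketch using an in-memory CSV as a stand-in for the large file (the column name `value` and the data are made up for illustration):

```python
import io
import pandas as pd

# Small in-memory CSV standing in for the large file on disk.
csv_data = io.StringIO("value\n" + "\n".join(str(i) for i in range(10)))

# Anti-pattern: pd.concat([chunk for chunk in reader]) rebuilds the whole
# DataFrame, so chunking saves nothing.

# Better: aggregate per chunk; only the reduced value is kept in memory.
total = 0
for chunk in pd.read_csv(csv_data, chunksize=3):
    total += chunk["value"].sum()

print(total)  # sum of 0..9 = 45
```

The same pattern works for filtering: append only the rows each chunk contributes (e.g. `chunk[chunk["value"] > threshold]`) and concatenate those much smaller pieces at the end.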