Cannot use 'fread' on files of 3+ GB - Python datatable

Submitted by 混江龙づ霸主 on 2020-12-15 06:09:30

Question


Recently I've started to use datatable to manipulate data and faced the following issue:

Calling fread() on a 3+ GB file returns the error message below, while pandas reads the same file without any issues.

The code and error message are below:

Code:

import datatable as dt

data = dt.fread(file='LF.csv')

Error:

Traceback (most recent call last):
  File "D:/.../datatable test.py", line 8, in <module>
    data = dt.fread(file='LF.csv')
IOError: Unable to obtain size of LF.csv: [errno 132] value too large
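For context, errno 132 on Windows (the D:/ path suggests Windows) corresponds to EOVERFLOW, "Value too large", which typically appears when a 32-bit build cannot represent the size of a file larger than 2 GB in a stat() call. A minimal diagnostic sketch, assuming the file name 'LF.csv' from the question and checking the interpreter's bitness and the file's size:

```python
# Diagnostic sketch: errno 132 (EOVERFLOW on Windows/MSVC) usually means
# the size of a 3+ GB file overflows the 32-bit off_t used by a 32-bit
# interpreter or extension. Check both the build bitness and the file size.
import os
import struct


def diagnose(path):
    bits = struct.calcsize("P") * 8  # pointer size in bits: 32 or 64
    print(f"Python build: {bits}-bit")
    if os.path.exists(path):
        size_gib = os.path.getsize(path) / 1024**3
        print(f"{path}: {size_gib:.2f} GiB")
    else:
        print(f"{path}: not found")
    if bits == 32:
        print("A 32-bit interpreter is the likely cause; "
              "a 64-bit Python build (with datatable reinstalled) "
              "should be able to stat files larger than 2 GB.")


diagnose("LF.csv")  # 'LF.csv' is the file name from the question
```

If the first line reports a 32-bit build, switching to a 64-bit Python and reinstalling datatable is the usual remedy; if the build is already 64-bit, the problem lies elsewhere and the file-size line at least confirms how large the file really is.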

Could someone advise on this, please? If any additional information is needed, please let me know.

Thank you in advance!

Source: https://stackoverflow.com/questions/64534891/cannot-use-fread-to-files-of-3-gb-python-datatable
