Question
Recently I started using datatable to manipulate data and ran into the following issue:
The fread()
function returns an error when reading 3+ GB files.
At the same time, pandas reads the same file without any issues.
The code and error message are below:
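For comparison, the pandas call that succeeds on the same data is presumably along these lines (sketched here with a small stand-in CSV, since the original ~3 GB LF.csv is not reproducible):

```python
import os
import tempfile

import pandas as pd

# Small stand-in for LF.csv; the real file from the question is ~3 GB.
path = os.path.join(tempfile.mkdtemp(), "LF.csv")
with open(path, "w") as f:
    f.write("a,b\n1,2\n3,4\n")

df = pd.read_csv(path)
print(df.shape)  # (2, 2)
```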
Code:
import datatable as dt
data = dt.fread(file='LF.csv')
Error:
Traceback (most recent call last):
  File "D:/.../datatable test.py", line 8, in <module>
    data = dt.fread(file='LF.csv')
IOError: Unable to obtain size of LF.csv: [errno 132] value too large
Could someone advise on this, please? If any additional information is needed, please let me know.
Thank you in advance!
Source: https://stackoverflow.com/questions/64534891/cannot-use-fread-to-files-of-3-gb-python-datatable