Cannot use 'fread' on files of 3+ GB - Python datatable
Question

Recently I've started to use datatable to manipulate data and ran into the following issue: the fread() function returns an error when reading a 3+ GB file, while pandas reads the same file without any problems. The code and error message are below.

Code:

import datatable as dt
data = dt.fread(file='LF.csv')

Error:

Traceback (most recent call last):
  D:/.../datatable test.py:8 in <module>
    data = dt.fread(file='LF.csv')
IOError: Unable to obtain size of LF.csv: [errno 132] value too
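For reference, a minimal sketch of the pandas comparison mentioned above. The question does not show the exact pandas call, so read_csv with default arguments is an assumption:

import pandas as pd

# Assumed pandas call for comparison; the question only states that
# pandas reads the same LF.csv file without any issues.
df = pd.read_csv('LF.csv')
print(df.shape)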