I'm running a program which is processing 30,000 similar files. A random number of them stop and produce this error...
File "C:\\Importer\\src
You can try specifying the encoding explicitly, e.g. GBK:
df = pd.read_csv('./file_name.csv', encoding='gbk')
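If GBK turns out not to be the right encoding, the same idea can be extended into a small fallback loop. This is only a sketch: the path and the candidate encoding list below are my assumptions for illustration, not values from the answer.

import pandas as pd

# Hypothetical path and candidate encodings; the first one that decodes wins.
candidates = ['utf-8', 'gbk', 'ISO-8859-1', 'cp1252']
df = None
for enc in candidates:
    try:
        df = pd.read_csv('./file_name.csv', encoding=enc)
        print(f'Loaded with encoding={enc}')
        break
    except (UnicodeDecodeError, UnicodeError):
        continue
if df is None:
    raise ValueError('None of the candidate encodings worked')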
Try specifying engine='python'. It worked for me, though I'm still trying to figure out why.
df = pd.read_csv(input_file_path, ..., engine='python')
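A self-contained version of that call, with a made-up path; as a side benefit, the python engine also supports sep=None, which lets pandas sniff the delimiter:

import pandas as pd

# Hypothetical path; the python engine is slower than the default C engine
# but more tolerant, and sep=None asks it to detect the delimiter itself.
df = pd.read_csv('data/input.csv', engine='python', sep=None)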
Please try adding
encoding='unicode_escape'
This helped in my case. Also, make sure you're using the correct delimiter and column names, and you can start by loading just 1,000 rows to check the file quickly; see the sketch below.
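A sketch of what those suggestions look like in a single call; the path, delimiter, and column names here are placeholders I made up, not values from the question:

import pandas as pd

# Placeholder path, delimiter, and column names; nrows=1000 parses only the
# first 1,000 rows so you can verify the result before loading the full file.
# Add header=0 as well if the file already contains a header row.
df = pd.read_csv(
    'data/input.csv',
    encoding='unicode_escape',
    sep=';',
    names=['id', 'name', 'value'],
    nrows=1000,
)
print(df.head())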
In my case this worked for Python 2.7:
data = pd.read_csv(filename, encoding="ISO-8859-1", dtype={'name_of_column': unicode}, low_memory=False)
And for Python 3 the dtype is not needed, since str is already Unicode:
data = pd.read_csv(filename, encoding="ISO-8859-1", low_memory=False)
You can try this:
import pandas as pd
df = pd.read_csv(filepath, encoding='unicode_escape')
I am using a Jupyter notebook, and in my case the file was in the wrong format; the 'encoding' option was not helping. So I re-saved the CSV in UTF-8 format, and it worked. A sketch of doing that re-save in code is below.
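A minimal sketch of re-saving a file as UTF-8 from Python; the file names and the assumption that the source decodes as Latin-1 are mine, not part of the original answer:

import pandas as pd

# Latin-1 always decodes (every byte maps to a character), though it can
# mis-map characters if the real encoding is different; adjust as needed.
with open('original.csv', 'r', encoding='latin-1') as src:
    text = src.read()
with open('original_utf8.csv', 'w', encoding='utf-8') as dst:
    dst.write(text)

df = pd.read_csv('original_utf8.csv')  # now reads cleanly as UTF-8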