Processing a very large data set in Python - memory error
Question: I'm trying to process data obtained from a CSV file using the csv module in Python. There are about 50 columns and 401125 rows. I used the following code to put the data into a list:

    csv_file_object = csv.reader(open(r'some_path\Train.csv', 'rb'))
    header = csv_file_object.next()
    data = []
    for row in csv_file_object:
        data.append(row)

I can get the length of this list with len(data), and it returns 401125. I can even get each individual record by indexing into the list. But when I try to get the
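For reference, here is a minimal sketch of the same loading step written for Python 3, where reader.next() becomes next(reader) and the file is opened in text mode with newline='' (the 'rb' mode in the snippet above is the Python 2 convention). The path some_path\Train.csv is kept as the placeholder from the question:

    import csv

    # Open in text mode with newline='', as the csv module documentation
    # recommends for Python 3; the original snippet's 'rb' is Python 2 style.
    with open(r'some_path\Train.csv', newline='') as f:
        csv_file_object = csv.reader(f)
        header = next(csv_file_object)            # first row holds the column names
        data = [row for row in csv_file_object]   # materialises all ~401125 rows in memory

    print(len(data))  # should report 401125

Note that this still builds the entire list of rows in memory, just like the original code, so it illustrates the loading step rather than avoiding the memory pressure it causes.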