I am trying to insert data from a "large" CSV file (3 MB / 37,000 lines / 7 columns) into a MySQL database using Doctrine data fixtures. The process is very slow.
Two rules to follow when creating big batch imports like this:

1. Disable SQL logging, to avoid huge memory consumption: the logger otherwise keeps every executed query in memory.

   $manager->getConnection()->getConfiguration()->setSQLLogger(null);

2. Flush and clear frequently instead of only once at the end. I suggest you add the following inside your loop, to flush every 25 INSERTs (see the full sketch after this list):

   if ($i % 25 === 0) { $manager->flush(); $manager->clear(); }
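For reference, here is a minimal sketch of a fixture applying both rules. The City entity, its setter, and the cities.csv path are hypothetical placeholders; adapt them to your own 7 columns.

```php
<?php
// Minimal sketch, assuming a hypothetical App\Entity\City and a
// cities.csv file next to the fixture.

namespace App\DataFixtures;

use App\Entity\City;
use Doctrine\Bundle\FixturesBundle\Fixture;
use Doctrine\Persistence\ObjectManager;

class CityFixtures extends Fixture
{
    private const BATCH_SIZE = 25;

    public function load(ObjectManager $manager): void
    {
        // Rule 1: disable SQL logging so Doctrine does not keep
        // every executed query in memory.
        $manager->getConnection()->getConfiguration()->setSQLLogger(null);

        $handle = fopen(__DIR__.'/cities.csv', 'r');
        $i = 0;

        while (($row = fgetcsv($handle)) !== false) {
            $city = new City();
            $city->setName($row[0]);
            // ... map the remaining columns the same way ...

            $manager->persist($city);

            // Rule 2: flush and clear in batches so the unit of work
            // never holds more than BATCH_SIZE managed entities.
            if (++$i % self::BATCH_SIZE === 0) {
                $manager->flush();
                $manager->clear();
            }
        }

        fclose($handle);

        // Write the last partial batch.
        $manager->flush();
        $manager->clear();
    }
}
```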
EDIT: One last thing I forgot: don't keep entities in variables once you no longer need them. Here, in your loop, you only need the entity currently being processed, so don't store the previous entities in a $coordinatesfrcity array. If you keep doing that, $manager->clear() cannot actually free them (your own code still references them), and you will eventually run out of memory.
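To illustrate, inside the loop from the sketch above (hypothetical keys and values):

```php
// Bad: every City stays reachable until the fixture ends, so memory
// grows with all 37,000 rows despite flush()/clear().
$coordinatesfrcity[$row[0]] = $city;

// Better, if you need a lookup table later: keep only plain values,
// never the managed entities themselves.
$coordinatesfrcity[$row[0]] = [$row[1], $row[2]];
```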