I am trying to insert data from a "large" CSV file (3 MB / 37,000 lines / 7 columns) into a MySQL database using Doctrine data fixtures.
The process is very slow.
For fixtures which need lots of memory but don't depend on each other, I get around this problem by using the append flag to insert one entity (or smaller group of entities) at a time:
bin/console doctrine:fixtures:load --fixtures="memory_hungry_fixture.file" --append
Then I write a Bash script which runs that command as many times as I need.
In your case, you could extend the fixtures command with a flag that loads the entities in batches: the first 1,000 rows, then the next 1,000, and so on.
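As a rough sketch of that wrapper script, assuming a custom fixtures command that reads hypothetical `BATCH_OFFSET` / `BATCH_SIZE` parameters to pick its slice of the CSV (neither is a built-in option of `doctrine:fixtures:load`), you could generate one invocation per batch like this:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Print one fixture-load command per batch of CSV rows.
print_batch_commands() {
  local total_rows=$1 batch_size=$2 offset
  for ((offset = 0; offset < total_rows; offset += batch_size)); do
    # BATCH_OFFSET / BATCH_SIZE are hypothetical parameters a custom
    # fixtures command would read to select its slice of the CSV.
    echo "BATCH_OFFSET=$offset BATCH_SIZE=$batch_size bin/console doctrine:fixtures:load --fixtures=memory_hungry_fixture.file --append"
  done
}

# 37,000 rows in batches of 1,000 gives 37 invocations.
print_batch_commands 37000 1000
```

Piping the output to `bash` (or swapping `echo` for the command itself) runs each batch in a fresh process, so memory is released between batches.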