Symfony: Doctrine data fixture: how to handle a large CSV file?

Asked by 刺人心 on 2021-02-20 12:35

I am trying to insert data from a "large" CSV file (3 MB / 37,000 lines / 7 columns) into a MySQL database using Doctrine data fixtures.

The process is very slow and at …

3 Answers
  •  别那么骄傲
    2021-02-20 12:43

    Two rules to follow when you write big batch imports like this:

    • Disable SQL logging ($manager->getConnection()->getConfiguration()->setSQLLogger(null);) so Doctrine stops accumulating every executed query in memory.

    • Flush and clear frequently instead of only once at the end. I suggest you add if ($i % 25 == 0) { $manager->flush(); $manager->clear(); } inside your loop, to flush every 25 INSERTs (see the sketch after this answer).

    EDIT: One last thing I forgot: don't keep references to entities once you no longer need them. In your loop you only need the entity currently being processed, so don't store previous entities in a $coordinatesfrcity array. Keeping every entity referenced prevents PHP from garbage-collecting them and will eventually exhaust memory.
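
    A minimal sketch pulling the three tips together, assuming a recent Symfony/Doctrine fixtures setup. The City entity, its setters, and the CSV path and columns are hypothetical stand-ins for your own; only the Doctrine calls come from the tips above.

        <?php

        namespace App\DataFixtures;

        use App\Entity\City;                        // hypothetical entity
        use Doctrine\Bundle\FixturesBundle\Fixture;
        use Doctrine\Persistence\ObjectManager;

        class CityFixtures extends Fixture
        {
            private const BATCH_SIZE = 25;

            public function load(ObjectManager $manager): void
            {
                // Tip 1: stop Doctrine from keeping every executed query in memory.
                $manager->getConnection()->getConfiguration()->setSQLLogger(null);

                $handle = fopen(__DIR__.'/../../data/cities.csv', 'r'); // hypothetical path
                $i = 0;

                while (($row = fgetcsv($handle)) !== false) {
                    // Tip 3: only the current entity is referenced; nothing is
                    // appended to a growing array such as $coordinatesfrcity.
                    $city = new City();
                    $city->setName($row[0]);
                    $city->setLatitude((float) $row[1]);
                    $city->setLongitude((float) $row[2]);
                    $manager->persist($city);

                    // Tip 2: flush and clear in small batches so the unit of
                    // work never holds more than BATCH_SIZE pending entities.
                    if (++$i % self::BATCH_SIZE === 0) {
                        $manager->flush();
                        $manager->clear();
                    }
                }

                fclose($handle);
                $manager->flush(); // write the last partial batch
                $manager->clear();
            }
        }

    The batch size of 25 simply follows the suggestion above; larger batches mean fewer flushes but more memory held per batch, so for simple rows like these, values in the low hundreds are also common.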
