Symfony: Doctrine data fixture: how to handle a large CSV file?

Asked by 刺人心 on 2021-02-20 12:35

I am trying to insert (into a MySQL database) data from a "large" CSV file (3 MB / 37,000 lines / 7 columns) using Doctrine data fixtures.

The process is very slow and at …

3 Answers
  •  庸人自扰
    2021-02-20 12:56

    There is a great example in the Doctrine docs on batch processing: http://doctrine-orm.readthedocs.org/projects/doctrine-orm/en/latest/reference/batch-processing.html

    Use a modulo (x % y) expression to implement batching; this example flushes and clears the EntityManager every 20 inserts. You may be able to tune the batch size depending on your server.

    $batchSize = 20;
    for ($i = 1; $i <= 10000; ++$i) {
        $user = new CmsUser();
        $user->setStatus('user');
        $user->setUsername('user' . $i);
        $user->setName('Mr.Smith-' . $i);
        $em->persist($user);
        if (($i % $batchSize) === 0) {
            $em->flush(); // execute the queued INSERTs
            $em->clear(); // detach all objects from Doctrine to free memory
        }
    }
    $em->flush(); // persist objects that did not make up an entire batch
    $em->clear();
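
    Applied to the original question, the same pattern can be combined with fgetcsv() to stream the CSV row by row instead of loading it all at once. Below is a minimal sketch, assuming a fixture class, a data.csv file next to it, and a hypothetical Product entity with name and price columns; adapt the entity and column mapping to your own schema.

    <?php

    use Doctrine\Bundle\FixturesBundle\Fixture;
    use Doctrine\Persistence\ObjectManager;

    class CsvFixture extends Fixture
    {
        private const BATCH_SIZE = 20;

        public function load(ObjectManager $manager): void
        {
            // data.csv is a hypothetical path; point this at your real file.
            $handle = fopen(__DIR__ . '/data.csv', 'r');

            $i = 0;
            while (($row = fgetcsv($handle)) !== false) {
                // Product and its setters are placeholders for your entity.
                $product = new Product();
                $product->setName($row[0]);
                $product->setPrice((float) $row[1]);
                $manager->persist($product);

                if ((++$i % self::BATCH_SIZE) === 0) {
                    $manager->flush();
                    $manager->clear(); // detach managed entities to keep memory flat
                }
            }
            fclose($handle);

            $manager->flush(); // persist the final partial batch
            $manager->clear();
        }
    }

    If memory use is still high, it is often Doctrine's SQL logger keeping every executed query in memory; when $manager is an EntityManager you can disable it with $manager->getConnection()->getConfiguration()->setSQLLogger(null) before the loop (DBAL 2.x API).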
    
