Copying millions of rows of data from one database to another in Laravel

余生长醉 submitted on 2019-12-11 04:03:45

Question


Using Laravel Eloquent, I'm copying 7 million rows of data from one table in my old MySQL database into several different tables in my new MySQL database. The problem is that this took almost a full day, and I need to repeat the process for roughly 80 million more rows. I'm processing the data in chunks of 1,000 rows at a time. Is there any way to do this more efficiently? Here is my code:

    DB::connection('oldDataBase')->table('tableToCopy')->chunk(1000, function ($AllData) {
        foreach ($AllData as $Data) {
            DB::connection('newDataBase')->table('table1')->insert([
                'column1' => $Data->columnToCopy,
                // etc.
            ]);

            DB::connection('newDataBase')->table('table2')->insert([
                'column1' => $Data->columnToCopy,
                // etc.
            ]);

            DB::connection('newDataBase')->table('table3')->insert([
                'column1' => $Data->columnToCopy,
                // etc.
            ]);
        }
    });
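If the copy has to stay in Laravel, the biggest single win is usually replacing the per-row inserts with one bulk insert per chunk, so each 1,000-row chunk costs three queries instead of 3,000. A rough sketch under that assumption (the column names are placeholders carried over from the question, and `chunk()` is given an explicit ordering column, here assumed to be `id`):

```php
DB::connection('oldDataBase')->table('tableToCopy')
    ->orderBy('id') // chunk() needs a stable ordering; assumes an `id` column exists
    ->chunk(1000, function ($AllData) {
        $rows1 = $rows2 = $rows3 = [];
        foreach ($AllData as $Data) {
            $rows1[] = ['column1' => $Data->columnToCopy /* etc. */];
            $rows2[] = ['column1' => $Data->columnToCopy /* etc. */];
            $rows3[] = ['column1' => $Data->columnToCopy /* etc. */];
        }
        // One multi-row INSERT per table per chunk instead of one per row.
        DB::connection('newDataBase')->table('table1')->insert($rows1);
        DB::connection('newDataBase')->table('table2')->insert($rows2);
        DB::connection('newDataBase')->table('table3')->insert($rows3);
    });
```

Laravel's `insert()` accepts an array of row arrays and issues a single multi-row `INSERT`, which cuts round trips and per-statement overhead dramatically. This is still slower than the bulk-load approach described in the answer, but it is a small change to the existing code.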

Answer 1:


Doing this data migration through a SQL client such as Laravel's query builder is not a good idea.

If I had to move 80M rows, I'd take the following steps:

  1. dump them in CSV form.
  2. split the CSV files into chunks of something like 500K rows each.
  3. create tables on the target system
  4. disable all constraints and indexes on the target system.
  5. use LOAD DATA INFILE to slurp up the CSV files one after the other. For the fastest results, run this from the mysql or mysqlimport command-line client on the same machine as the MySQL server.
  6. re-enable the constraints and build the indexes.
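The split-and-load steps above can be sketched in shell. The database names, file paths, and row counts here are invented for illustration, and the mysql invocations are shown commented out because they need a running server (and the FILE privilege for `INTO OUTFILE`):

```shell
set -eu
workdir=$(mktemp -d)

# Step 1 (shown, not executed here): dump the source table to CSV on the server.
# mysql oldDataBase -e "SELECT * FROM tableToCopy INTO OUTFILE '$workdir/tableToCopy.csv'
#   FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"

# Simulate step 1's output so step 2 has something to work on:
# 1.2 million fake rows of "id,value" pairs.
seq 1 1200000 | awk '{print $1",value"$1}' > "$workdir/tableToCopy.csv"

# Step 2: split into 500K-row chunks with numeric suffixes (chunk_00, chunk_01, ...).
split -l 500000 -d "$workdir/tableToCopy.csv" "$workdir/chunk_"
ls "$workdir" | grep chunk_

# Step 5 (shown, not executed here): load each chunk on the target server.
# for f in "$workdir"/chunk_*; do
#   mysql newDataBase -e "LOAD DATA INFILE '$f' INTO TABLE table1
#     FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"
# done
```

With 1.2M rows and 500K-row chunks, `split` produces two full chunks and one 200K-row remainder; each chunk is then a manageable unit to load, retry, or verify independently.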

I'd test this extensively before migration day, for example by loading the first and last CSV chunks and then re-enabling the indexes and constraints.

Another possibility was suggested in a comment: use mysqldump, then load the resulting file via the mysql client program.
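Assuming both servers are reachable from one machine, the mysqldump variant can even skip the intermediate file by piping the dump straight into the target. The database names below are placeholders from the question; the command is echoed here rather than executed, since it needs live servers:

```shell
# Placeholder names: oldDataBase / newDataBase / tableToCopy are from the question.
# --single-transaction takes a consistent InnoDB snapshot without locking the source.
migrate_cmd='mysqldump --single-transaction oldDataBase tableToCopy | mysql newDataBase'
echo "$migrate_cmd"   # shown here; run the command itself on migration day
```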

Avoid GUI-style MySQL client programs for this; stick with the command-line tools. As good as those GUI clients are, they aren't engineered for streaming in multi-tens-of-megabyte .SQL files.



Source: https://stackoverflow.com/questions/42012233/copying-million-rows-of-data-from-one-database-to-another-on-laravel
