Question
Using Laravel Eloquent, I'm copying 7 million rows of data from one table in my old MySQL database and inserting those rows into several different tables in my new MySQL database. The problem is that it took almost a whole day to do this, and I need to repeat the process for roughly 80 million more rows. I'm working in chunks of 1,000 rows at a time. Is there any way to do this more efficiently? Here is my code:
DB::connection('oldDataBase')->table('tableToCopy')->chunk(1000, function ($AllData) {
    foreach ($AllData as $Data) {
        // One INSERT per row for each target table
        DB::connection('newDataBase')->table('table1')->insert([
            'column1' => $Data->columnToCopy,
            // etc..
        ]);
        DB::connection('newDataBase')->table('table2')->insert([
            'column1' => $Data->columnToCopy,
            // etc..
        ]);
        DB::connection('newDataBase')->table('table3')->insert([
            'column1' => $Data->columnToCopy,
            // etc..
        ]);
    }
});
Answer 1:
Doing this data migration from a SQL client like Laravel is not a good idea.
If I had to move 80M rows, I'd take the following steps:
- dump them in CSV form.
- split the CSV files into chunks of something like 500K rows each.
- create tables on the target system.
- disable all constraints and indexes on the target system.
- use LOAD DATA INFILE to slurp up the CSV files one after the other. For fastest results this should be run from the mysql or mysqlimport command-line client program running on the same machine as the MySQL server (a sketch of this step follows the list).
- re-enable the constraints and build the indexes.
I'd test this extensively before migration day. I'd do things like loading the first and last CSV chunks and then re-enabling the indexes and constraints.
Another possibility was suggested in a comment: use mysqldump, then load the resulting file via the mysql client program.
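A minimal sketch of that route, assuming the connection names from the question (oldDataBase, newDataBase) also happen to be the actual schema names, which may not be true in the real setup:

# Dump the source table (--single-transaction takes a consistent InnoDB snapshot,
# --quick streams rows instead of buffering them), then replay the dump on the target.
mysqldump --single-transaction --quick oldDataBase tableToCopy > tableToCopy.sql
mysql newDataBase < tableToCopy.sql

The intermediate file can be skipped by piping mysqldump straight into mysql, at the cost of losing a restartable dump file if the load fails partway through.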
Avoid using a GUI-style MySQL client program for this; stick with the command-line programs. As good as those GUI clients are, they aren't engineered for streaming in multi-tens-of-megabyte .sql files.
Source: https://stackoverflow.com/questions/42012233/copying-million-rows-of-data-from-one-database-to-another-on-laravel