Question
In my table I insert around 20,000 rows on each load. Right now I am doing it one by one. From the MySQL website I learned that inserting multiple rows with a single INSERT query is faster.
Can I insert all 20,000 in a single query?
What will happen if there are errors within these 20,000 rows? How will MySQL handle that?
Answer 1:
If you are inserting the rows from some other table, you can use the INSERT ... SELECT pattern to insert them.
However, if you are inserting the values using the INSERT ... VALUES pattern, the size of the statement is limited by max_allowed_packet.
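For example, a minimal sketch of the INSERT ... SELECT form, assuming the 20,000 rows already sit in a staging table (here called staging_table, a hypothetical name) with matching columns:

INSERT INTO `table1` (`column1`, `column2`)
SELECT `column1`, `column2`
FROM `staging_table`;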
Also, from the docs:
To optimize insert speed, combine many small operations into a single large operation. Ideally, you make a single connection, send the data for many new rows at once, and delay all index updates and consistency checking until the very end.
Example:
INSERT INTO `table1` (`column1`, `column2`) VALUES ("d1", "d2"),
("d1", "d2"),
("d1", "d2"),
("d1", "d2"),
("d1", "d2");
What will happen if there are errors within these 20,000 rows?
If there is an error while inserting the records (for example a duplicate key), the statement is aborted. On a transactional engine such as InnoDB the whole multi-row INSERT is rolled back; on MyISAM the rows inserted before the error remain.
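If you would rather skip bad rows than abort, or you split the 20,000 rows across several statements, two common options are INSERT IGNORE, which turns errors such as duplicate keys into warnings and keeps going, and an explicit transaction around the statements (InnoDB). A rough sketch, reusing the table from the example above:

-- skip rows that would cause duplicate-key (and similar) errors instead of aborting
INSERT IGNORE INTO `table1` (`column1`, `column2`) VALUES
("d1", "d2"),
("d1", "d2");

-- if the rows are split across several INSERT statements, run them in one
-- transaction so they are committed or rolled back together (InnoDB)
START TRANSACTION;
INSERT INTO `table1` (`column1`, `column2`) VALUES ("d1", "d2"), ("d1", "d2");
INSERT INTO `table1` (`column1`, `column2`) VALUES ("d1", "d2"), ("d1", "d2");
COMMIT;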
Answer 2:
http://dev.mysql.com/doc/refman/5.5/en/insert.html
INSERT statements that use VALUES syntax can insert multiple rows. To do this, include multiple lists of column values, each enclosed within parentheses and separated by commas.
Example:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
You can use code to generate the VALUES section of the INSERT based on your data source.
Errors: if there are errors in the INSERT statement (including in any of the rows), the operation will be aborted.
Generating the query will depend on your data source. For example, if you are getting the data from an array of associative arrays in PHP, you would do something like this:
$sql = "INSERT INTO tbl_name (a, b, c) VALUES ";
foreach($dataset as $row)
{
$sql .= "(" + $row['a'] + ", " + $row['a'] + ", " + $row['a'] + ")";
// OR
$sql .= "($row[a], $row[b], $row[c])";
}
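In practice, for 20,000 rows you will usually want to send the data in chunks so that each statement stays under max_allowed_packet. A minimal sketch of that, assuming a mysqli connection in $mysqli and the same $dataset array (each value escaped before it is embedded in the SQL):

$chunkSize = 1000; // rows per INSERT; tune so each statement stays well under max_allowed_packet
foreach (array_chunk($dataset, $chunkSize) as $chunk)
{
    $values = array();
    foreach ($chunk as $row)
    {
        $values[] = "('" . $mysqli->real_escape_string($row['a']) . "', '"
                  . $mysqli->real_escape_string($row['b']) . "', '"
                  . $mysqli->real_escape_string($row['c']) . "')";
    }
    $mysqli->query("INSERT INTO tbl_name (a, b, c) VALUES " . implode(", ", $values));
}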
Some more resources:
Optimize MySQL Queries – Fast Inserts With Multiple Rows
The fastest way to insert 100K records
Answer 3:
Batch insert with SQL: INSERT INTO table (col1, ..., coln) VALUES (col1, ..., coln), (col1, ..., coln), .... However, the length of the SQL statement is limited by max_allowed_packet, which defaults to 1 MB; you can raise that parameter to allow a bigger single INSERT.
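For instance, you can check the current limit and raise it with SHOW VARIABLES / SET GLOBAL (SET GLOBAL needs the appropriate privilege and only applies to new connections; put the setting in my.cnf to make it permanent):

SHOW VARIABLES LIKE 'max_allowed_packet';
-- raise it to 64 MB for new connections
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;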
Source: https://stackoverflow.com/questions/22164070/mysql-insert-20k-rows-in-single-insert