I am currently using this type of SQL on MySQL to insert multiple rows of values in a single query:

INSERT INTO `tbl` (`key1`,`key2`) VALUES ('r1v1','r1v2'),('r2v1','r2v2'),('r3v1','r3v2');
Here is my real-world example, which inserts all German postcodes into an empty table (town names are added later):
// obtain column template
$stmt = $db->prepare('SHOW COLUMNS FROM towns');
$stmt->execute();
$columns = array_fill_keys(array_values($stmt->fetchAll(PDO::FETCH_COLUMN)), null);
// multiple INSERT
$postcode = '01000'; // smallest German postcode
while ($postcode <= 99999) { // highest German postcode
    $values = array();
    while ($postcode <= 99999) {
        // reset row
        $row = $columns;
        // now fill our row with data
        $row['postcode'] = sprintf('%05d', $postcode);
        // build INSERT array
        foreach ($row as $value) {
            $values[] = $value;
        }
        $postcode++;
        // avoid memory kill
        if (!($postcode % 10000)) {
            break;
        }
    }
    // build query
    $count_columns = count($columns);
    $placeholder = ',(' . substr(str_repeat(',?', $count_columns), 1) . ')'; // ,(?,?,?)
    $placeholder_group = substr(str_repeat($placeholder, count($values) / $count_columns), 1); // (?,?,?),(?,?,?)...
    $into_columns = implode(',', array_keys($columns)); // col1,col2,col3
    // this part is optional:
    $on_duplicate = array();
    foreach ($columns as $column => $row) {
        $on_duplicate[] = $column;
        $on_duplicate[] = $column;
    }
    $on_duplicate = ' ON DUPLICATE KEY UPDATE' . vsprintf(substr(str_repeat(', %s = VALUES(%s)', $count_columns), 1), $on_duplicate);
    // execute query
    $stmt = $db->prepare('INSERT INTO towns (' . $into_columns . ') VALUES' . $placeholder_group . $on_duplicate); // INSERT INTO towns (col1,col2,col3) VALUES(?,?,?),(?,?,?)... {ON DUPLICATE...}
    $stmt->execute($values);
}
As you can see, it's fully flexible. You don't need to check the number of columns or track which position a column is in. You only need to set the insert data:
$row['postcode'] = sprintf('%05d', $postcode);
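For example, assuming the towns table also had a name column (hypothetical here), one extra assignment inside the inner loop would be enough; the column list and placeholders adapt automatically:

    $row['postcode'] = sprintf('%05d', $postcode);
    $row['name'] = $town_name; // hypothetical: only valid if `name` exists in towns and $town_name holds your data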
I'm quite pleased with the query string constructors, as they work without heavy array functions like array_merge(). vsprintf() in particular was a good find.
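To illustrate what the string builders produce, here is a standalone sketch assuming a three-column table with col1, col2, col3 and two rows of data:

    $columns = array('col1' => null, 'col2' => null, 'col3' => null);
    $count_columns = count($columns); // 3
    $placeholder = ',(' . substr(str_repeat(',?', $count_columns), 1) . ')';
    // ",(?,?,?)"
    $placeholder_group = substr(str_repeat($placeholder, 2), 1); // 2 rows of values
    // "(?,?,?),(?,?,?)"
    $on_duplicate = array();
    foreach ($columns as $column => $unused) {
        $on_duplicate[] = $column;
        $on_duplicate[] = $column;
    }
    echo vsprintf(substr(str_repeat(', %s = VALUES(%s)', $count_columns), 1), $on_duplicate);
    // " col1 = VALUES(col1), col2 = VALUES(col2), col3 = VALUES(col3)"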
Finally, I needed the two nested while() loops to avoid exceeding the memory limit. The batch size depends on your memory limit, but overall it's a good general solution to avoid problems (and sending 10 queries is still much better than sending 10,000).
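If you prefer an explicit batch size instead of tying the flush to the postcode value, one possible variation (just a sketch, assuming $count_columns is computed before the loops) is:

    $batch_rows = 10000; // assumed batch size, tune it to your memory limit
    // inside the inner while(), replace the modulo check with:
    if (count($values) / $count_columns >= $batch_rows) {
        break; // flush this batch; the outer while() starts the next one
    }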