I am currently using this type of SQL on MySQL to insert multiple rows of values in a single query:
INSERT INTO `tbl` (`key1`,`key2`) VALUES ('r1v1','r1v2'), ('r2v1','r2v2'), ...
Since it has not been suggested yet, I'm pretty sure LOAD DATA INFILE is still the fastest way to load data, as it disables indexing, inserts all the data, and then re-enables the indexes - all in a single request.
Saving the data as a CSV should be fairly trivial, keeping fputcsv in mind. MyISAM is fastest, but you still get big performance gains in InnoDB. There are other disadvantages, though, so I would only go this route if you are inserting a lot of data, and not bother for under 100 rows.
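To illustrate, here is a minimal sketch of that route (not drop-in code): it assumes a tbl table with columns key1 and key2, and a PDO connection $db opened with PDO::MYSQL_ATTR_LOCAL_INFILE => true so LOCAL INFILE is permitted by both client and server.
// write the rows to a temporary CSV file with fputcsv()
$rows = array(array('r1v1', 'r1v2'), array('r2v1', 'r2v2'));
$csvPath = tempnam(sys_get_temp_dir(), 'bulk');
$fh = fopen($csvPath, 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);
// bulk-load the whole file in one statement
$db->exec("LOAD DATA LOCAL INFILE " . $db->quote($csvPath) . "
    INTO TABLE `tbl`
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    (`key1`, `key2`)");
unlink($csvPath);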
Same answer as Mr. Balagtas, slightly clearer...
Recent versions of MySQL and PHP PDO do support multi-row INSERT statements.
The SQL will look something like this, assuming a 3-column table you'd like to INSERT into.
INSERT INTO tbl_name
(colA, colB, colC)
VALUES (?, ?, ?), (?, ?, ?), (?, ?, ?) [,...]
ON DUPLICATE KEY UPDATE works as expected even with a multi-row INSERT; append this:
ON DUPLICATE KEY UPDATE colA = VALUES(colA), colB = VALUES(colB), colC = VALUES(colC)
Your PHP code will follow the usual $pdo->prepare($qry) and $stmt->execute($params) PDO calls.
$params will be a 1-dimensional array of all the values to pass to the INSERT.
In the above example, it should contain 9 elements; PDO will use every set of 3 as a single row of values. (Inserting 3 rows of 3 columns each = 9-element array.)
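For instance (hypothetical values), 3 rows of 3 columns flatten like this:
$params = array(
    'a1', 'b1', 'c1',   // row 1
    'a2', 'b2', 'c2',   // row 2
    'a3', 'b3', 'c3',   // row 3
);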
The code below is written for clarity, not efficiency. Work with the PHP array_*() functions for better ways to map or walk through your data if you'd like. Whether you can use transactions obviously depends on your MySQL table type.
Assuming:
$tblName - the string name of the table to INSERT into
$colNames - 1-dimensional array of the column names of the table. These column names must be valid MySQL column identifiers; escape them with backticks (``) if they are not.
$dataVals - multi-dimensional array, where each element is a 1-d array of a row of values to INSERT
// setup data values for PDO
// memory warning: this is creating a copy of all of $dataVals
$dataToInsert = array();
foreach ($dataVals as $row => $data) {
    foreach ($data as $val) {
        $dataToInsert[] = $val;
    }
}
// (optional) setup the ON DUPLICATE column names
$updateCols = array();
foreach ($colNames as $curCol) {
    $updateCols[] = $curCol . " = VALUES($curCol)";
}
$onDup = implode(', ', $updateCols);
// setup the placeholders - a fancy way to make the long "(?, ?, ?)..." string
$rowPlaces = '(' . implode(', ', array_fill(0, count($colNames), '?')) . ')';
$allPlaces = implode(', ', array_fill(0, count($dataVals), $rowPlaces));
$sql = "INSERT INTO $tblName (" . implode(', ', $colNames) .
") VALUES " . $allPlaces . " ON DUPLICATE KEY UPDATE $onDup";
// and then the PHP PDO boilerplate; start a transaction so commit() is valid
$pdo->beginTransaction();
$stmt = $pdo->prepare($sql);
try {
    $stmt->execute($dataToInsert);
    $pdo->commit();
} catch (PDOException $e) {
    $pdo->rollBack();
    echo $e->getMessage();
}
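For example (hypothetical table and values), the inputs for a 2-row INSERT into the 3-column table above would be:
$tblName  = 'tbl_name';
$colNames = array('colA', 'colB', 'colC');
$dataVals = array(
    array('A1', 'B1', 'C1'),
    array('A2', 'B2', 'C2'),
);
// the code above then executes one INSERT with 6 bound values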
For what it is worth, I have seen a lot of users recommend iterating through INSERT statements instead of building the query out as a single string, as the selected answer did. I decided to run a simple test with just two fields and a very basic insert statement:
<?php
require('conn.php');
$fname = 'J';
$lname = 'M';
$time_start = microtime(true);
$stmt = $db->prepare('INSERT INTO `table` (FirstName, LastName) VALUES (:fname, :lname)');
for ($i = 1; $i <= 10; $i++) {
    $stmt->bindParam(':fname', $fname);
    $stmt->bindParam(':lname', $lname);
    $stmt->execute();
    $fname .= 'O';
    $lname .= 'A';
}
$time_end = microtime(true);
$time = $time_end - $time_start;
echo "Completed in ". $time ." seconds <hr>";
$fname2 = 'J';
$lname2 = 'M';
$time_start2 = microtime(true);
$qry = 'INSERT INTO `table` (FirstName, LastName) VALUES ';
$qry .= implode(', ', array_fill(0, 10, '(?,?)'));
$stmt2 = $db->prepare($qry);
$values = array();
for ($j = 1; $j <= 10; $j++) {
    $values2 = array($fname2, $lname2);
    $values = array_merge($values, $values2);
    $fname2 .= 'O';
    $lname2 .= 'A';
}
$stmt2->execute($values);
$time_end2 = microtime(true);
$time2 = $time_end2 - $time_start2;
echo "Completed in ". $time2 ." seconds <hr>";
?>
While each run took only milliseconds, the latter (single-string) query was consistently 8 times faster or more. If this were scaled up to reflect an import of thousands of rows with many more columns, the difference could be enormous.
I had the same problem, and this is how I solved it for myself: I wrote a function for it (and you can use it if it helps you).
Example:
INSERT INTO countries (country, city) VALUES ('Germany', 'Berlin'), ('France', 'Paris');
$arr1 = Array("country" => "Germany", "city" => "Berlin");
$arr2 = Array("country" => "France", "city" => "Paris");
insertMultipleData("countries", Array($arr1, $arr2));
// Inserting multiple rows into the database.
public function insertMultipleData($table, $multi_params){
    try {
        $db = $this->connect();
        $beforeParams = "";
        $paramsStr = "";
        $valuesStr = "";
        // build the column list (from the first row's keys) and one placeholder group per row
        for ($i = 0; $i < count($multi_params); $i++) {
            foreach ($multi_params[$i] as $j => $value) {
                if ($i == 0) {
                    $beforeParams .= " " . $j . ",";
                }
                $paramsStr .= " :" . $j . "_" . $i . ",";
            }
            $paramsStr = substr_replace($paramsStr, "", -1);
            $valuesStr .= "(" . $paramsStr . "),";
            $paramsStr = "";
        }
        $beforeParams = substr_replace($beforeParams, "", -1);
        $valuesStr = substr_replace($valuesStr, "", -1);
        $sql = "INSERT INTO " . $table . " (" . $beforeParams . ") VALUES " . $valuesStr . ";";
        $stmt = $db->prepare($sql);
        // bind every value to its row-suffixed placeholder
        for ($i = 0; $i < count($multi_params); $i++) {
            foreach ($multi_params[$i] as $j => &$value) {
                $stmt->bindParam(":" . $j . "_" . $i, $value);
            }
        }
        // execute first, then close the connection
        $stmt->execute();
        $this->close($db);
        return true;
    } catch (PDOException $e) {
        return false;
    }
}
// Making connection to the Database
public function connect(){
    $host = Constants::DB_HOST;
    $dbname = Constants::DB_NAME;
    $user = Constants::DB_USER;
    $pass = Constants::DB_PASS;
    $mysql_connect_str = 'mysql:host=' . $host . ';dbname=' . $dbname;
    $dbConnection = new PDO($mysql_connect_str, $user, $pass);
    $dbConnection->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    return $dbConnection;
}
// Closing the connection
public function close(&$db){
    $db = null; // null the caller's reference so PDO can close the connection
}
If insertMultipleData($table, $multi_params) returns TRUE, your data has been inserted into your database.
Here is another (slim) solution for this issue:
At first you need to count the data of the source array (here: $aData) with count(). Then you use array_fill() to generate a new array with as many entries as the source array has, each with the value "(?,?)" (the number of placeholders depends on the fields you use; here: 2). Then the generated array gets imploded, using a comma as glue. Within the foreach loop, you need to generate another index based on the number of placeholders you use (number of placeholders * current array index + 1), adding 1 to the generated index after each bound value.
$do = $db->prepare("INSERT INTO `table` (id, name) VALUES " . implode(',', array_fill(0, count($aData), '(?,?)')));
foreach ($aData as $iIndex => $aValues) {
    $iRealIndex = 2 * $iIndex + 1;
    $do->bindValue($iRealIndex, $aValues['id'], PDO::PARAM_INT);
    $iRealIndex = $iRealIndex + 1;
    $do->bindValue($iRealIndex, $aValues['name'], PDO::PARAM_STR);
}
$do->execute();
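For completeness (hypothetical values), $aData is expected to look like this: one sub-array per row, keyed by field name.
$aData = array(
    array('id' => 1, 'name' => 'foo'),
    array('id' => 2, 'name' => 'bar'),
);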
My real-world example to insert all German postcodes into an empty table (to add town names later):
// obtain column template
$stmt = $db->prepare('SHOW COLUMNS FROM towns');
$stmt->execute();
$columns = array_fill_keys(array_values($stmt->fetchAll(PDO::FETCH_COLUMN)), null);
// multiple INSERT
$postcode = '01000'; // smallest German postcode
while ($postcode <= 99999) { // highest German postcode
    $values = array();
    while ($postcode <= 99999) {
        // reset row
        $row = $columns;
        // now fill our row with data
        $row['postcode'] = sprintf('%05d', $postcode);
        // build INSERT array
        foreach ($row as $value) {
            $values[] = $value;
        }
        $postcode++;
        // avoid memory kill
        if (!($postcode % 10000)) {
            break;
        }
    }
    // build query
    $count_columns = count($columns);
    $placeholder = ',(' . substr(str_repeat(',?', $count_columns), 1) . ')'; // ,(?,?,?)
    $placeholder_group = substr(str_repeat($placeholder, count($values) / $count_columns), 1); // (?,?,?),(?,?,?)...
    $into_columns = implode(',', array_keys($columns)); // col1,col2,col3
    // this part is optional:
    $on_duplicate = array();
    foreach ($columns as $column => $row) {
        $on_duplicate[] = $column;
        $on_duplicate[] = $column;
    }
    $on_duplicate = ' ON DUPLICATE KEY UPDATE' . vsprintf(substr(str_repeat(', %s = VALUES(%s)', $count_columns), 1), $on_duplicate);
    // execute query
    $stmt = $db->prepare('INSERT INTO towns (' . $into_columns . ') VALUES' . $placeholder_group . $on_duplicate); // INSERT INTO towns (col1,col2,col3) VALUES(?,?,?),(?,?,?)... {ON DUPLICATE...}
    $stmt->execute($values);
}
As you can see, it's fully flexible. You don't need to check the number of columns or check at which position your column is. You only need to set the insert data:
$row['postcode'] = sprintf('%05d', $postcode);
I'm proud of some of the query string constructors, as they work without heavy array functions like array_merge. Especially vsprintf() was a good find.
Finally, I needed to add the two while() loops to avoid exceeding the memory limit. The batch size depends on your memory limit, but in general it's a good solution to avoid problems (and having 10 queries is still much better than 10,000).
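The same batching idea can also be sketched with array_chunk() (hypothetical $allRows data and insertBatch() helper, not part of the code above):
// split the full data set into batches of 10,000 rows,
// then run one multi-row INSERT per batch to keep memory bounded
foreach (array_chunk($allRows, 10000) as $batch) {
    insertBatch($db, $batch); // hypothetical helper that prepares and executes
}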