I am currently using this type of SQL on MySQL to insert multiple rows of values in a single query:
INSERT INTO `tbl` (`key1`,`key2`) VALUES ('r1v1','r1v2'),('r2v1','r2v2'),...
Here is my solution, based on the auraphp/Aura.Sql library: https://github.com/sasha-ch/Aura.Sql
Usage example:
$q = "insert into t2(id,name) values (?,?), ... on duplicate key update name=name";
$bind_values = [ [[1,'str1'],[2,'str2']] ];
$pdo->perform($q, $bind_values);
Bug reports are welcome.
This is how I did it:
First, define the column names you'll use, or leave it blank and PDO will assume you want to use all of the columns in the table; in that case you'll need to supply the row values in the exact order they appear in the table.
$cols = '(name, middleName, eMail)';
$table = 'people';
Now, suppose you have a two-dimensional array already prepared. Iterate over it and build a string with your row values, like so:
$rowVals = '';
foreach ($people as $person) {
// assuming each $person is array(name, middleName, eMail)
list($name, $middleName, $eMail) = $person;
if (!$rowVals) {
$rowVals = "('$name','$middleName','$eMail')";
} else {
$rowVals .= ",('$name','$middleName','$eMail')";
}
}
What you just did was check whether $rowVals was already set: if not, create it with the first row's values; otherwise append the next row, separated by a comma, so the result stays a valid statement. Note that string values must be wrapped in single quotes inside the double-quoted PHP string so MySQL recognizes them as strings.
All that's left to do is prepare the statement and execute it:
$stmt = $db->prepare("INSERT INTO $table $cols VALUES $rowVals");
$stmt->execute();
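Since the row values are concatenated directly into the SQL string, it is safer to escape them first. A minimal sketch using PDO::quote(), assuming $db is the same PDO connection used above:
// PDO::quote() escapes the value and adds the surrounding single quotes
$row = '(' . implode(',', array_map([$db, 'quote'], [$name, $middleName, $eMail])) . ')';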
Tested with up to 2000 rows so far, and the execution time is negligible. I will run some more tests and will get back here in case I have something further to contribute.
Regards.
Based on my experiments, I found that a MySQL INSERT statement with multiple value rows in a single transaction is the fastest one.
However, if there is too much data, MySQL's max_allowed_packet setting might restrict the single-transaction insert with multiple value rows. Hence, the following functions will fail when the data is greater than MySQL's max_allowed_packet size:
singleTransactionInsertWithRollback
singleTransactionInsertWithPlaceholders
singleTransactionInsert
The most successful one for the huge-data insert scenario is the transactionSpeed method, but it takes more time than the methods mentioned above. So, to handle this problem, you can either split your data into smaller chunks and call the single-transaction insert multiple times (sketched below), or give up execution speed by using the transactionSpeed method.
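As an illustration of the chunking approach, here is a minimal sketch (not part of the benchmark below), assuming a PDO connection in $pdo and a $rows array of [first_name, last_name] pairs for the same `name` table:
// the current server limit can be checked with: SELECT @@max_allowed_packet
$chunkSize = 1000; // rows per INSERT; tune so each statement stays below max_allowed_packet
foreach (array_chunk($rows, $chunkSize) as $chunk) {
    $placeholders = rtrim(str_repeat('(?, ?), ', \count($chunk)), ', ');
    $params = [];
    foreach ($chunk as $row) {
        $params[] = $row[0];
        $params[] = $row[1];
    }
    $sth = $pdo->prepare('INSERT INTO `name` (`first_name`, `last_name`) VALUES '.$placeholders);
    $sth->execute($params);
}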
Here's my research
<?php
class SpeedTestClass
{
private $data;
private $pdo;
public function __construct()
{
$this->data = [];
try {
$this->pdo = new \PDO('mysql:dbname=test_data', 'admin', 'admin');
} catch (\PDOException $e) {
die('Failed to connect to database: '.$e->getMessage());
}
}
public function createData()
{
$prefix = 'test';
$postfix = 'unicourt.com';
$salutations = ['Mr.', 'Ms.', 'Dr.', 'Mrs.'];
$csv = [];
$csv[] = ['Salutation', 'First Name', 'Last Name', 'Email Address'];
for ($i = 0; $i < 100000; ++$i) {
$csv[] = [
$salutations[$i % \count($salutations)],
$prefix.$i,
$prefix.$i,
$prefix.$i.'@'.$postfix,
];
}
$this->data = $csv;
}
public function truncateTable()
{
$this->pdo->query('TRUNCATE TABLE `name`');
}
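// transactionSpeed: one prepared statement, executed once per row inside an explicit transaction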
public function transactionSpeed()
{
$timer1 = microtime(true);
$this->pdo->beginTransaction();
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
$sth = $this->pdo->prepare($sql);
foreach (\array_slice($this->data, 1) as $values) {
$sth->execute([
':first_name' => $values[1],
':last_name' => $values[2],
]);
}
// $timer2 = microtime(true);
// echo 'Prepare Time: '.($timer2 - $timer1).PHP_EOL;
// $timer3 = microtime(true);
if (!$this->pdo->commit()) {
echo "Commit failed\n";
}
$timer4 = microtime(true);
// echo 'Commit Time: '.($timer4 - $timer3).PHP_EOL;
return $timer4 - $timer1;
}
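// autoCommitSpeed: one prepared statement, executed once per row with autocommit (no explicit transaction)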
public function autoCommitSpeed()
{
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES (:first_name, :last_name)';
$sth = $this->pdo->prepare($sql);
foreach (\array_slice($this->data, 1) as $values) {
$sth->execute([
':first_name' => $values[1],
':last_name' => $values[2],
]);
}
$timer2 = microtime(true);
return $timer2 - $timer1;
}
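// noBindAutoCommitSpeed: a new statement per row with the values interpolated directly, no binding, autocommit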
public function noBindAutoCommitSpeed()
{
$timer1 = microtime(true);
foreach (\array_slice($this->data, 1) as $values) {
$sth = $this->pdo->prepare("INSERT INTO `name` (`first_name`, `last_name`) VALUES ('{$values[1]}', '{$values[2]}')");
$sth->execute();
}
$timer2 = microtime(true);
return $timer2 - $timer1;
}
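// singleTransactionInsert: a single multi-row INSERT built by string concatenation, executed once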
public function singleTransactionInsert()
{
$timer1 = microtime(true);
$arr = [];
foreach (\array_slice($this->data, 1) as $values) {
$arr[] = "('{$values[1]}', '{$values[2]}')";
}
$sth = $this->pdo->prepare('INSERT INTO `name` (`first_name`, `last_name`) VALUES '.implode(', ', $arr));
$sth->execute();
$timer2 = microtime(true);
return $timer2 - $timer1;
}
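// singleTransactionInsertWithPlaceholders: a single multi-row INSERT using ? placeholders, executed once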
public function singleTransactionInsertWithPlaceholders()
{
$placeholders = [];
$arr = [];
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
foreach (\array_slice($this->data, 1) as $values) {
$placeholders[] = '(?, ?)';
$arr[] = $values[1];
$arr[] = $values[2];
}
$sql .= implode(', ', $placeholders);
$sth = $this->pdo->prepare($sql);
$sth->execute($arr);
$timer2 = microtime(true);
return $timer2 - $timer1;
}
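// singleTransactionInsertWithRollback: the placeholder variant wrapped in beginTransaction()/commit()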
public function singleTransactionInsertWithRollback()
{
$placeholders = [];
$arr = [];
$timer1 = microtime(true);
$sql = 'INSERT INTO `name` (`first_name`, `last_name`) VALUES ';
foreach (\array_slice($this->data, 1) as $values) {
$placeholders[] = '(?, ?)';
$arr[] = $values[1];
$arr[] = $values[2];
}
$sql .= implode(', ', $placeholders);
$this->pdo->beginTransaction();
$sth = $this->pdo->prepare($sql);
$sth->execute($arr);
$this->pdo->commit();
$timer2 = microtime(true);
return $timer2 - $timer1;
}
}
$s = new SpeedTestClass();
$s->createData();
$s->truncateTable();
echo "Time Spent for singleTransactionInsertWithRollback: {$s->singleTransactionInsertWithRollback()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert: {$s->singleTransactionInsert()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for single Transaction Insert With Placeholders: {$s->singleTransactionInsertWithPlaceholders()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for transaction: {$s->transactionSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for AutoCommit: {$s->noBindAutoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
echo "Time Spent for autocommit with bind: {$s->autoCommitSpeed()}".PHP_EOL;
$s->truncateTable();
The results for 100,000 entries for a table containing only two columns are as follows:
$ php data.php
Time Spent for singleTransactionInsertWithRollback: 0.75147604942322
Time Spent for single Transaction Insert: 0.67445182800293
Time Spent for single Transaction Insert With Placeholders: 0.71131205558777
Time Spent for transaction: 8.0056409835815
Time Spent for AutoCommit: 35.4979159832
Time Spent for autocommit with bind: 33.303519010544
What about something like this:
if(count($types_of_values)>0){
$uid = 1;
$sql = "";
$values = array();
foreach($types_of_values as $k=>$v){
$sql .= "(:id_$k,:kind_of_val_$k), ";
$values[":id_$k"] = $uid;
$values[":kind_of_val_$k"] = $v;
}
$sql = substr($sql,0,-2);
$query = "INSERT INTO table (id,value_type) VALUES $sql";
$res = $this->db->prepare($query);
$res->execute($values);
}
The idea behind this is to cycle through your array values, adding an "id number" to each placeholder name on every loop iteration, while at the same time adding the values to the array used for binding the parameters. If you don't like using the key index from the array, you could set $i = 0 before the loop and increment $i++ inside it (a sketch of that variant follows the note below). Either works in this example; even if you have associative arrays with named keys it would still work, provided the keys are unique. With a little work it would be fine for nested arrays too.
Note that substr() strips the $sql variable's trailing comma and space; if you don't have a space, you'd need to change this to -1 rather than -2.
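For completeness, a minimal sketch of the counter variant mentioned above, using $i instead of the array key (same assumed $types_of_values array and $uid):
$uid = 1;
$i = 0;
$sql = "";
$values = array();
foreach ($types_of_values as $v) {
    $sql .= "(:id_$i,:kind_of_val_$i), ";
    $values[":id_$i"] = $uid;
    $values[":kind_of_val_$i"] = $v;
    $i++;
}
$sql = substr($sql, 0, -2);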
Although this is an old question, all the contributions helped me a lot, so here's my solution, which works within my own DbContext class. The $rows parameter is simply an array of associative arrays representing rows or models: field name => insert value.
If you use a pattern that uses models, this fits in nicely when passed model data as an array, say from a ToRowArray method within the model class.
Note: it should go without saying, but never allow the arguments passed to this method to be exposed to the user or reliant on any user input, other than the insert values, which should have been validated and sanitised. The $tableName argument and the column names should be defined by the calling logic; for instance, a User model could be mapped to the user table, which has its column list mapped to the model's member fields.
public function InsertRange($tableName, $rows)
{
// Get column list
$columnList = array_keys($rows[0]);
$numColumns = count($columnList);
$columnListString = implode(",", $columnList);
// Generate pdo param placeholders
$placeHolders = array();
foreach($rows as $row)
{
$temp = array();
for($i = 0; $i < count($row); $i++)
$temp[] = "?";
$placeHolders[] = "(" . implode(",", $temp) . ")";
}
$placeHolders = implode(",", $placeHolders);
// Construct the query
$sql = "insert into $tableName ($columnListString) values $placeHolders";
$stmt = $this->pdo->prepare($sql);
$j = 1;
foreach($rows as $row)
{
for($i = 0; $i < $numColumns; $i++)
{
$stmt->bindParam($j, $row[$columnList[$i]]);
$j++;
}
}
$stmt->execute();
}
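A hypothetical usage example; $users, $dbContext, the User class and its ToRowArray() method are assumptions for illustration, not part of the code above:
$rows = array();
foreach ($users as $user) {
    // ToRowArray() returns field name => insert value for one row
    $rows[] = $user->ToRowArray();
}
$dbContext->InsertRange('user', $rows);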
Multiple Values Insert with PDO Prepared Statements
Inserting multiple values in one execute statement. Why? Because according to this page it is faster than regular inserts.
$datafields = array('fielda', 'fieldb', ... );
$data[] = array('fielda' => 'value', 'fieldb' => 'value' ....);
$data[] = array('fielda' => 'value', 'fieldb' => 'value' ....);
Add more data values here, or you probably have a loop that populates $data.
With prepared inserts you need to know the fields you're inserting into, and the number of fields, in order to create the ? placeholders to bind your parameters.
insert into table (fielda, fieldb, ... ) values (?,?...), (?,?...)....
That is basically how we want the insert statement to look.
Now, the code:
function placeholders($text, $count=0, $separator=","){
$result = array();
if($count > 0){
for($x=0; $x<$count; $x++){
$result[] = $text;
}
}
return implode($separator, $result);
}
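// e.g. placeholders('?', 3) returns "?,?,?"; it is used below to build one placeholder group per row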
$pdo->beginTransaction(); // also helps speed up your inserts.
$insert_values = array();
$question_marks = array();
foreach($data as $d){
$question_marks[] = '(' . placeholders('?', sizeof($d)) . ')';
$insert_values = array_merge($insert_values, array_values($d));
}
$sql = "INSERT INTO table (" . implode(",", $datafields ) . ") VALUES " .
implode(',', $question_marks);
$stmt = $pdo->prepare ($sql);
try {
$stmt->execute($insert_values);
} catch (PDOException $e){
echo $e->getMessage();
}
$pdo->commit();
That said, in my test there was only about a one-second difference between using multi-row inserts and regular prepared inserts with a single value.