My code looks like this:
// Connect to SQLite DB
DB('/path/to/sqlite.db');
DB('BEGIN TRANSACTION;');
// These loops are just examples.
for ($i = 1; $i <= 10000; $i++) {
    for ($j = 1; $j <= 100; $j++) {
        DB('INSERT INTO "test" ("id", "name") VALUES (?, ?)', $i, 'Testing ' . $j);
    }
}
Unfortunately I think that the problem may be with the structure of your code.
In your loop of INSERT statements, the statements are all identical and there is no need to call $db->prepare each time. The idea behind prepared statements is that you call $db->prepare() once, and execute() can be called multiple times on the same statement object. You're calling $db->prepare() every time, which is causing overhead in parsing the SQL statement and creating a new object.
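In plain PDO terms the pattern is just this (a minimal sketch, reusing the table and column names from the question):

$db = new PDO('sqlite:/path/to/sqlite.db');

// Prepare the statement once, outside the loop...
$stmt = $db->prepare('INSERT INTO "test" ("id", "name") VALUES (?, ?)');

// ...then execute it many times with different values
for ($i = 1; $i <= 10000; $i++) {
    $stmt->execute(array($i, 'Testing ' . $i));
}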
Consider re-writing your DB() function like this:
function do_query($db, $query, $pdo_statement, $args)
{
    if ($pdo_statement->execute($args) === true)
    {
        // INSERT: return the id of the newly inserted row
        if (stripos($query, 'INSERT') === 0)
        {
            return $db->lastInsertId();
        }
        // SELECT: return all rows
        if (stripos($query, 'SELECT') === 0)
        {
            return $pdo_statement->fetchAll(PDO::FETCH_ASSOC);
        }
        // Anything else (UPDATE, DELETE, ...): return the affected row count
        return $pdo_statement->rowCount();
    }
}
function DB($query)
{
    static $db = null;
    // A file path opens the connection; anything else is treated as a query
    if (is_file($query) === true)
    {
        $db = new PDO('sqlite:' . $query, null, null, array(PDO::ATTR_ERRMODE => PDO::ERRMODE_WARNING));
    }
    else if (is_a($db, 'PDO') === true)
    {
        $result = $db->prepare($query);
        if (is_a($result, 'PDOStatement') === true)
        {
            $args = func_get_args();
            // Second argument is an array: prepare once, execute once per set of values
            if (isset($args[1]) && is_array($args[1]))
            {
                $ret = array();
                foreach ($args[1] as $row)
                {
                    $ret[] = do_query($db, $query, $result, $row);
                }
                return $ret;
            }
            // Scalar arguments: execute the statement once
            return do_query($db, $query, $result, array_slice($args, 1));
        }
        return false;
    }
    return true;
}
So if you want to run the same query with lots of values, you create a two-dimensional array of the values you want to insert and call DB('INSERT INTO....', $values). The DB() function checks whether the second parameter (after $query) is an array, and if so it loops through it, executing $query once per set of values. This way the loop doesn't re-prepare the SQL statement each time, it just re-executes it with different values. The return value of the function is an array of the results of each execution.
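As a quick usage sketch (assuming the rewritten DB() and do_query() above, and the "test" table from the question):

DB('/path/to/sqlite.db');
DB('BEGIN TRANSACTION;');

// Build the two-dimensional array of values, one inner array per row
$values = array();
for ($i = 1; $i <= 10000; $i++) {
    $values[] = array($i, 'Testing ' . $i);
}

// One prepare, many executes; returns an array with lastInsertId() for each row
$ids = DB('INSERT INTO "test" ("id", "name") VALUES (?, ?)', $values);
DB('COMMIT;');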
The DB function as originally posted issues a filesystem stat() system call each time it is run to check if the query string is a file. While that isn't solely responsible for the slow execution, it contributes to it.
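A minimal way around that, assuming the connection is always opened before the first real query, is to only hit the filesystem while there is no connection yet:

function DB($query)
{
    static $db = null;
    if ($db === null)
    {
        // Only the very first call has to stat() the filesystem
        if (is_file($query) === true)
        {
            $db = new PDO('sqlite:' . $query, null, null, array(PDO::ATTR_ERRMODE => PDO::ERRMODE_WARNING));
            return true;
        }
        return false;
    }
    // ...the prepare/execute logic from above stays the same
}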
If you are inserting lots of data into a table, try inserting the data in one query.
$query = 'INSERT INTO "test" ("id", "name") VALUES ';
$data = array();
for ($i = 1; $i <= 10000; $i++) {
    for ($j = 1; $j <= 100; $j++) {
        $query .= '(?,?),';
        $data[] = $i;
        $data[] = 'Testing ' . $j;
    }
}
$query = substr($query, 0, -1); // drop the trailing comma
DB($query, $data);
This should eliminate the overhead of issuing single INSERT queries. There is a limit on the query length though; if you run into trouble with it, issue DB() calls more often inside the loop, in smaller batches.
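For example, you could flush the query in batches of a few hundred rows (400 here is just a guess that keeps the parameter count under SQLite's default limits; it assumes DB() binds the flat $data array in one execute, as in the snippet above):

$batch = 400; // rows per INSERT statement
$prefix = 'INSERT INTO "test" ("id", "name") VALUES ';
$query = $prefix;
$data = array();
$rows = 0;
for ($i = 1; $i <= 10000; $i++) {
    for ($j = 1; $j <= 100; $j++) {
        $query .= '(?,?),';
        $data[] = $i;
        $data[] = 'Testing ' . $j;
        if (++$rows === $batch) {
            DB(substr($query, 0, -1), $data); // drop the trailing comma and run the batch
            $query = $prefix;
            $data = array();
            $rows = 0;
        }
    }
}
if ($rows > 0) {
    DB(substr($query, 0, -1), $data); // flush the final partial batch
}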
Why do you use prepared statements if you "prepare" them inside the loop (in the DB function)?
Prepare the statement once before the loop, and inside the loop just supply the values and execute. Of course this would require a rewrite of your DB method.
Oh, and by the way: is your ID column the primary key? If so you have another problem, because the same "i" would be inserted for 100 different values of "j" :)
For example:
$sth = $dbh->prepare('INSERT INTO "test" ("id", "name") VALUES (:id, :name)');
$j = 0;
for ($i = 1; $i <= 10000; $i++) {
    $j = ($j == 100) ? 0 : $j + 1; // cycle $j through 1..100
    $sth->execute(array(':id' => $i, ':name' => 'Testing ' . $j));
}