I have 24 databases with a table labeled email_queue.
I have another database with a list of all the databases that have the email_queue table.
One way to do this is with curl_multi_init.
Split your script in two. Make one PHP file (say email_out.php) take the database name (or some variable that's used to look up the database name; that lookup can happen either in the loop or inside email_out.php), and have it send the mass email for that single database.
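For illustration, here's a minimal sketch of what email_out.php might look like. It assumes the central database lists the others in a table with id/db_name columns, that email_queue has address/subject/body columns, and that the query parameter is named t (to match the loop at the end); none of those details come from the question, so adjust to your schema.

&lt;?php
// email_out.php -- hypothetical sketch: drain the email_queue of one database.
$t = isset($_GET['t']) ? (int) $_GET['t'] : 0;

// Look up the target database name from the central database (assumed schema).
$central = new PDO('mysql:host=localhost;dbname=central', 'user', 'pass');
$stmt = $central->prepare('SELECT db_name FROM databases WHERE id = ?');
$stmt->execute(array($t));
$dbName = $stmt->fetchColumn();

// Connect to that database and send everything in its email_queue table.
$db = new PDO("mysql:host=localhost;dbname=$dbName", 'user', 'pass');
$sent = 0;
foreach ($db->query('SELECT address, subject, body FROM email_queue') as $row) {
    if (mail($row['address'], $row['subject'], $row['body'])) {
        $sent++;
    }
}

// Fork::run() below json_decodes the response, so return JSON.
echo json_encode(array('db' => $dbName, 'sent' => $sent));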
The second part uses curl_multi_init to request the email_out.php script multiple times, effectively creating multiple separate connections to different databases. The requests can all finish at different times since they run in parallel. Essentially, your loop now adds the script to the curl multi handle several times with different arguments and then executes them all asynchronously.
class Fork
{
    private $_handles = array();
    private $_mh;

    function __construct()
    {
        $this->_mh = curl_multi_init();
    }

    // Queue one URL as a cURL handle on the multi handle.
    function add($url)
    {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($this->_mh, $ch);
        $this->_handles[] = $ch;
        return $this;
    }

    // Run all queued handles in parallel and return the decoded responses.
    function run()
    {
        $running = null;
        do {
            curl_multi_exec($this->_mh, $running);
            usleep(250000);
        } while ($running > 0);

        $data = array();
        for ($i = 0; $i < count($this->_handles); $i++) {
            $out = curl_multi_getcontent($this->_handles[$i]);
            $data[$i] = json_decode($out);
            curl_multi_remove_handle($this->_mh, $this->_handles[$i]);
        }
        curl_multi_close($this->_mh);
        return $data;
    }
}
(from http://gonzalo123.com/2010/10/11/speed-up-php-scripts-with-asynchronous-database-queries/)
So your loop would look something like this:
$fork = new Fork();
for ($i = 0; $i < 24; $i++) {
    $fork->add("email_out.php?t=" . $i);
}
$fork->run();
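Two details worth noting: cURL needs an absolute URL (e.g. "http://localhost/email_out.php?t=" . $i rather than a relative path, with the host adjusted to wherever the script lives), and since run() json_decodes each response, email_out.php should echo JSON. You can then capture what run() returns; the db/sent fields here match the hypothetical sketch of email_out.php above, not anything from the original answer:

// Each entry is the json_decode()d output of one email_out.php request.
$results = $fork->run();
foreach ($results as $result) {
    echo $result->db . ": " . $result->sent . " emails sent\n";
}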