I have a PHP script with a runtime of 34 seconds, but it dies after 30 seconds. I guess my webhost imposes a time limit of 30 seconds.
I am thinking of splitting the script up.
You should use the set_time_limit() function; it helps in most cases.
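A minimal sketch of what that looks like (note that some shared hosts disable set_time_limit() entirely, in which case the call silently has no effect):

```php
<?php
// Lift the 30-second execution limit for the rest of this script.
set_time_limit(0);   // 0 = no limit at all

// ...or raise it just enough for the 34-second job:
set_time_limit(60);  // allow up to 60 more seconds from this point

echo ini_get('max_execution_time'); // reflects the new limit
```

The counter restarts from the point of the call, so calling it again inside a long loop effectively resets the clock each iteration.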
Alternatively, on Linux/Unix, you can try running the script as a background process. The PHP CLI can be used for this purpose; scripts running via the CLI have no time limit by default. You can use exec(), system(), or similar PHP functions to fire up the PHP CLI and have it run a script in the background, returning control to the calling script immediately. In most cases, a PHP script running via the CLI behaves just as it would in a CGI environment, apart from a few environment-related differences such as the absence of a time limit.
Here is one way to do it:
exec("/usr/bin/php script.php > /dev/null &");
      ^            ^          ^           ^
      |            |          |           |
      |            |          |           +-- run the specified process in the background
      |            |          +-------------- redirect script output to nothing
      |            +------------------------- your time-consuming script
      +-------------------------------------- path to the PHP CLI (not PHP CGI)
More details at: Running a Background Process in PHP
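If you need to pass arguments to the background script, a hedged sketch (the binary and script paths are illustrative; adjust them for your host):

```php
<?php
// Launch a worker script in the background and return immediately.
$script = '/path/to/worker.php';  // hypothetical worker path
$arg    = 'some value';

// escapeshellarg() prevents shell injection when building the command.
// "2>&1" also silences stderr, and the trailing "&" detaches the process
// so exec() does not wait for it to finish.
$cmd = sprintf(
    '/usr/bin/php %s %s > /dev/null 2>&1 &',
    escapeshellarg($script),
    escapeshellarg($arg)
);
exec($cmd);
```

Redirecting both stdout and stderr matters: if either stream stays attached, the shell may keep the calling script blocked until the worker exits.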
Running it via the CLI automatically removes the time limit. You can use cron for that, as described by Salman A. I have a script that runs every 30 minutes. It works like this:
<?php
$timeLimit = 1740; // 29 minutes, just under the 30-minute cron interval
$startTime = time();

do {
    $dhandle = opendir("/some/dir/to/process");
    while (false !== ($file = readdir($dhandle))) {
        // process $file here
    }
    closedir($dhandle); // release the directory handle before sleeping
    sleep(30);
} while (!IsShouldStop($startTime, $timeLimit));

function IsShouldStop($startTime, $timeLimit)
{
    // Stop once the time limit has elapsed and the clock is near the
    // end of a half-hour window (:29 or :59), so the next cron run
    // takes over cleanly.
    $now = time();
    $min = intval(date("i"));
    if (($now > $startTime + $timeLimit) && (($min % 30) >= 29)) {
        echo "STOPPING...<br />\n";
        return true;
    }
    return false;
}
?>
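For completeness, the matching crontab entry would look something like this (the paths and log file are illustrative, not from my actual setup):

```
*/30 * * * * /usr/bin/php /path/to/worker.php >> /tmp/worker.log 2>&1
```

The `*/30` schedule starts a fresh process at :00 and :30 of every hour, which is why the script above aims to exit just before the :29/:59 boundary.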
Why did I do that? Because I've read that PHP is not great at garbage collection, so I kill the script every 30 minutes. It's not particularly robust, but given the constraints of shared hosting, this is my best approach. You can use it as a template.
Have a look at set_time_limit().