Is there a way to time out a function? I have 10 minutes to perform a job. The job includes a for loop; here is an example:
Before starting the loop, store the current time() in a variable. At the end of each iteration, check whether time() has advanced more than 60 seconds past that starting value; if so, break. For instance:
<?php
$startTime = time();
foreach ($arr as $key => $value) {
    some_function($key, $value); // This function does SSH and SFTP stuff
    if (time() - $startTime > 60) break; // change 60 to the max time, in seconds
}
?>
It depends on your implementation. 99% of the functions in PHP are blocking, meaning processing will not continue until the current function has completed. However, if the function contains a loop, you can add your own code to interrupt the loop after a certain condition has been met.
Something like this:
foreach ($array as $value) {
    perform_task($value);
}

function perform_task($value) {
    $start_time = time();
    while (true) {
        if ((time() - $start_time) > 300) {
            return false; // timeout: the task has run longer than 300 seconds
        }
        // Other processing
    }
}
Another example where interrupting the processing IS NOT possible:
foreach ($array as $value) {
    perform_task($value);
}

function perform_task($value) {
    // preg_replace() is a blocking function:
    // there's no way to break out of it after a certain amount of time.
    return preg_replace('/pattern/', 'replace', $value);
}
You could try implementing this using cURL. From PHP, cURL is mostly used for HTTP/FTP, but it supports many more protocols, including SSH/SFTP. I've never done anything SSH/SFTP-related with cURL, so I can't give any additional advice, but you should be able to find more information here on Stack Overflow or elsewhere.
There is an option, CURLOPT_TIMEOUT, you can use to configure the timeout for your request (there is also CURLOPT_CONNECTTIMEOUT). You can even specify the timeouts with millisecond resolution (CURLOPT_TIMEOUT_MS / CURLOPT_CONNECTTIMEOUT_MS).
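As a sketch of those options (the host, credentials, and remote path below are placeholders, and this assumes your libcurl was built with SFTP/libssh2 support):

```php
<?php
// Hypothetical SFTP download with a hard timeout. Host, credentials,
// and path are placeholders; requires libcurl built with SFTP support.
$ch = curl_init('sftp://example.com/remote/file.txt');
curl_setopt($ch, CURLOPT_USERPWD, 'user:password');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // max seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 60);        // max seconds for the entire transfer
$data = curl_exec($ch);
if ($data === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
}
curl_close($ch);
```

On timeout, curl_exec() returns false and curl_error() reports the reason, so the calling script regains control instead of hanging.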
Anyway: for a cronjob I would recommend using an additional "lock file" that your script creates when it starts and removes when it finishes. Check for this lock file at the top of your script; if cron triggers your script before the last run has finished, it can simply exit without doing anything. That ensures your script will not be executed multiple times in parallel.
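A minimal sketch of that lock-file guard (the lock path is an arbitrary choice; using flock() rather than a plain file-exists check avoids the race between testing for the file and creating it):

```php
<?php
// Lock-file guard for a cron job. /tmp/myjob.lock is an arbitrary path.
$lock = fopen('/tmp/myjob.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    // A previous run still holds the lock: exit without doing anything.
    exit(0);
}
// ... do the actual work here ...
flock($lock, LOCK_UN);
fclose($lock);
```

A nice property of flock() is that the OS releases the lock automatically if the script crashes, so a dead run can't block all future runs the way a leftover lock file would.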
You can find more on the cURL options, and on cURL itself, in the PHP documentation. You have to make sure that your installed libcurl and/or the PHP cURL extension supports SSH/SFTP, though.
Hope that helps ...
Take a look at set_time_limit. The function itself isn't really what you want, as it only takes execution time into consideration, meaning 5 seconds of real time might still be only 0.2 seconds of execution time. But the reason for looking at the link is the comments: there are a couple of solutions that users have posted.
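To illustrate that caveat, a short sketch (on Unix-like systems, time spent blocked in sleep(), database calls, or network I/O does not count toward set_time_limit's budget; note also that CLI scripts often run with max_execution_time set to 0, i.e. no limit at all):

```php
<?php
set_time_limit(2); // allow 2 seconds of *execution* time
sleep(5);          // on Linux, sleeping is not execution time: no timeout fires
echo "still running after a 5-second sleep\n";
```

So set_time_limit caps CPU-bound runaway loops, but it will not rescue you from a slow SSH/SFTP call that is blocked waiting on the network.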
PHP is single-threaded... you have to use your OS's ability to fork another process.
The only way I know to do this is with the exec command, forking off a separate full process. Put the work you want timed in a separate script, call exec on that script, then wait for up to 10 minutes. If after 10 minutes the process is still running, kill it. (You can get its PID by having the command run the new PHP script in the background and echo the PID from the command line, returning it from the exec call.)
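A sketch of that approach, where worker.php is a hypothetical script holding the timed work (this assumes a Unix-like system with the posix extension, and polls rather than sleeping blindly for the full 10 minutes):

```php
<?php
// Launch the hypothetical worker.php in the background; `echo $!`
// prints the shell's PID for the job, which shell_exec() returns to us.
$pid = (int) shell_exec('php worker.php > /dev/null 2>&1 & echo $!');

$deadline = time() + 600; // 10 minutes from now
while (time() < $deadline) {
    if (posix_getpgid($pid) === false) {
        exit(0); // worker finished within the time limit
    }
    sleep(5); // poll every 5 seconds instead of one blind 10-minute sleep
}
posix_kill($pid, 9); // SIGKILL: the worker exceeded 10 minutes
```

Polling in a loop means the parent returns as soon as the worker finishes, instead of always burning the full 10 minutes before checking.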
For those of you who are on the command line, here is how sleep() gets interrupted:
<?php
echo date("H:i:s\n");

pcntl_signal(SIGALRM, function ($signal) {
    echo "arghhh! i got interrupted\n";
}, true);

pcntl_alarm(3);          // deliver SIGALRM in 3 seconds
sleep(20);               // interrupted after ~3 seconds, not 20
pcntl_signal_dispatch(); // run the pending handler
echo date("H:i:s\n");

pcntl_alarm(3);
sleep(20);
echo date("H:i:s\n");
pcntl_signal_dispatch();
echo "end\n";