I have a PHP script that I run via cron every 10 minutes on CentOS.
The problem is that if the cron job takes more than 10 minutes, a second copy of the script starts while the first one is still running. How can I make sure only one instance runs at a time?
This is a very common problem with a very simple solution: cronjoblock, a simple 8-line shell-script wrapper that applies locking using flock:
https://gist.github.com/coderofsalvation/1102e56d3d4dcbb1e36f
By the way, cronjoblock also reverses cron's spammy email behaviour: it only outputs something if something goes wrong. This is handy in combination with cron's MAILTO variable: the stdout/stderr output is suppressed (so cron will not send mail) unless the wrapped process exits with a code greater than 0.
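cronjoblock itself is a shell wrapper, but the same "only produce output when something fails" idea can be sketched inside a PHP cron script by buffering output and only emitting it on failure. A rough illustration only, not part of cronjoblock (assumes PHP 7+ for Throwable):

<?php
// Buffer everything the job prints; cron only mails the MAILTO address
// when the script actually writes output.
ob_start();

$failed = false;
try {
    // ... the actual cron work goes here, echoing diagnostics as usual ...
} catch (Throwable $e) {
    echo $e->getMessage(), "\n";
    $failed = true;
}

if ($failed) {
    ob_end_flush(); // emit the buffered output so cron sends a mail
    exit(1);        // non-zero exit code signals the failure
} else {
    ob_end_clean(); // discard the output: no mail from cron
}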
Advisory locking is made for exactly this purpose.
You can accomplish advisory locking with flock(). Simply apply the function to a previously opened lock file to determine if another script has a lock on it.
$f = fopen('lock', 'w') or die ('Cannot create lock file');
if (flock($f, LOCK_EX | LOCK_NB)) {
// yay
}
In this case I'm adding LOCK_NB to prevent the next script from waiting until the first one has finished. Since you're using cron, there will always be a next script.
If the current script prematurely terminates, any file locks will get released by the OS.
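Putting those pieces together, a minimal sketch of such a cron entry point might look like this (the lock file path is just an example):

<?php
// The handle must stay open for the whole run: closing it, or the script
// exiting for any reason, releases the lock.
$f = fopen('/tmp/my-cron-job.lock', 'w') or die('Cannot create lock file');

// LOCK_NB: return immediately instead of waiting for the other instance.
if (!flock($f, LOCK_EX | LOCK_NB)) {
    // Another instance still holds the lock, so bail out quietly.
    exit(0);
}

// ... do the actual work here ...

// Optional: the OS releases the lock automatically on exit anyway.
flock($f, LOCK_UN);
fclose($f);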
I use this:
<?php
// Create a lock file so only one copy of the script runs at a time
if (is_file(dirname($_SERVER['SCRIPT_NAME']) . "/.processing")) { die(); }
file_put_contents(dirname($_SERVER['SCRIPT_NAME']) . "/.processing", "processing");

// SCRIPT CONTENTS GOES HERE //

// Remove the lock file once the work is done
@unlink(dirname($_SERVER['SCRIPT_NAME']) . "/.processing");
?>
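One caveat with this approach: if the script dies before reaching the unlink(), the .processing file is left behind and every later run will die() immediately. A variant sketch that stores the real PID and treats the lock as stale when that process no longer exists (assumes the POSIX extension is available and the job always runs as the same user):

<?php
$lockFile = __DIR__ . '/.processing';

if (is_file($lockFile)) {
    $pid = (int) file_get_contents($lockFile);
    // Signal 0 doesn't send anything; it only checks whether the process exists.
    if ($pid > 0 && posix_kill($pid, 0)) {
        die('Already running');
    }
    // Stale lock file from a crashed run: fall through and take over.
}

file_put_contents($lockFile, getmypid());

// SCRIPT CONTENTS GOES HERE //

unlink($lockFile);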
I was running a PHP cron job script that dealt specifically with sending text messages using an existing API. On my local box the cron job was working fine, but on my customer's box it was sending double messages. Although this didn't make sense to me, I double-checked the permissions on the folder responsible for sending messages and found it was owned by root. Once I set the owner to www-data (Ubuntu), it started behaving normally.
This might not be the issue for you, but if it's a simple cron script, I would double-check the permissions.
#!/bin/bash
# Start the PHP script only if no copy of it is already running
ps -ef | grep -v grep | grep capture_12hz_sampling_track.php > /dev/null
if [ $? -eq 1 ]; then
    nohup /usr/local/bin/php /opt/Apache/htdocs/cmsmusic_v2/script/Mp3DownloadProcessMp4/capture_12hz_sampling_track.php &
else
    echo "Already running"
fi
flock() worked out great for me - I have a cron job with database requests scheduled every 5 minutes, so not having several running at the same time is crucial. This is what I did:
$filehandle = fopen("lock.txt", "c+");
if (flock($filehandle, LOCK_EX | LOCK_NB)) {
// code here to start the cron job
flock($filehandle, LOCK_UN); // don't forget to release the lock
} else {
// throw an exception here to stop the next cron job
}
fclose($filehandle);
In case you don't want to kill the next scheduled cron job, but simply want it to wait until the running one has finished, just omit LOCK_NB:

if (flock($filehandle, LOCK_EX))

Keep in mind, though, that if the job regularly takes longer than the scheduling interval, the waiting instances will start to pile up.