Run PHP Task Asynchronously

Backend · unresolved · 15 answers · 1089 views
Asked by 青春惊慌失措 on 2020-11-22 15:15

I work on a somewhat large web application, and the backend is mostly in PHP. There are several places in the code where I need to complete some task, but I don't want to make the user wait for the result.

15 answers
  • 2020-11-22 15:23

    This is the same method I have been using for a couple of years now and I haven't seen or found anything better. As people have said, PHP is single threaded, so there isn't much else you can do.

    I have actually added one extra level to this: getting and storing the process ID. That lets me redirect the user to another page and have them sit there, using AJAX to check whether the process is complete (i.e. the process ID no longer exists). This is useful for cases where the length of the script would cause the browser to time out, but the user needs to wait for it to complete before the next step. (In my case it was processing large ZIP files containing CSV-like files that add up to 30,000 database records, after which the user needs to confirm some information.)
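
    A minimal sketch of that PID-tracking idea, assuming a Linux host where /proc/<pid> exists while a process is alive (the script path and PID-file name are placeholders, not from the original setup):

```php
<?php
// Launch a PHP script in the background and record its process ID.
function launchWorker(string $script, string $pidFile): int
{
    // "echo $!" prints the PID of the command just sent to the background.
    $cmd = sprintf('php %s > /dev/null 2>&1 & echo $!', escapeshellarg($script));
    $pid = (int) shell_exec($cmd);
    file_put_contents($pidFile, $pid);
    return $pid;
}

// Called from the AJAX polling endpoint: is the worker still running?
function isRunning(int $pid): bool
{
    // On Linux, /proc/<pid> exists only while the process is alive.
    return $pid > 0 && file_exists("/proc/$pid");
}
```

    The page the user waits on would poll an endpoint that reads the stored PID and returns the result of isRunning().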

    I have also used a similar process for report generation. I'm not sure I'd use "background processing" for something like an email, unless there is a real problem with a slow SMTP server. Instead I might use a table as a queue, with a process that runs every minute to send the emails in it. You would need to be wary of sending emails twice and similar problems. I would consider a similar queuing process for other tasks as well.
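
    The table-as-queue idea can be sketched roughly like this (SQLite in memory for illustration; the table and column names are my assumptions, not from the original answer). A cron job would call processQueue() every minute; atomically claiming a row before processing it is what guards against sending an email twice:

```php
<?php
// Minimal table-as-queue sketch, using PDO with an in-memory SQLite database.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE email_queue (
    id        INTEGER PRIMARY KEY AUTOINCREMENT,
    recipient TEXT NOT NULL,
    body      TEXT NOT NULL,
    status    TEXT NOT NULL DEFAULT 'pending'  -- pending | sending | sent
)");

function enqueue(PDO $db, string $to, string $body): void
{
    $stmt = $db->prepare('INSERT INTO email_queue (recipient, body) VALUES (?, ?)');
    $stmt->execute([$to, $body]);
}

function processQueue(PDO $db, callable $send): int
{
    $sent = 0;
    while (true) {
        // Claim the oldest pending row; the status check in the WHERE clause
        // prevents two workers from claiming the same message.
        $db->exec("UPDATE email_queue SET status = 'sending'
                   WHERE id = (SELECT id FROM email_queue
                               WHERE status = 'pending' ORDER BY id LIMIT 1)
                     AND status = 'pending'");
        $row = $db->query("SELECT * FROM email_queue WHERE status = 'sending' LIMIT 1")
                  ->fetch(PDO::FETCH_ASSOC);
        if ($row === false) {
            break;                                 // queue drained
        }
        $send($row['recipient'], $row['body']);    // e.g. a wrapper around mail()
        $db->prepare("UPDATE email_queue SET status = 'sent' WHERE id = ?")
           ->execute([$row['id']]);
        $sent++;
    }
    return $sent;
}
```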

  • 2020-11-22 15:25

    It's a great idea to use cURL as suggested by rojoca.

    Here is an example. You can monitor text.txt while the script runs in the background:

    <?php
    
    function doCurl($begin)
    {
        echo "Do curl<br />\n";
        $url = 'http://'.$_SERVER['SERVER_NAME'].$_SERVER['REQUEST_URI'];
        $url = preg_replace('/\?.*/', '', $url);
        $url .= '?begin='.$begin;
        echo 'URL: '.$url.'<br>';
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $result = curl_exec($ch);
        echo 'Result: '.$result.'<br>';
        curl_close($ch);
    }
    
    
    if (empty($_GET['begin'])) {
        doCurl(1);
    }
    else {
        while (ob_get_level())
            ob_end_clean();
        header('Connection: close');
        ignore_user_abort(true); // keep running even after the client disconnects
        ob_start();
        echo 'Connection Closed';
        $size = ob_get_length();
        header("Content-Length: $size");
        ob_end_flush();
        flush();
    
        $begin = $_GET['begin'];
        $fp = fopen("text.txt", "w");
        fprintf($fp, "begin: %d\n", $begin);
        for ($i = 0; $i < 15; $i++) {
            sleep(1);
            fprintf($fp, "i: %d\n", $i);
        }
        fclose($fp);
        if ($begin < 10)
            doCurl($begin + 1);
    }
    
    ?>
    
  • 2020-11-22 15:25

    Spawning new processes on the server using exec(), or on another server via cURL, doesn't scale well. With exec() you fill your web server with long-running processes that would be better handled by separate, non-web-facing servers, and using cURL ties up another server unless you build in some sort of load balancing.

    I have used Gearman in a few situations and find it better suited to this sort of use case. A single job server handles queuing of all the jobs, and I can spin up worker servers, each running as many instances of the worker process as needed, scale the number of worker servers up or down as demand changes, and shut the worker processes down entirely when not needed; jobs simply queue up until the workers come back online.
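
    For reference, the Gearman flow described above looks roughly like this (requires the PECL gearman extension and a running gearmand; the task name 'send_email' and the payload shape are illustrative):

```php
<?php
// Producer side (inside the web request): fire-and-forget a background job.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);          // default gearmand port
$client->doBackground('send_email', json_encode([
    'to'   => 'user@example.com',
    'body' => 'Welcome!',
]));

// Worker side (a long-running CLI process; run as many copies as needed):
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('send_email', function (GearmanJob $job) {
    $payload = json_decode($job->workload(), true);
    // mail($payload['to'], 'Welcome', $payload['body']);
});
while ($worker->work());                        // block, pulling jobs forever
```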

  • 2020-11-22 15:26

    PHP does have multithreading; it's just not enabled by default. The pthreads extension provides exactly that, though you'll need PHP compiled with ZTS (Zend Thread Safety). Links:

    Examples

    Another tutorial

    pthreads PECL Extension
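
    A minimal pthreads sketch (requires a ZTS build of PHP with ext-pthreads loaded; note that pthreads was later superseded by the parallel extension for PHP 7.4+):

```php
<?php
// Requires a thread-safe (ZTS) PHP build with the pthreads extension.
class EmailTask extends Thread
{
    private $to;

    public function __construct(string $to)
    {
        $this->to = $to;
    }

    public function run()
    {
        // Runs in its own thread; do the slow work here.
        // mail($this->to, 'Welcome', '...');
    }
}

$task = new EmailTask('user@example.com');
$task->start();   // returns immediately; work proceeds in parallel
// ... respond to the user ...
$task->join();    // optional: wait for completion before shutdown
```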

  • 2020-11-22 15:27

    Here is a simple class I coded for my web application. It allows forking PHP scripts (as well as other scripts) and works on both UNIX and Windows.

    class BackgroundProcess {
        static function open($exec, $cwd = null) {
            if (!is_string($cwd)) {
                $cwd = @getcwd();
            }
    
            @chdir($cwd);
    
            if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
                $WshShell = new COM("WScript.Shell");
                $WshShell->CurrentDirectory = str_replace('/', '\\', $cwd);
                $WshShell->Run($exec, 0, false);
            } else {
                exec($exec . " > /dev/null 2>&1 &");
            }
        }
    
        static function fork($phpScript, $phpExec = null) {
            $cwd = dirname($phpScript);
    
            @putenv("PHP_FORCECLI=true");
    
            if (!is_string($phpExec) || !file_exists($phpExec)) {
                if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
                    $phpExec = str_replace('/', '\\', dirname(ini_get('extension_dir'))) . '\php.exe';
    
                    if (@file_exists($phpExec)) {
                        BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
                    }
                } else {
                $phpExec = exec("which php-cli");

                // Fall back to the generic "php" binary if php-cli is not on the PATH.
                if ($phpExec === '' || $phpExec[0] != '/') {
                    $phpExec = exec("which php");
                }

                // Only launch if we found an absolute path to a PHP binary.
                if ($phpExec !== '' && $phpExec[0] == '/') {
                    BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
                }
                }
            } else {
                if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
                    $phpExec = str_replace('/', '\\', $phpExec);
                }
    
                BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
            }
        }
    }
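
    Usage is then a one-liner from the web request (the paths here are placeholders):

```php
<?php
// Assumes the BackgroundProcess class above has been loaded.

// Fire-and-forget a PHP script; fork() returns immediately.
BackgroundProcess::fork('/path/to/send_welcome_email.php');

// Or run an arbitrary shell command in the background:
BackgroundProcess::open('gzip ' . escapeshellarg('/path/to/huge.log'));
```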
    
  • 2020-11-22 15:29

    I've used the queuing approach, and it works well because you can defer processing until server load is low, which lets you manage load quite effectively if you can easily partition off tasks that aren't urgent.

    Rolling your own isn't too tricky, here's a few other options to check out:

    • Gearman - this answer was originally written in 2009, and since then Gearman has become a popular option; see the comments below.
    • ActiveMQ if you want a full blown open source message queue.
    • ZeroMQ - this is a pretty cool socket library which makes it easy to write distributed code without having to worry too much about the socket programming itself. You could use it for message queuing on a single host - you would simply have your webapp push something to a queue that a continuously running console app would consume at the next suitable opportunity
    • beanstalkd - only found this one while writing this answer, but looks interesting
    • dropr is a PHP-based message queue project, but it hasn't been actively maintained since September 2010
    • php-enqueue is a recently (2017) maintained wrapper around a variety of queue systems
    • Finally, a blog post about using memcached for message queuing

    Another, perhaps simpler, approach is to use ignore_user_abort - once you've sent the page to the user, you can do your final processing without fear of premature termination, though this does have the effect of appearing to prolong the page load from the user's perspective.
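
    Under PHP-FPM there is a cleaner variant of the same trick: fastcgi_finish_request() sends the complete response and closes the connection, and the script simply keeps running afterwards. A small sketch (the fallback branch for non-FPM SAPIs is my addition):

```php
<?php
// Send the response to the user, then keep working in the same request.
function finishResponse(): bool
{
    if (function_exists('fastcgi_finish_request')) {
        return fastcgi_finish_request();   // PHP-FPM: connection closed here
    }
    // Other SAPIs: best effort - keep running even if the client disconnects.
    ignore_user_abort(true);
    flush();
    return false;
}

echo "Registration complete!\n";
$closed = finishResponse();
// Slow work continues here without delaying the perceived page load, e.g.:
// mail(...);
```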
