How to make asynchronous HTTP requests in PHP

Backend · unresolved · 18 answers · 2032 views
Asked by 梦如初夏, 2020-11-22 02:13

Is there a way in PHP to make asynchronous HTTP calls? I don't care about the response, I just want to do something like file_get_contents(), but not wait for the request to finish before executing the rest of my code.

18 Answers
  • 2020-11-22 02:45

    The answer I'd previously accepted didn't work. It still waited for responses. This does work though, taken from How do I make an asynchronous GET request in PHP?

    function post_without_wait($url, $params)
    {
        $post_params = array();
        foreach ($params as $key => $val) {
            if (is_array($val)) $val = implode(',', $val);
            $post_params[] = $key.'='.urlencode($val);
        }
        $post_string = implode('&', $post_params);

        $parts = parse_url($url);

        $fp = fsockopen($parts['host'],
            isset($parts['port']) ? $parts['port'] : 80,
            $errno, $errstr, 30);
        if (!$fp) return false; // connection failed

        $out  = "POST ".$parts['path']." HTTP/1.1\r\n";
        $out .= "Host: ".$parts['host']."\r\n";
        $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
        $out .= "Content-Length: ".strlen($post_string)."\r\n";
        $out .= "Connection: Close\r\n\r\n";
        $out .= $post_string;

        fwrite($fp, $out);
        fclose($fp); // close without reading the response
        return true;
    }
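    For reference, the parameter flattening at the top of the function turns an array value into a comma-separated string before URL-encoding. A standalone sketch of just that step (the `event`/`tags` parameters are illustrative):

```php
<?php
// Standalone sketch of the parameter flattening used in post_without_wait().
$params = ['event' => 'signup', 'tags' => ['new', 'trial']];

$post_params = [];
foreach ($params as $key => $val) {
    if (is_array($val)) {
        $val = implode(',', $val);   // ['new', 'trial'] -> "new,trial"
    }
    $post_params[] = $key . '=' . urlencode($val);
}
$post_string = implode('&', $post_params);

echo $post_string;   // event=signup&tags=new%2Ctrial
```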
    
  • 2020-11-22 02:45

    The swoole extension (https://github.com/matyhtf/swoole) is an asynchronous, concurrent networking framework for PHP.

    $client = new swoole_client(SWOOLE_SOCK_TCP, SWOOLE_SOCK_ASYNC);
    
    $client->on("connect", function($cli) {
        $cli->send("hello world\n");
    });
    
    $client->on("receive", function($cli, $data){
        echo "Receive: $data\n";
    });
    
    $client->on("error", function($cli){
        echo "connect fail\n";
    });
    
    $client->on("close", function($cli){
        echo "close\n";
    });
    
    $client->connect('127.0.0.1', 9501, 0.5);
    
  • 2020-11-22 02:50

    You can use this library: https://github.com/stil/curl-easy

    It's pretty straightforward then:

    <?php
    $request = new cURL\Request('http://yahoo.com/');
    $request->getOptions()->set(CURLOPT_RETURNTRANSFER, true);
    
    // Specify function to be called when your request is complete
    $request->addListener('complete', function (cURL\Event $event) {
        $response = $event->response;
        $httpCode = $response->getInfo(CURLINFO_HTTP_CODE);
        $html = $response->getContent();
        echo "\nDone.\n";
    });
    
    // Loop below will run as long as request is processed
    $timeStart = microtime(true);
    while ($request->socketPerform()) {
        printf("Running time: %dms    \r", (microtime(true) - $timeStart)*1000);
        // Here you can do anything else, while your request is in progress
    }
    

    Below you can see the console output of the above example. It displays a simple live clock indicating how long the request has been running:


    (Animation of the console output omitted.)

  • 2020-11-22 02:51

    As of 2018, Guzzle has become the de facto standard library for HTTP requests, used in several modern frameworks. It's written in pure PHP and does not require installing any custom extensions.

    It can do asynchronous HTTP calls very nicely, and even pool them such as when you need to make 100 HTTP calls, but don't want to run more than 5 at a time.

    Concurrent request example

    use GuzzleHttp\Client;
    use GuzzleHttp\Promise;
    
    $client = new Client(['base_uri' => 'http://httpbin.org/']);
    
    // Initiate each request but do not block
    $promises = [
        'image' => $client->getAsync('/image'),
        'png'   => $client->getAsync('/image/png'),
        'jpeg'  => $client->getAsync('/image/jpeg'),
        'webp'  => $client->getAsync('/image/webp')
    ];
    
    // Wait on all of the requests to complete. Throws a ConnectException
    // if any of the requests fail
    $results = Promise\unwrap($promises);
    
    // Wait for the requests to complete, even if some of them fail
    $results = Promise\settle($promises)->wait();
    
    // You can access each result using the key of the original
    // $promises array.
    echo $results['image']['value']->getHeader('Content-Length')[0];
    echo $results['png']['value']->getHeader('Content-Length')[0];
    

    See http://docs.guzzlephp.org/en/stable/quickstart.html#concurrent-requests
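    The "100 HTTP calls, but no more than 5 at a time" pooling mentioned above can be sketched with GuzzleHttp\Pool. This is a sketch assuming Guzzle 6+; httpbin.org and the request count are illustrative:

```php
<?php
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Generator producing the requests lazily, so all 100 are never
// held in memory at once.
$requests = function ($total) {
    for ($i = 0; $i < $total; $i++) {
        yield new Request('GET', 'http://httpbin.org/get?i=' . $i);
    }
};

$pool = new Pool($client, $requests(100), [
    'concurrency' => 5, // never more than 5 requests in flight
    'fulfilled' => function ($response, $index) {
        echo "Request $index completed\n";
    },
    'rejected' => function ($reason, $index) {
        echo "Request $index failed\n";
    },
]);

// Initiate the transfers and block until all are complete.
$pool->promise()->wait();
```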

  • 2020-11-22 02:51
    1. Fake a request abort using cURL by setting a low CURLOPT_TIMEOUT_MS.

    2. Set ignore_user_abort(true) in the called script so it keeps processing after the connection closes.

    With this method there is no need to implement connection handling via headers and output buffering, which is too dependent on the OS, browser, and PHP version.

    Master process

    function async_curl($background_process=''){
    
        //-------------get curl contents----------------
    
        $ch = curl_init($background_process);
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_NOSIGNAL => 1, // needed, or timeouts < 1000 ms fail immediately
            CURLOPT_TIMEOUT_MS => 50, // maximum number of milliseconds to allow cURL functions to execute
            CURLOPT_VERBOSE => 1,
            CURLOPT_HEADER => 1
        ));
        $out = curl_exec($ch);
    
        //-------------parse curl contents----------------
    
        //$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
        //$header = substr($out, 0, $header_size);
        //$body = substr($out, $header_size);
    
        curl_close($ch);
    
        return true;
    }
    
    async_curl('http://example.com/background_process_1.php');
    

    Background process

    ignore_user_abort(true);
    
    //do something...
    

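    A fuller sketch of the background script side, assuming the caller aborts after ~50 ms as async_curl() above does (the job.log file name is illustrative):

```php
<?php
// Background script (e.g. background_process_1.php).
ignore_user_abort(true);  // keep running after the client disconnects
set_time_limit(0);        // remove the execution time limit for long jobs

// Under PHP-FPM the connection can also be closed explicitly, flushing
// any output already generated before the long-running work starts.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// ...long-running work goes here; log completion for verification...
file_put_contents('job.log', date('c') . " job done\n", FILE_APPEND);
```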
    NB

    If you want cURL to timeout in less than one second, you can use CURLOPT_TIMEOUT_MS, although there is a bug/"feature" on "Unix-like systems" that causes libcurl to timeout immediately if the value is < 1000 ms with the error "cURL Error (28): Timeout was reached". The explanation for this behavior is:

    [...]

    The solution is to disable signals using CURLOPT_NOSIGNAL.

    Resources

    • curl timeout less than 1000ms always fails?

    • http://www.php.net/manual/en/function.curl-setopt.php#104597

    • http://php.net/manual/en/features.connection-handling.php

  • 2020-11-22 02:53
    class async_file_get_contents extends Thread
    {
        public $ret;
        public $url;
        public $finished;

        public function __construct($url)
        {
            $this->finished = false;
            $this->url = $url;
        }

        public function run()
        {
            $this->ret = file_get_contents($this->url);
            $this->finished = true;
        }
    }

    // Requires the pthreads extension (ZTS build of PHP).
    $afgc = new async_file_get_contents("http://example.org/file.ext");
    $afgc->start(); // run() executes in a new thread; the main thread continues
    