Echoing content sometimes takes a very long time

眼角桃花 2021-02-19 16:20

I have a script that builds my webpage into one string ($content) and then echoes it to the user.

My script looks like this:

$time1= microtime(true);
$conten         


        
8 Answers
  •  无人共我
    2021-02-19 16:36

    From http://wonko.com/post/seeing_poor_performance_using_phps_echo_statement_heres_why

    This old bug report may shed some light. In short, using echo to send large strings to the browser results in horrid performance due to the way Nagle’s Algorithm causes data to be buffered for transmission over TCP/IP.

    The solution? A simple three-line function that splits large strings into smaller chunks before echoing them:

    function echobig($string, $bufferSize = 8192) {
        $splitString = str_split($string, $bufferSize);

        foreach ($splitString as $chunk) {
            echo $chunk;
        }
    }
    

    Play around with the buffer size and see what works best for you. I found that 8192, apart from being a nice round number, seemed to be a good size. Certain other values work too, but I wasn’t able to discern a pattern after several minutes of tinkering and there’s obviously some math at work that I have no desire to try to figure out.

    By the way, the performance hit also happens when using PHP’s output control functions (ob_start() and friends).

    Following the OP's comment that he's tried this, I also found the following on PHP.net suggesting that str_split can itself be a waste of resources, and that echobig can be optimised further by using substr directly:

    function echobig($string, $bufferSize = 8192) {
      // Suggestion: validate that $bufferSize is a positive integer first.
      for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
        echo substr($string, $start, $bufferSize);
      }
    }
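    As a quick sanity check (a hypothetical usage sketch, not part of the original answer), you can capture the chunked output with PHP's output buffering and confirm the substr-based version emits exactly the same bytes as a plain echo would:

    ```php
    <?php
    // substr-based echobig from the answer above, with the variable-name bug fixed.
    function echobig($string, $bufferSize = 8192) {
        for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
            echo substr($string, $start, $bufferSize);
        }
    }

    // Hypothetical stand-in for the OP's $content: a 20 KB page body.
    $content = str_repeat('x', 20000);

    ob_start();            // capture what would have gone to the client
    echobig($content);     // emits 8192 + 8192 + 3616 byte chunks
    $out = ob_get_clean();

    var_dump($out === $content);
    ```

    The chunking only changes *how* the string reaches the output layer, not its contents, so the comparison should hold for any buffer size greater than zero.
    
    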
    

    Have you tried running your script using the CLI rather than through Apache?
