I have a script that builds my webpage into one string ($content) and then echoes it to the user.
My script looks like this:
$time1 = microtime(true);
$content = /* ... page is built here ... */;

$time = microtime(true);
echo $content;
$echo_time = microtime(true) - $time;
My guess is that accessing that large a string takes up a decent amount of memory over multiple uses. Because PHP is garbage-collected, memory stays allocated until the garbage collector runs, and only then is it freed. My guess is that many requests each storing the page content in a string variable are rapidly filling up volatile memory (RAM). A few times a day you hit the limit, causing the slower load times; then the garbage collector runs and everything is back to normal.
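One way to test this guess, rather than speculating, is to log memory use around the build and echo steps. This is a minimal sketch: memory_get_usage() and memory_get_peak_usage() are standard PHP functions, but build_page() is a hypothetical stand-in for whatever assembles $content in the real script.

```php
<?php
// Hypothetical stand-in for the real page-building code.
function build_page() {
    return str_repeat("x", 15 * 1024); // ~15 KB, the size mentioned in the question
}

$before = memory_get_usage(true);   // allocated memory before the build
$content = build_page();
echo $content;
$after = memory_get_usage(true);    // allocated memory after build + echo

error_log("Memory delta: " . ($after - $before) . " bytes");
error_log("Peak memory: " . memory_get_peak_usage(true) . " bytes");
```

If the peak stays far below memory_limit, the GC theory is unlikely to explain the slowdowns.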
Let's narrow the issue down and factor out some things...
In the question you indicate you're echoing out 10-15 KB. That's a significant amount no matter how it's buffered to output. Remember, PHP is single-threaded: once you flush your buffer, you have to wait for all the output to go out over the shell or HTTP before the script continues, and it will eventually have to flush the internal buffer before finishing the echo. To measure the echo itself without the flushing overhead, try replacing
$time = microtime(true);
echo $content;
$echo_time = (microtime(true)-$time);
With
ob_start();
$time = microtime(true);
echo $content;
$echo_time = (microtime(true)-$time);
ob_end_clean();
This echoes into a buffer without actually sending anything out via HTTP or whatever the transport is. That should give you the 'real' time of the echo command, without any of the cost of sending out what's in the buffer.
If $echo_time shrinks down, you have a transport issue to address as best you can with buffering.
If $echo_time is still too large, you'll need to start digging into the PHP C code.
Either way, you're a lot closer to finding your issue and a solution.
If this is a dedicated server, log in to the console and see which process uses a lot of CPU time while you're generating content. It's very hard to tell when we can't see the code. Maybe you just need some indexes in the database, or maybe you should remove some indexes.
You can also check httpd and mysqld log files.
Do you have any while() or for() loops in your script? If so, check that their conditions can't conflict with anything and keep the loop from terminating. Occasionally I've forgotten about these myself, and my script would also run for about 30 seconds.
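For example (a hypothetical illustration, not the OP's code), a loop whose exit test uses strict inequality can step over its target value and spin until max_execution_time kills the script:

```php
<?php
// Hypothetical: if $n were odd, $i would step past it and the
// `!=` condition would never be false, so the loop would never end.
$n = 10;
$hits = 0;
for ($i = 0; $i != $n; $i += 2) {
    $hits++;
}

// Safer: a bounded comparison terminates regardless of step size.
for ($j = 0; $j < $n; $j += 2) {
    /* ... */
}
```

A bounded `<` comparison is the usual defensive choice here.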
From http://wonko.com/post/seeing_poor_performance_using_phps_echo_statement_heres_why
This old bug report may shed some light. In short, using echo to send large strings to the browser results in horrid performance due to the way Nagle’s Algorithm causes data to be buffered for transmission over TCP/IP.
The solution? A simple three-line function that splits large strings into smaller chunks before echoing them:
function echobig($string, $bufferSize = 8192) {
    $splitString = str_split($string, $bufferSize);
    foreach ($splitString as $chunk) {
        echo $chunk;
    }
}
Play around with the buffer size and see what works best for you. I found that 8192, apart from being a nice round number, seemed to be a good size. Certain other values work too, but I wasn’t able to discern a pattern after several minutes of tinkering and there’s obviously some math at work that I have no desire to try to figure out.
By the way, the performance hit also happens when using PHP's output control functions (ob_start() and friends).
Following the OP's comment that he's tried this, I also found the following on PHP.net suggesting that str_split() can itself be a waste of resources, and that the echobig() function can be optimised further with the following code:
function echobig($string, $bufferSize = 8192) {
    // Suggest testing that $bufferSize is a positive integer
    for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
        echo substr($string, $start, $bufferSize);
    }
}
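As a rough way to compare plain echo against the chunked version (a sketch only; absolute timings will vary by server, and buffering with ob_start() deliberately removes the network transport from the measurement):

```php
<?php
// Chunked echo, as in the answer above.
function echobig($string, $bufferSize = 8192) {
    for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
        echo substr($string, $start, $bufferSize);
    }
}

$content = str_repeat("a", 200000);

ob_start();                      // buffer so we time the echo itself, not the transport
$t = microtime(true);
echo $content;
$plain = microtime(true) - $t;
ob_clean();                      // empty the buffer, keep buffering active

$t = microtime(true);
echobig($content);
$chunked = microtime(true) - $t;
ob_end_clean();                  // discard the buffer and stop buffering

error_log(sprintf("plain: %.6fs, chunked: %.6fs", $plain, $chunked));
```

When run against a real HTTP client rather than an in-memory buffer, the difference attributed to Nagle's Algorithm should become visible.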
Have you tried running your script using the CLI rather than through Apache?
In the past I had a problem very similar to yours. I found that it can be caused by slow clients: if a client fetches half of the page and then hangs, PHP waits until the client is ready and only then sends the rest of the content. So the problem may not be on your side at all.
You can check this with the following two scripts. Put this one on your server and call it echo.php:
<?php
$time_start = time();
echo str_repeat("a", 200000);
echo "\nThis script took: " . (time() - $time_start) . " sec";
Then fetch it with this script (change example.com to your domain):
<?php
$fp = fsockopen("example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET /echo.php HTTP/1.1\r\n";
    $out .= "Host: example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 5000);
        sleep(1); // simulate a slow client
    }
    fclose($fp);
}
With this client, echo.php ran for 27 seconds for me. When I remove the sleep(1) line, echo.php takes only 2 seconds to run.