Question
I am encountering a strange problem, and most likely I don't understand something correctly. I have a website that uses AJAX requests to load different pages, to make it feel more like an application.
I've noticed that the page load times are rather high (1-3 seconds), and I wanted to benchmark a request to see where the delay comes from. I used Firefox's Developer Tools, specifically the Network tab: when I clicked the link that loads a page via an AJAX request, the developer tools reported the following timings for that request:
Blocked 334ms, Connect 162ms, TLS 170ms, Wait 1183ms, Total 1860ms
To my understanding, this means the request sat for 334ms in the browser before being sent (because of other concurrent requests), it took 162ms to open the TCP connection to the server, another 170ms went by for the TLS handshake, and the server then took 1183ms to generate and start sending the response. Are these assumptions correct?
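The network-side phases can be reproduced from the command line with curl's `-w` write-out variables, which helps separate browser queueing from network and server time. A minimal sketch; the URL is a placeholder for the AJAX endpoint:

```shell
#!/bin/sh
# Placeholder for the AJAX endpoint being benchmarked.
URL="https://example.com/page.php"

# time_connect       - TCP connect finished (the "Connect" phase)
# time_appconnect    - TLS handshake finished (the "TLS" phase)
# time_starttransfer - first byte received (connect + TLS + server "Wait")
# time_total         - whole transfer
curl -o /dev/null -s -w \
  'connect:       %{time_connect}s
appconnect:    %{time_appconnect}s
starttransfer: %{time_starttransfer}s
total:         %{time_total}s
' "$URL"
```

Note that curl has no equivalent of the browser's "Blocked" phase; that queueing happens only inside the browser, before the request ever reaches the network.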
Next, I implemented a small timer in my PHP script by placing the following code at the very beginning and the very end of the executed .php file, respectively:
// beginning
$start = microtime(true);
// end
$end = microtime(true) - $start;
Then I output the $end variable to see how many seconds my code runs for, and the result was 0.356235252352.
So from this I gather that my script ran in ~0.4 seconds. But if that is true, where did the rest of the 1183ms Wait time, roughly 0.8s, go?
EDIT: Okay, so we've narrowed it down to something in PHP or Apache, but I don't know what it could be. When requesting PHP scripts on the server, we see a time-to-first-byte of 700-800ms, while for any other script file or HTML page we get a 60-70ms first-byte time. :(
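That comparison can be scripted so the two first-byte times come from identical measurements. A sketch using curl's `time_starttransfer` (curl's time-to-first-byte); both URLs are placeholders for a PHP endpoint and a static file on the same server:

```shell
#!/bin/sh
# Placeholder URLs: a PHP endpoint and a static file on the same host.
PHP_URL="https://example.com/page.php"
STATIC_URL="https://example.com/static.html"

for url in "$PHP_URL" "$STATIC_URL"; do
  # time_starttransfer includes connect + TLS + server think time;
  # time_appconnect is printed too, so that overhead can be subtracted.
  curl -o /dev/null -s \
    -w "$url  TTFB: %{time_starttransfer}s (TLS done at: %{time_appconnect}s)\n" \
    "$url"
done
```

If the PHP endpoint's first-byte time stays several hundred milliseconds above the static file's on the same connection parameters, the extra delay is spent between Apache handing the request to PHP and the script emitting its first output, not on the network.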
Source: https://stackoverflow.com/questions/48911665/benchmarking-runtime-of-ajax-requests