I've read that Firefox 3.5 has a new feature in its parser?
Improvements to the Gecko layout engine, including speculative parsing for faster content rendering.
It's all to do with this entry in bugzilla: https://bugzilla.mozilla.org/show_bug.cgi?id=364315
In that entry, Anders Holbøll suggested:
It seems that when the browser encounters a script tag that references an external file, it does not attempt to load any elements after the script tag until the external script file is loaded. This makes sites that reference several or large JavaScript files slow.
...
Here file1.js will be loaded first, followed sequentially by file2.js. Then img1.gif, img2.gif and file3.js will be loaded concurrently. When file3.js has loaded completely, img3.gif will be loaded. One might argue that since the js-files could contain, for instance, a line like "
document.write('<!--');
", there is no way of knowing if any of the content following a script-tag will ever be show, before the script has been executed.But I would assume that it is far more probable that the content would be shown than not. And in these days it is quite common for pages to reference many external javascript files (ajax-libraries, statistics and advertising), which with the current behavior causes the page load to be serialized.
So essentially, the HTML parser continues reading through the HTML file and loading referenced resources, even though it is blocked from rendering by a script.
It's called "speculative" because the script might do things like setting a CSS property such as "display: none" or commenting out sections of the following HTML, and by doing so make certain loads unnecessary. However, in the 95% use case, most of the references will be loaded, so the parser is usually guessing correctly.
I think it means that when the browser would normally block (for example, on a script tag), it will continue to parse the HTML. It will not build the actual DOM until the missing pieces are loaded, but it will start fetching script files and stylesheets in the background.
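A rough sketch of the idea in JavaScript (simplified and assumed; Gecko's real scanner works on the tokenizer's output, not regular expressions): while the parser is blocked on a script, a speculative pass runs ahead over the not-yet-parsed markup and collects URLs worth fetching early.

```javascript
// Toy "speculative scan": find src/href URLs in markup the parser
// has not reached yet, so their downloads can start early.
// Illustration only; real engines do not use regexes for this.
function speculativeScan(html) {
  const urls = [];
  // Match script/img src attributes and link href attributes.
  const re = /<(?:script|img)[^>]*\bsrc=["']([^"']+)["']|<link[^>]*\bhref=["']([^"']+)["']/gi;
  let m;
  while ((m = re.exec(html)) !== null) {
    urls.push(m[1] || m[2]);
  }
  return urls;
}

// Markup sitting after a parser-blocking script tag:
const remainingMarkup = `
  <script src="file3.js"></script>
  <img src="img3.gif">
  <link rel="stylesheet" href="style.css">
`;
console.log(speculativeScan(remainingMarkup));
// ["file3.js", "img3.gif", "style.css"]
```

If the script that blocked the parser later hides or comments out that content, these fetches were wasted; in the common case they were needed anyway, which is the bet speculative parsing makes.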