What's faster for including scripts: using a CDN (Google's) or storing them locally in the website's root?
It is even faster to store them in localStorage / Web Storage. I have created a tiny library to do that, and the results are quite convincing.
You can check out the code at https://github.com/webpgr/cached-webpgr.js — it is tiny, and you can read and understand all of it within a couple of minutes.
Here is a full example of how to use it.
The complete library:
function _cacheScript(name, version, url) {
  // Fetch the script again in the background and store its source
  // plus a version tag in localStorage.
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState == 4) {
      if (xhr.status == 200) {
        localStorage.setItem(name, JSON.stringify({ content: xhr.responseText, version: version }));
      } else {
        console.warn('error loading ' + url);
      }
    }
  };
  xhr.open('GET', url, true);
  xhr.send();
}

function _loadScript(url, name, version, callback) {
  // Load the script with a <script> tag, then cache it once it has run.
  var el = document.createElement('script');
  if (el.readyState) { // old IE
    el.onreadystatechange = function () {
      if (el.readyState == 'loaded' || el.readyState == 'complete') {
        el.onreadystatechange = null;
        _cacheScript(name, version, url);
        if (callback) callback();
      }
    };
  } else {
    el.onload = function () {
      _cacheScript(name, version, url);
      if (callback) callback();
    };
  }
  el.setAttribute('src', url);
  document.getElementsByTagName('head')[0].appendChild(el);
}

function _injectScript(cached, name, version, callback) {
  // Inject the cached source; if the stored version is stale, evict it
  // so the next page load fetches a fresh copy.
  var el = document.createElement('script');
  el.type = 'text/javascript';
  var data = JSON.parse(cached);
  el.appendChild(document.createTextNode(data.content));
  document.getElementsByTagName('head')[0].appendChild(el);
  if (data.version != version) localStorage.removeItem(name);
  if (callback) callback();
}

function requireScript(name, version, url, callback) {
  var cached = localStorage.getItem(name);
  if (cached == null) {
    _loadScript(url, name, version, callback);
  } else {
    _injectScript(cached, name, version, callback);
  }
}
Calling the library
requireScript('jquery', '1.11.2', 'http://ajax.googleapis.com/ajax/libs/jquery/1.11.2/jquery.min.js', function(){
requireScript('examplejs', '0.0.3', 'example.js');
});
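The cache-and-invalidate idea behind the library can be sketched outside the browser. Below is a minimal sketch with a Map standing in for localStorage and a plain function standing in for the XHR; the names fakeStorage and requireCached are illustrative, not part of the library above.

```javascript
// Minimal sketch of the cache-with-version idea. A Map stands in for
// localStorage so this runs outside the browser.
const fakeStorage = new Map();

function requireCached(name, version, fetchScript) {
  const raw = fakeStorage.get(name);
  if (raw) {
    const entry = JSON.parse(raw);
    // Version matches: reuse the cached source, no network round trip.
    if (entry.version === version) return entry.content;
    // Stale version: evict and fall through to a fresh fetch.
    fakeStorage.delete(name);
  }
  const content = fetchScript(); // stands in for the XHR in _cacheScript
  fakeStorage.set(name, JSON.stringify({ content: content, version: version }));
  return content;
}
```

The second request for the same name and version never touches the network, which is where the speedup comes from.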
If you mean the core jQuery libraries, use the Google CDN for an internet-facing site (as opposed to an internal one).
The CDN has advantages you'll find hard to compete with. Though you can configure the cache headers just as they do, you probably can't serve the file faster. That said, the library/CDN is only part of the puzzle. Miscellaneous plugins and code of your own should also be minified, combined, and served with gzip.
Unfortunately, studies have recently shown that Google's CDN can actually hinder performance.
Google's AJAX Libraries API tried to use network effects to improve the performance of all participating websites by providing a common shared cache. However, recent research has found that too few sites use the network for it to hit critical mass and actually improve web performance. Currently, the overhead of using the network means that using Google's AJAX Libraries API actually lowers performance. You should host the JavaScript file locally. This will increase your bandwidth consumption but improve page load speed. (From a Zoompf.com performance report note.)
See also: Should You Use JavaScript Library CDNs?
It may be worth noting that Visual Studio has problems providing IntelliSense for Google's CDN. Microsoft also has a CDN that allows IntelliSense to work correctly. But yes, use a CDN.
I'd say Google's CDN for reasons others have stated.
However, if your target market is in close proximity to your server, it may be better to serve it from your server.
Say, for instance, you have a site forOrlandoFloridaPeopleOnly.com/. If your server is hosted in Orlando, Florida, and Google's closest content delivery servers are in Miami, Florida and Atlanta, Georgia (which is true), your server may well be faster when the visitor doesn't already have a cached copy of the file from the Google CDN.
Remember, if you do serve static content to your visitors from your own server, try to parallelize the downloads by using sub-domains or other means. And for goodness' sake, don't transfer cookies for static content.
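As a hedged sketch of the cookieless-static-content idea, here is one possible set of response headers for assets served from a separate static sub-domain (static.example.com is hypothetical, and staticAssetHeaders is an illustrative name, not a real API):

```javascript
// Headers one might send for static assets from a cookieless sub-domain.
// A long max-age lets the browser cache the file; never setting a cookie
// on that domain keeps every request header small.
function staticAssetHeaders(maxAgeSeconds) {
  return {
    'Cache-Control': 'public, max-age=' + maxAgeSeconds,
    // Let caches store separate gzipped and plain variants.
    'Vary': 'Accept-Encoding'
    // Deliberately no Set-Cookie: cookies on static content waste
    // upstream bandwidth on every single request.
  };
}
```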
I'm not sure how reliable this source is: Google data centers. So don't necessarily count on it.
The Google CDN :-)
1) Optimized from a caching point of view
2) The user receives the resource from the nearest CDN node