With the new HTTP/2 protocol, the overhead created by repeated HTTP requests to the same server has been greatly reduced.
With this in mind, are there still any significant benefits to minifying and concatenating JavaScript and CSS files?
Minifying JS can still reduce the size of many symbols: inflatedJargonSymbolizerTokenManager
might become _a.
One example I found showed that jQuery gzipped was still twice the size of jQuery.min gzipped.
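As a rough illustration (the identifiers and minified output here are invented; real tools such as Terser or UglifyJS decide the exact renaming), compare the byte counts before and after:

```javascript
// Hypothetical example: the verbose identifier from above, and the kind of
// output a minifier might produce (short names, no optional whitespace).
const original = 'var inflatedJargonSymbolizerTokenManager = function (value) { return value; };';
const minified = 'var _a=function(n){return n};';

console.log(original.length, minified.length);
```

The renaming is safe for local symbols because the minifier rewrites every reference consistently; public API names have to be preserved.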
I also want to note that, while you didn't imply otherwise, dystroy's comment is correct (and in fact contradicts the badly written Wikipedia explanation): concatenating JavaScript files may be less useful now, but minifying them still has its benefits. I mention this in case you got some of your information there; I'd edit the page myself if I weren't worried about getting into an edit battle.
CSS likely has fewer opportunities for symbol reduction; in theory, all minification can remove there is whitespace and comments.
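A minimal sketch of that idea, assuming a naive regex-based stripper (real tools such as cssnano do far more, and a regex approach like this is not safe for every corner of CSS):

```javascript
// Rough sketch of CSS minification: strip comments, collapse whitespace,
// and drop spaces around punctuation. Illustration only, not production-safe.
function naiveMinifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')   // remove /* ... */ comments
    .replace(/\s+/g, ' ')               // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1')  // drop spaces around punctuation
    .trim();
}

const input = `
/* main heading */
h1 {
  color: red;
  margin: 0 auto;
}
`;
console.log(naiveMinifyCss(input)); // h1{color:red;margin:0 auto;}
```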
This may be a little late, but I want to point out a few alternative points that should be covered too.
The first is that minification for JavaScript normally involves some sort of uglification, which has benefits beyond bandwidth: it prevents people from easily analyzing the code, which stops ordinary users from copying its methods and ideas and makes malicious tinkering harder; even well-built sites can have problems with this. Of course, this is no substitute for real security, and a determined user can always decipher uglified code.
The other is that not all browsers or connections will be using HTTP/2, at least not immediately. So even if the benefit of one of these practices is barely noticeable for HTTP/2 clients, why not keep it for the benefit of those still connecting over HTTP/1.1?
Lastly, the best way to determine how any of this impacts the speed of your site is to benchmark it.
So far, all the answers tacitly assume that you'll want to download ALL the .css and .js files for every page. A benefit of using HTTP/2 while keeping .css and .js files separate is that you can bring down only the ones a given page needs, and not downloading something is always faster than downloading it efficiently.
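One way to sketch this, with made-up page and file names, is a per-page asset manifest, so each page requests only what it actually uses rather than one all-in-one bundle:

```javascript
// Hypothetical per-page asset manifest (page and file names are invented).
// With HTTP/2 there is little per-request penalty for keeping files
// separate, so each page can list only what it needs.
const pageAssets = {
  home:     ['base.css', 'home.css', 'base.js'],
  checkout: ['base.css', 'checkout.css', 'base.js', 'cart.js', 'payment.js'],
};

function assetsFor(page) {
  return pageAssets[page] || [];
}

// The home page never downloads payment.js at all, which beats even the
// fastest possible download of a concatenated bundle that includes it.
console.log(assetsFor('home'));
```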
They're still useful. HTTP/2 reduces the impact of some of these practices, but it doesn't eliminate their impact.
Minification remains as useful as ever. Although HTTP/2 introduces new compression for message headers, that has nothing to do with minification (which is about message bodies). The compression algorithms for message bodies are the same, so minification saves just as much bandwidth as it did before.
Concatenation and sprites will have less of an impact than before, but they will still have some impact. The biggest issue with downloading multiple files instead of a single file with HTTP/1 isn't actually an HTTP-side problem, per se: there is some bandwidth-based overhead in requesting each file individually, but it's dwarfed by the time-based overhead of tearing down the TCP/IP session when you're done with one file, then starting up a new one for the next, and repeating this for every file you want to download.
The biggest focus of HTTP/2 is eliminating that time-based overhead: HTTP/1.1 tried to do this with pipelining, but it didn't catch on in the browser (Presto is the only engine that got it completely right, and Presto is dead). HTTP/2 is another attempt, which improves on HTTP/1.1's methods while also making this kind of thing non-optional, and it stands to be more successful. It also eliminates some of the bandwidth-based overhead in making multiple requests, by compressing headers, but it cannot eliminate that overhead completely, and when downloading multiple files, those requests still have to be made (as part of a single TCP/IP session, so there is less overhead, but not zero). So while the impact of concatenating and spriting is proportionally smaller, there is still some impact, especially if you use many files.
Another thing to consider, when it comes to concatenation and spriting, is compression. Concatenated files of similar types tend to compress better than the individual files do, because the compression algorithm can exploit similarities between the concatenated pieces of data. A similar principle applies to sprites: putting similar images in different regions of the same file usually results in a smaller file, because the image's compression can exploit similarities in the different regions.
Yes, it is still useful.
Alongside gzip compression, your page will weigh less.
Imagine you are using a very slow GPRS network (56 kbit/s bandwidth, 500 ms ping).
You have 50 tiny images, 30 JavaScript files, and 20 CSS files.
This means that, with 2 parallel connections, you must wait roughly (100 / 2) × 500 ms = 25 seconds of round-trip latency just for the requests.
Now, each image is about 3-4 kB, which at 56 kbit/s already takes roughly half a second each to transfer.
The CSS and JavaScript files range from 20 kB to 600 kB.
This will kill your website with a huge transfer time.
Reducing the time it takes to transfer the files increases the speed at which the website loads.
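The arithmetic above can be sketched as follows (the 1 MB total payload is an assumed figure for illustration, picked from the file counts and size ranges given):

```javascript
// Back-of-the-envelope for the GPRS scenario:
// 100 files, 500 ms round-trip, 2 parallel connections, 56 kbit/s link.
const files = 100;
const rttMs = 500;
const parallel = 2;
const latencyMs = (files / parallel) * rttMs; // 25000 ms spent on round trips

// Assume the payload totals ~1 MB (50 small images plus the CSS/JS files).
const totalBytes = 1000000;
const bandwidthBytesPerSec = 56000 / 8;       // 56 kbit/s = 7000 bytes/s
const transferMs = (totalBytes / bandwidthBytesPerSec) * 1000;

console.log(latencyMs, Math.round(transferMs));
```

Shrinking the payload attacks the transfer term, while fewer requests (or HTTP/2 multiplexing) attacks the latency term; on a link this slow, both terms hurt.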
So, YES, it is still useful!