Reduce HTTP requests or not?

野性不改 2021-02-01 05:57

A theoretical question:
We all know about the pros of minifying and combining JavaScript files in order to reduce HTTP requests and speed up a website. But when popular Jav…

7 answers
  • 2021-02-01 06:18

    I think it comes down to a couple of things:

    1. How many pages use the code throughout your site
    2. The quality of the CDN
    3. How much code it is

    There's also a difference between using popular JavaScript libraries, such as jQuery, and using your own bundle, which only visitors who have already been to your site will have cached.

    The performance gain can come from two places: 1) the browser cache, and 2) DNS/edge caching: even if the file isn't stored locally, a cached DNS route minimizes the request time, and a nearby server may serve the file directly.

    I would advise using a CDN while also hosting the files locally as a fallback. Depending on your resources (hardware/bandwidth), you might need a CDN anyhow. It would also be nice to use a server-side scheduler to check the CDN's status and reroute requests to the local copies when applicable.
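    A common way to implement that fallback is to test for the library's global right after the CDN `<script>` tag and emit a local tag only when it is missing. A minimal sketch (the local path `/js/jquery.min.js` is a hypothetical example):

    ```javascript
    // CDN-with-local-fallback sketch. In an HTML page, the CDN <script>
    // tag for jQuery comes first; this check runs immediately after it.
    // The local path "/js/jquery.min.js" is a hypothetical example.
    function jqueryFallbackTag(win) {
      // If the CDN copy loaded, win.jQuery exists and no fallback is needed;
      // otherwise return a tag pointing at the locally hosted copy.
      if (win && win.jQuery) {
        return null;
      }
      return '<script src="/js/jquery.min.js"><\/script>';
    }

    // In the browser this would be used as:
    //   document.write(jqueryFallbackTag(window) || '');
    ```

    This keeps the CDN's caching benefit for most visitors while surviving a CDN outage.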

    Also, keep in mind that some users choose to turn off their browser cache, so minifying your JS is always a plus. You should separate your JS into two files: 1) needed on load and 2) not needed on load. Basically, get the necessary code out there first to improve the perceived load time, then load all the other extras (e.g. slideshows, color changers, etc.).
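    That split can be sketched as a tiny deferred loader (file names are assumptions; the document object is passed in so the logic stays plain JavaScript):

    ```javascript
    // Sketch: inject non-critical scripts (slideshows, color changers, etc.)
    // only after the critical code has run. File names are hypothetical.
    function loadDeferred(doc, urls) {
      // Create one <script> element per URL and append it to <body>.
      return urls.map(function (url) {
        var s = doc.createElement('script');
        s.src = url;
        s.async = true; // don't block rendering
        doc.body.appendChild(s);
        return s;
      });
    }

    // In the browser:
    //   window.addEventListener('load', function () {
    //     loadDeferred(document, ['/js/slideshow.js', '/js/color-changer.js']);
    //   });
    ```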

    One last point: make use of Expires headers, since none of the above matters if you don't optimize them. That is what really speeds things up for returning visitors with caching enabled. YSlow is a nice Firefox addon that will help you evaluate your load performance.
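    As a sketch of what "far-future" caching headers look like (the one-year lifetime is a conventional assumption, and a real deployment would usually set this in the web server config rather than in application code):

    ```javascript
    // Far-future caching headers for static, versioned assets.
    // One year is a conventional choice, not a requirement.
    var ONE_YEAR_S = 365 * 24 * 60 * 60; // seconds

    function cacheHeaders(nowMs) {
      // Cache-Control is the modern header; Expires covers older clients.
      return {
        'Cache-Control': 'public, max-age=' + ONE_YEAR_S,
        'Expires': new Date(nowMs + ONE_YEAR_S * 1000).toUTCString()
      };
    }
    ```

    Because the lifetime is so long, the file name should change whenever the content changes (e.g. a hypothetical `app.v2.min.js`), so returning visitors never get stuck with a stale script.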


    To answer your question: reduce HTTP requests, but do your own evaluation of the resulting JS file sizes.

    (To be extreme:) you don't want one 10 MB JS file, or your site will take too long to load; nor do you want 1,000 10 KB files, because of the HTTP overhead. Again, the point is that you want a balance between file size and number of files, and, as I said earlier, to package them into needed-for-load versus nice-to-have.
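    The per-request cost can be made concrete with a back-of-the-envelope model (the ~50 ms overhead figure below is an assumption for illustration, not a measurement):

    ```javascript
    // Toy model of request overhead only: each request pays a fixed cost
    // for latency and headers. With an assumed ~50 ms per request,
    // splitting your code into 1000 files adds ~50 s of pure overhead.
    function overheadMs(fileCount, msPerRequest) {
      return fileCount * msPerRequest;
    }

    // overheadMs(1000, 50) -> 50000 ms of overhead for the many-files extreme
    // overheadMs(2, 50)    -> 100 ms for a two-bundle split
    ```

    Note the model only captures request count; the single-10 MB-file extreme is bad for different reasons (long download and parse, and any one-line change invalidates the whole cached file).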

  • 2021-02-01 06:21

    As to your particular question about how the big players in the industry handle client-side scripts: you can always look and see. stackoverflow.com seems fine relying on Google's hosted version of the jQuery library. Others most decidedly do not...

  • 2021-02-01 06:25

    There's no inherent reason fetching one combined script is faster than fetching the same code split up. Combining helps because browsers limit concurrent downloads, but that limit is greater than one, so a few files can still download in parallel.

    I think the idea should be to load the synchronous UI scripts first, and then the "user activity response" scripts (validation, etc.).

    All else being equal, option B looks like the best one.

  • 2021-02-01 06:32

    I think the best approach is to use a minified 'application.js' file (containing all application-specific JavaScript) and then use a service such as the Google AJAX Libraries API to load jQuery, Prototype, etc.

  • 2021-02-01 06:33

    I think it depends on your site:

    • If your site consists mainly of pages of the same type that need the same scripts, I would go for A).
    • If you have a lot of scripts that differ from one sub-site to another, I would go for B). Combine the most-used scripts into one file; if you have large scripts that are not used on every page, keep them in separate files.
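    The "combine the most-used scripts" step can be as simple as a concatenating build script. A minimal Node.js sketch (a real build would also minify; file names are hypothetical):

    ```javascript
    // Minimal combine step: read each source file and join them with a
    // semicolon so a file that omits its trailing ";" can't break the next.
    var fs = require('fs');

    function combineScripts(paths) {
      return paths.map(function (p) {
        return fs.readFileSync(p, 'utf8');
      }).join(';\n');
    }

    // e.g. fs.writeFileSync('combined.js', combineScripts(['a.js', 'b.js']));
    ```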

    The best way to really know what to do is to test which combination of techniques saves you the most traffic and connections.

    P.S.: I personally do not like the idea of letting other people serve files for my web page: what happens if the CDN fails but your server stays alive? If this is not a problem for you, serve all the libraries you use from a reliable CDN.

  • 2021-02-01 06:35
    • "Combine and minify each script into a massive one and serve it from my own CDN"

      If you have a CDN, what are we talking about? :) You mean server, right?

    • "How do the big guys in the industry handle it?"

      The big guys always use their own servers.

    • "...it isn't too stupid to assume that these already have been downloaded to the clients computer from another page."

      Unfortunately it is. Facts:

      • 40-60% of users arrive with an empty cache
      • browsers' cache size limits are small
      • different versions of libraries are in use, and the cache only helps when they match exactly
      • a resource from a new domain requires a DNS lookup, which is slow
      • plus you need to manage the dependencies yourself