Optimizing JavaScript and CSS requests

Backend · Unresolved · 11 answers · 1786 views
Asked by 佛祖请我去吃肉 on 2020-12-31 17:21

I need to optimize the loading speed of several existing websites. One of the issues I have is the number of requests per page: the websites load 7 or more different JavaScript and CSS files on each page.

11 Answers
  • 2020-12-31 17:31

    Like some others have said, put the scripts that are used on more than one page together into a single main.js, and then add page-specific files where needed: home.js, another_page.js, etc.

    The only thing I really wanted to add was that for libraries like jQuery, you should use something like Google's Libraries API.

    • If your user has visited a different site which also uses Google's servers for the libraries, then they'll arrive with a primed cache! Win!
    • I'm going to go out on a limb here and bet that Google's servers are faster than yours.
    • Because the requests are going to different servers, clients can simultaneously process two requests to the Google servers (e.g. jQuery & jQuery UI) as well as two requests to your servers (main.js & main.css).

    Oh, and finally -- don't forget to turn on gzip compression on your server!
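
    To illustrate, here is a minimal sketch of that setup (the library versions and file paths are examples, not prescriptions):

    <!-- shared libraries from Google's CDN; a user who has visited another
         site that uses it may already have these cached -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.2/jquery-ui.min.js"></script>

    <!-- your own combined files, gzipped and served from your server -->
    <link rel="stylesheet" href="/css/main.css">
    <script src="/js/main.js"></script>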

  • 2020-12-31 17:33

    Minification and combining JS are only part of the battle. The placement of the files is more important. Obtrusive JavaScript should be the last thing loaded on the page, as it will halt the page load until it has finished loading.

    Consolidate what you can; beyond that, namespacing and closures can help keep the global scope clean of your function calls until they're needed.
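
    As a rough sketch of that pattern (the MYSITE namespace and the gallery example are purely illustrative):

    // one illustrative global object instead of many global functions
    var MYSITE = MYSITE || {};

    MYSITE.gallery = (function () {
        // private state, hidden inside the closure rather than in global scope
        var slideIndex = 0;

        function next() {
            slideIndex += 1;
            // ... update the page here ...
        }

        // expose only what other code actually needs to call
        return { next: next };
    }());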

    There are tools that can measure page load speed: the Net panel in Firebug and YSlow are both handy for debugging it.

    Good luck and happy javascripting!

  • 2020-12-31 17:34

    As always: it depends. The bigger the page-specific files are, the more sense it makes to keep them separate. If they're not big (think 10 kB minified), it probably makes more sense to join, minify and compress them, so you can save some requests and rely on caching.

  • 2020-12-31 17:36

    One thing you should do is optimize your .htaccess file to compress and cache files properly:

    # compress text, html, javascript, css, xml:
    AddOutputFilterByType DEFLATE text/plain
    AddOutputFilterByType DEFLATE text/html
    AddOutputFilterByType DEFLATE text/xml
    AddOutputFilterByType DEFLATE text/css
    AddOutputFilterByType DEFLATE application/xml
    AddOutputFilterByType DEFLATE application/xhtml+xml
    AddOutputFilterByType DEFLATE application/rss+xml
    AddOutputFilterByType DEFLATE application/javascript
    AddOutputFilterByType DEFLATE application/x-javascript

    # cache media files for 30 days (note the escaped dot and the $ anchor)
    <FilesMatch "\.(flv|gif|jpg|jpeg|png|ico|swf|js|css|pdf)$">
        Header set Cache-Control "max-age=2592000"
    </FilesMatch>
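
    Note that the DEFLATE lines require mod_deflate and the Header directive requires mod_headers, so both modules must be enabled on the server; a max-age of 2592000 seconds is 30 days.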
    
  • 2020-12-31 17:37

    In general, I concatenate and minify all the scripts on the site and serve only two requests - one for JS and one for CSS. The exception to that rule is if a certain page has a significantly sized script that is only run there - in that case it should be loaded separately.

    Load all the JS scripts at the bottom of your page to prevent them from blocking the page load.
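
    A minimal sketch of that layout (the file names are illustrative):

    <head>
        <!-- the single combined stylesheet stays in the head -->
        <link rel="stylesheet" href="/css/site.min.css">
    </head>
    <body>
        <!-- ... page content ... -->

        <!-- the single combined, minified script goes last, just before
             the closing body tag, so it cannot block rendering -->
        <script src="/js/site.min.js"></script>
    </body>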

  • 2020-12-31 17:39

    Depending on your development environment, you might consider automating the process. It is a fair bit more work up front, but I found it has been worth it in the long run. How you would go about doing that depends largely on your project and environment. There are several options, but I will explain (high level) what we did.

    In our case, we have several ASP.NET based websites. I wrote an ASP.NET control that simply contains a list of static dependencies - CSS and JavaScript. Each page lists what it needs. We have some pages with 7 or 8 JS dependencies and 4 or 5 CSS dependencies, depending on what shared libraries/controls are being used. The first time the page loads, I create a new background worker thread that evaluates all the static resources, combines them into a single file (one for CSS, one for JS), and then minifies them using the Yahoo YUI Compressor (which can handle both JS and CSS). I then output the file into a new "merged" or "optimized" directory.

    The next time someone loads that page, the ASP.NET control sees the optimized version of the resource, and loads that instead of the list of 10-12 other resources.

    Furthermore, it is designed to only load the optimized resources when the project is running in "RELEASE" mode (as opposed to DEBUG mode inside Visual Studio). This is fantastic because we can keep different classes, pages, controls, etc. separate for organization (and sharing across projects), but we still get the benefit of optimized loading. It is a completely transparent process that requires no additional attention (once it is working). We even went back and added a condition where the non-optimized resources were loaded if "debug=true" was specified in the query string of the URL for cases where you need to verify/replicate bugs in production.
