Getting closure-compiler and Node.js to play nice

Asked by 清酒与你 on 2020-12-05 10:04

Are there any projects that used node.js and closure-compiler (CC for short) together?

The official CC recommendation is to compile all code for an application together.

5 Answers
  • 2020-12-05 10:33

    I replaced my old approach with a much simpler one:

    New approach

    • No require() calls for my own app code, only for Node modules
    • I need to concatenate server code to a single file before I can run or compile it
    • Concatenating and compiling are done using a simple Grunt script (a rough sketch follows below)

    The funny thing is that I didn't even have to add an extern for the require() calls; the Google Closure Compiler understands that automagically. I did have to add externs for the Node.js modules that I use.
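
    A minimal sketch of what such a Grunt script could look like, assuming grunt-contrib-concat and a local copy of the offline compiler (compiler.jar). The paths, glob patterns, externs file and compilation level are illustrative, not taken from the actual project:

    // Gruntfile.js: illustrative only, not the script referenced above.
    module.exports = function (grunt) {
      grunt.loadNpmTasks('grunt-contrib-concat');

      grunt.initConfig({
        concat: {
          server: {
            // Files must be concatenated in dependency order, since there are
            // no require() calls between them; this listing is only illustrative.
            src: ['server/lib/**/*.js', 'server/start.js'],
            dest: 'build/server-concat.js'
          }
        }
      });

      // Shell out to the offline Closure Compiler for the compile step.
      grunt.registerTask('compile', function () {
        var done = this.async();
        var cmd = 'java -jar compiler.jar' +
                  ' --compilation_level ADVANCED_OPTIMIZATIONS' +
                  ' --externs externs/node.js' +
                  ' --js build/server-concat.js' +
                  ' --js_output_file build/server.min.js';
        require('child_process').exec(cmd, function (err, stdout, stderr) {
          if (err) { grunt.log.error(stderr); }
          done(!err);
        });
      });

      grunt.registerTask('default', ['concat', 'compile']);
    };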

    Old approach

    As requested by the OP, I will elaborate on my way of compiling Node.js code with the Google Closure Compiler.

    I was inspired by the way bolinfest solved the problem and my solution uses the same principle. The difference is that I made one node.js script that does everything, including inlining modules (bolinfest's solution lets GCC take care of that). This makes it more automated, but also more fragile.

    I just added code comments to every step I take to compile server code. See this commit: https://github.com/blaise-io/xssnake/commit/da52219567b3941f13b8d94e36f743b0cbef44a3

    To summarize:

    1. I start with my main module, the JS file that I pass to Node when I want to run it.
      In my case, this file is start.js.
    2. In this file, using a regular expression, I detect all require() calls, including the assignment part.
      In start.js, this matches one require call: var Server = require('./lib/server.js');
    3. I retrieve the path where the file exists based on the file name, fetch its contents as a string, and remove module.exports assignments within the contents.
    4. Then I replace the require() call from step 2 with the contents from step 3, unless it is a core Node.js module, in which case I add it to a list of core module require() calls that I save for later.
    5. The contents from step 3 will probably contain more require() calls, so I repeat steps 3 and 4 recursively until all require() calls are gone and I'm left with one huge string containing all the code (see the sketch after this list).
    6. Once the recursion has completed, I compile the code using the compiler's REST API.
      You could also use the offline compiler.
      I have externs for every core node.js module. This tool is useful for generating externs.
    7. I prepend the removed core Node.js module require() calls to the compiled code.
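
    A rough, simplified sketch of the inlining described in steps 2 to 5 (not the actual script from the commit above; the regular expression and the list of core modules are deliberately minimal):

    var fs = require('fs');
    var path = require('path');

    var CORE_MODULES = ['assert', 'crypto', 'events', 'fs', 'http', 'net', 'path', 'url', 'util'];
    var coreRequires = [];

    // Matches assignments like: var Server = require('./lib/server.js');
    var REQUIRE_RE = /var\s+\w+\s*=\s*require\('([^']+)'\);/g;

    function inline(file) {
      var code = fs.readFileSync(file, 'utf8')
        // Remove module.exports assignments; everything shares one scope now.
        .replace(/module\.exports\s*=\s*/g, '');

      return code.replace(REQUIRE_RE, function (match, target) {
        if (CORE_MODULES.indexOf(target) !== -1) {
          coreRequires.push(match); // keep core requires, prepend them later
          return '';
        }
        // Recurse into the required file, resolved relative to this one.
        return inline(path.resolve(path.dirname(file), target));
      });
    }

    var flattened = inline('start.js');
    // ...send `flattened` to the compiler, then prepend coreRequires.join('\n').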

    Pre-Compiled code.
    All require calls are removed. All my code is flattened.
    http://pastebin.com/eC2rVMiN

    Post-Compiled code.
    Node.js core require calls have been prepended manually.
    http://pastebin.com/uB8CaejN


    Why you should not do it this way:

    1. It uses regular expressions (not a parser or tokenizer) for detecting require calls, inlining and removing module.exports. This is fragile, as it does not cover all syntax variations.
    2. When inlining, all module code is added to the global namespace. This is against the principles of Node.js, where every file has its own namespace, and it will cause errors if two different modules use the same top-level variable names (see the contrived example after this list).
    3. It does not improve the speed of your code that much, since V8 also performs a lot of code optimizations like inlining and dead code removal.
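
    A contrived, runnable illustration of point 2 (the file names and variables are made up):

    // Two "modules" that each declare a top-level `counter` work fine in Node,
    // where every file gets its own scope, but collide once naively inlined.

    // Inlined contents of a.js:
    var counter = 0;
    var nextId = function () { return ++counter; };

    // Inlined contents of b.js (same top-level name, clobbering a.js's):
    var counter = 100;
    var countdown = function () { return --counter; };

    console.log(nextId());    // expected 1, but prints 101: both functions now
    console.log(countdown()); // operate on the single inlined `counter`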

    Why you should:

    1. Because it does work when you have consistent code.
    2. It will detect errors in your server code when you enable verbose warnings.

  • 2020-12-05 10:34

    Option 4: Don't use closure compiler.

    People in the Node community don't tend to use it. You don't need to minify Node.js source code; that's silly.

    There's simply no good use for minification.

    As for the performance benefits of closure, I personally doubt it actually makes your programs faster.

    And of course there's a cost: debugging compiled JavaScript is a nightmare.

  • 2020-12-05 10:45

    Closure Library on Node.js in 60 seconds.

    It's supported; see https://code.google.com/p/closure-library/wiki/NodeJS.

  • 2020-12-05 10:48

    I have been using the Closure Compiler with Node for a project I haven't released yet. It has taken a bit of tooling, but it has helped catch many errors and has a pretty short edit-restart-test cycle.

    First, I use plovr (which is a project that I created and maintain) in order to use the Closure Compiler, Library, and Templates together. I write my Node code in the style of the Closure Library, so each file defines its own class or collection of utilities (like goog.array).
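
    An illustrative, made-up example of a file written in that style (not code from the project described here):

    // lib/myapp/cookiejar.js: one class per file, Closure Library conventions.
    goog.provide('myapp.CookieJar');

    goog.require('goog.array');

    /**
     * Small utility class holding raw cookie strings.
     * @constructor
     */
    myapp.CookieJar = function() {
      /** @private {!Array.<string>} */
      this.cookies_ = [];
    };

    /** @param {string} cookie */
    myapp.CookieJar.prototype.add = function(cookie) {
      if (!goog.array.contains(this.cookies_, cookie)) {
        this.cookies_.push(cookie);
      }
    };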

    The next step is to create a bunch of externs files for the Node functions you want to use. I published some of these publicly at:

    https://github.com/bolinfest/node-google-closure-latitude-experiment/tree/master/externs/node/v0.4.8

    Though ultimately, I think that this should be a more community-driven thing because there are a lot of functions to document. (It's also annoying because some Node functions have optional middle arguments rather than last arguments, making the type annotations complicated.) I haven't started this movement myself because it's possible that we could do some work with the Closure Compiler to make this less awkward (see below).
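
    For reference, a hand-written sketch of what a fragment of such an externs file for the http module might look like (the types and signatures here are illustrative, not copied from the repository above):

    /**
     * @fileoverview Illustrative externs fragment for Node's http module.
     * @externs
     */

    /** @const */
    var http = {};

    /** @constructor */
    http.ServerRequest = function() {};

    /** @constructor */
    http.ServerResponse = function() {};

    /**
     * @param {function(!http.ServerRequest, !http.ServerResponse)=} opt_requestListener
     * @return {!http.Server}
     */
    http.createServer = function(opt_requestListener) {};

    /** @constructor */
    http.Server = function() {};

    /**
     * @param {number} port
     * @param {string=} opt_hostname
     * @param {function()=} opt_callback
     */
    http.Server.prototype.listen = function(port, opt_hostname, opt_callback) {};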

    Say you have created the externs file for the Node namespace http. In my system, I have decided that anytime I need http, I will include it via:

    var http = require('http');
    

    I do not, however, include that require() call in my code. Instead, I use the output-wrapper feature of the Closure Compiler to prepend all of the require()s at the start of the file, which, declared in plovr, looks like this in my current project:

    "output-wrapper": [
      // Because the server code depends on goog.net.Cookies, which references the
      // global variable "document" when instantiating goog.net.cookies, we must
      // supply a dummy global object for document.
      "var document = {};\n",
    
      "var bee = require('beeline');\n",
      "var crypto = require('crypto');\n",
      "var fs = require('fs');\n",
      "var http = require('http');\n",
      "var https = require('https');\n",
      "var mongodb = require('mongodb');\n",
      "var nodePath = require('path');\n",
      "var nodeUrl = require('url');\n",
      "var querystring = require('querystring');\n",
      "var SocketIo = require('socket.io');\n",
      "%output%"
    ],
    

    In this way, my library code never calls Node's require(), but the Compiler tolerates the uses of things like http in my code because the Compiler recognizes them as externs. As they are not true externs, they have to be prepended as I described.

    Ultimately, after talking about this on the discussion list, I think the better solution is to have a new type annotation for namespaces that would look something like:

    goog.scope(function() {
    
        /** @type {~NodeHttpNamespace} */
        var http = require('http');
    
        // Use http throughout.
    
    });
    

    In this scenario, an externs file would define the NodeHttpNamespace such that the Closure Compiler would be able to typecheck properties on it using the externs file. The difference here is that you could name the return value of require() whatever you wanted because the type of http would be this special namespace type. (Identifying a "jQuery namespace" for $ is a similar issue.) This approach would eliminate the need to name your local variables for Node namespaces consistently, and would eliminate the need for that giant output-wrapper in the plovr config.

    But that was a digression...once I have things set up as described above, I have a shell script that:

    1. Uses plovr to build everything in RAW mode.
    2. Runs node on the file generated by plovr.

    Using RAW mode results in a large concatenation of all the files (though it also takes care of translating Soy templates and even CoffeeScript to JavaScript). Admittedly, this makes debugging a pain because the line numbers are nonsense, but it has been working well enough for me so far. All of the checks performed by the Closure Compiler have made it worth it.
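
    A minimal sketch of that two-step cycle, written here as a Node script rather than a shell script; the paths and config name are illustrative, and the plovr config is assumed to set "mode": "RAW":

    // build-and-run.js: sketch of the two-step cycle described above.
    var execSync = require('child_process').execSync;

    // 1. Build with plovr; in RAW mode this is essentially a dependency-ordered
    //    concatenation (plus Soy/CoffeeScript translation), written to stdout.
    execSync('java -jar plovr.jar build server-config.js > build/server-raw.js');

    // 2. Run Node on the file plovr generated.
    execSync('node build/server-raw.js', { stdio: 'inherit' });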

  • 2020-12-05 10:53

    The svn HEAD of the Closure Compiler seems to have support for AMD.
