JavaScript, Node.js: is Array.forEach asynchronous?

时光说笑 2020-11-22 10:47

I have a question regarding the native Array.forEach implementation of JavaScript: Does it behave asynchronously? For example, if I call:

[many many elements].forEach(function () {lots of work to do});

10 Answers
  • 2020-11-22 11:27

    Edit 2018-10-11: It looks like there is a good chance the standard described below will not go through; consider pipelining as an alternative (it does not behave exactly the same way, but the methods could be implemented in a similar manner).

    This is exactly why I am excited about ES7: in the future you will be able to do something like the code below (some of the specs are not complete, so use with caution; I will try to keep this up to date). Basically, using the new :: bind operator, you will be able to run a method on an object as if the object's prototype contained the method, e.g. [Object]::[Method] where normally you would call [Object].[ObjectsMethod].

    Note: to do this today (24-July-16) and have it work in all browsers, you will need to transpile your code for the following functionality: import/export, arrow functions, promises, async/await and, most importantly, function bind. The code below could be modified to use only function bind if necessary; all of this functionality is neatly available today by using Babel.

    YourCode.js (where 'lots of work to do' must simply return a promise, resolving it when the asynchronous work is done.)

    import { asyncForEach } from './ArrayExtensions.js';
    
    await [many many elements]::asyncForEach(() => lots of work to do);
    

    ArrayExtensions.js

    export function asyncForEach(callback)
    {
        return Promise.resolve(this).then(async (ar) =>
        {
            for(let i=0;i<ar.length;i++)
            {
                await callback.call(ar, ar[i], i, ar);
            }
        });
    };
    
    export function asyncMap(callback)
    {
        return Promise.resolve(this).then(async (ar) =>
        {
            const out = [];
            for(let i=0;i<ar.length;i++)
            {
                out[i] = await callback.call(ar, ar[i], i, ar);
            }
            return out;
        });
    };
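
    Since the bind operator is still only a proposal, the same helpers can be used today without it: a::f(x) is just sugar for f.call(a, x). A minimal sketch of the equivalent call (doLotsOfWork is a hypothetical async function standing in for "lots of work to do"):

    import { asyncForEach } from './ArrayExtensions.js';
    
    const elements = [/* many many elements */];
    // same as elements::asyncForEach(...), written with .call instead of the bind operator
    await asyncForEach.call(elements, async (item) => {
        await doLotsOfWork(item); // hypothetical: returns a promise that resolves when the work is done
    });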
    
    0 讨论(0)
  • 2020-11-22 11:32

    There is a package on npm for easy asynchronous forEach loops.

    var forEachAsync = require('futures').forEachAsync;
    
    // waits for one request to finish before beginning the next
    forEachAsync(['dogs', 'cats', 'octocats'], function (next, element, index, array) {
      getPics(element, next);
    // then, after all of the elements have been handled,
    // the final callback fires to let you know it's all done
    }).then(function () {
      console.log('All requests have finished');
    });
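
    Here getPics is assumed to be a callback-style function (not shown in the answer) that calls next once its asynchronous work for that element is done, for example:

    // hypothetical getPics: simulate an asynchronous request per element
    function getPics(element, next) {
        setTimeout(function () {
            console.log('fetched pics for ' + element);
            next(); // tells forEachAsync to move on to the next element
        }, 100);
    }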
    

    There is also another variation, forAllAsync.

  • 2020-11-22 11:45

    No, it is blocking. Have a look at the specification of the algorithm.

    However, a perhaps easier-to-understand implementation is given on MDN:

    if (!Array.prototype.forEach)
    {
      Array.prototype.forEach = function(fun /*, thisp */)
      {
        "use strict";
    
        if (this === void 0 || this === null)
          throw new TypeError();
    
        var t = Object(this);
        var len = t.length >>> 0;
        if (typeof fun !== "function")
          throw new TypeError();
    
        var thisp = arguments[1];
        for (var i = 0; i < len; i++)
        {
          if (i in t)
            fun.call(thisp, t[i], i, t);
        }
      };
    }
    

    If you have to execute a lot of code for each element, you should consider using a different approach:

    function processArray(items, process) {
        var todo = items.concat(); // work on a copy so the original array is untouched
    
        setTimeout(function step() {
            process(todo.shift());
            if (todo.length > 0) {
                // yield back to the event loop for 25 ms between items
                setTimeout(step, 25);
            }
        }, 25);
    }
    

    and then call it with:

    processArray([many many elements], function () {lots of work to do});
    

    This version is non-blocking. The example is taken from High Performance JavaScript.

    Another option might be web workers.
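
    A minimal web-worker sketch (browser only; worker.js is an assumed file name and doLotsOfWork is a hypothetical CPU-heavy function):

    // main.js — hand the heavy loop to a worker so the UI thread stays responsive
    var worker = new Worker('worker.js');
    worker.onmessage = function (e) {
        console.log('worker finished, result:', e.data);
    };
    worker.postMessage([/* many many elements */]);
    
    // worker.js — runs on a separate thread
    self.onmessage = function (e) {
        var result = e.data.map(function (item) {
            return doLotsOfWork(item); // hypothetical CPU-heavy function
        });
        self.postMessage(result);
    };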

  • 2020-11-22 11:47

    There is a common pattern for doing a really heavy computation in Node that may be applicable to you...

    Node is single-threaded (as a deliberate design choice, see What is Node.js?); this means that it can only utilize a single core. Modern boxes have 8, 16, or even more cores, so this could leave 90+% of the machine idle. The common pattern for a REST service is to fire up one node process per core, and put these behind a local load balancer like http://nginx.org/.
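
    For illustration, the one-process-per-core idea can be sketched with Node's built-in cluster module (a minimal sketch, not the nginx setup described above):

    var cluster = require('cluster');
    var http = require('http');
    var numCPUs = require('os').cpus().length;
    
    if (cluster.isMaster) {
        // fork one worker process per core
        for (var i = 0; i < numCPUs; i++) {
            cluster.fork();
        }
    } else {
        // each worker runs its own server; connections are distributed among them
        http.createServer(function (req, res) {
            res.end('handled by pid ' + process.pid);
        }).listen(8000);
    }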

    Forking a child - For what you are trying to do, there is another common pattern, forking off a child process to do the heavy lifting. The upside is that the child process can do heavy computation in the background while your parent process is responsive to other events. The catch is that you can't / shouldn't share memory with this child process (not without a LOT of contortions and some native code); you have to pass messages. This will work beautifully if the size of your input and output data is small compared to the computation that must be performed. You can even fire up a child node.js process and use the same code you were using previously.

    For example:

    var child_process = require('child_process');
    
    function run_in_child(array, cb) {
        // run the heavy work in a separate node process so the parent's event loop stays free
        var child = child_process.exec('node libfn.js', function(err, stdout, stderr) {
            var output = JSON.parse(stdout);
            cb(err, output);
        });
        // hand the input to the child as JSON on stdin
        child.stdin.write(JSON.stringify(array), 'utf8');
        child.stdin.end();
    }
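
    The libfn.js script is not shown in the answer; a minimal sketch of what it might contain (read the JSON array from stdin, do the work, write the JSON result to stdout for the parent to parse) is:

    // libfn.js (hypothetical child script)
    var input = '';
    process.stdin.setEncoding('utf8');
    process.stdin.on('data', function (chunk) { input += chunk; });
    process.stdin.on('end', function () {
        var array = JSON.parse(input);
        var result = array.map(function (item) {
            return doLotsOfWork(item); // hypothetical CPU-heavy function
        });
        process.stdout.write(JSON.stringify(result));
    });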
    
  • 2020-11-22 11:53

    If you need an asynchronous-friendly version of Array.forEach and similar methods, they are available in the Node.js 'async' module: http://github.com/caolan/async. As a bonus, this module also works in the browser.

    async.each(openFiles, saveFile, function(err){
        // if any of the saves produced an error, err would equal that error
    });
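
    async.each calls the iterator with (item, callback), so saveFile here is assumed to be an error-first, callback-style function, for example:

    var fs = require('fs');
    
    // hypothetical saveFile: assumes each item has `path` and `contents` properties
    function saveFile(file, callback) {
        fs.writeFile(file.path, file.contents, callback); // callback(err) signals async.each
    }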
    
  • 2020-11-22 11:53

    Array.forEach is meant for computing things, not for waiting, and there is nothing to be gained by making computations asynchronous in an event loop (web workers add multiprocessing, if you need multi-core computation). If you want to wait for multiple tasks to finish, use a counter, which you can wrap in a semaphore class.
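
    A minimal sketch of such a counter (a hypothetical helper, not from any particular library):

    // call `done` only after `count` tasks have each reported completion
    function countdown(count, done) {
        return function () {
            count -= 1;
            if (count === 0) done();
        };
    }
    
    var finished = countdown(3, function () { console.log('all three tasks finished'); });
    startTaskA(finished); // hypothetical async tasks that invoke finished() when complete
    startTaskB(finished);
    startTaskC(finished);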
