Working with arrays in V8 (performance issue)

清酒与你 2021-01-30 22:40

I tried the following code (it shows similar results in Google Chrome and Node.js):

var t = new Array(200000);
console.time('wtf');
for (var i = 0; i < 200000; ++i) {
    t.push(Math.random());
}
console.timeEnd('wtf');
2 Answers
  • 2021-01-30 23:35

    Update [2020]

    As of summer 2020, v8 has changed significantly.

    It used to default to dictionary-mode if initial size > some_number_that_is_roughly_10k. That has changed.

    Judging from the Array constructor code, the SetLengthWouldNormalize function and the kMaxFastArrayLength constant, it can now support an almost arbitrarily large number of elements (currently capped at 32 million) before resorting to dictionary mode.
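
    To make the change concrete, here is a rough timing sketch (the size and variable names are illustrative, not taken from the original question): on a current V8, preallocating well above the old ~10k threshold no longer forces dictionary mode, so both fills below should land in the same ballpark.

    var N = 1000000; // well above the old ~10k limit, far below the ~32 million cap

    console.time('preallocated');
    var pre = new Array(N);
    for (var i = 0; i < N; ++i) pre[i] = i;
    console.timeEnd('preallocated');

    console.time('grown');
    var grown = [];
    for (var i = 0; i < N; ++i) grown.push(i);
    console.timeEnd('grown');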

    Note, however, that there are many more considerations at play now, as V8 optimization has become ever more complicated. This official blog post from 2017 explains that V8 distinguishes between 21 different kinds of arrays (or rather, array element kinds), and that, to quote:

    "each of which comes with its own set of possible optimizations"

    I would strongly recommend:

    • starting with that blog post
    • always making sure your indexing is good (a single out-of-bounds access can put your array into one of several kinds of "slow mode"; see the sketch after this list)
    • learning how to use the built-in Node profiler tools
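
    Regarding the indexing point above, here is a minimal sketch of the element-kind transitions that blog post describes (the kind names are V8-internal; the values are purely illustrative). To actually inspect the kinds, run node with --allow-natives-syntax and call %DebugPrint(a).

    var a = [1, 2, 3];   // PACKED_SMI_ELEMENTS (small integers, no holes)
    a.push(4.5);         // -> PACKED_DOUBLE_ELEMENTS
    a.push('x');         // -> PACKED_ELEMENTS (generic)

    var b = [1, 2, 3];
    b[10] = 4;           // writing past the end punches holes -> HOLEY_SMI_ELEMENTS
    // Element kinds only ever transition towards more generic/holey kinds,
    // never back, so one sloppy write can permanently pessimize an array.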

    Original Post

    As you probably already know, if you pre-allocate an array with > 10000 elements in Chrome or Node (or more generally, in V8), it falls back to dictionary mode, making things uber-slow.
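
    If you want to observe that fallback directly rather than infer it from timings, one option is a sketch like the following; note it relies on V8-internal test intrinsics that are only exposed under node --allow-natives-syntax and are not a stable API.

    // Run with: node --allow-natives-syntax check.js
    var big = new Array(20000);   // above the old ~10k threshold
    var small = new Array(1000);  // below it

    // On the old V8 described in this post the first line printed true;
    // on a current V8 it no longer does.
    console.log(%HasDictionaryElements(big));
    console.log(%HasDictionaryElements(small));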

    Thanks to some of the comments in this thread, I was able to track things down to object.h's kInitialMaxFastElementArray.

    I then used that information to file an issue in the v8 repository, which is now starting to gain some traction, but it will still take a while. And I quote:

    I hope we'll be able to do this work eventually. But it's still probably a ways away.

  • 2021-01-30 23:41

    If you preallocate, do not use .push, because you will create a sparse array backed by a hash table. You can preallocate sparse arrays up to 99999 elements, which will be backed by a C array; after that, it's a hash table.

    With the second array you are adding elements in a contiguous way starting from 0, so it will be backed by a real C array, not a hash table.
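
    In other words, either of these patterns keeps the backing store contiguous (a sketch with illustrative sizes):

    var n = 200000;

    // Fast: preallocate, then fill by index from 0 with no gaps.
    var pre = new Array(n);
    for (var i = 0; i < n; ++i) pre[i] = Math.random();

    // Also fast: start empty and push; indices still run 0..n-1 contiguously.
    var grown = [];
    for (var i = 0; i < n; ++i) grown.push(Math.random());

    // The anti-pattern from the question: preallocating AND then pushing appends
    // at index n and beyond, leaving n holes at the front of the array.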

    So roughly:

    If your array indices go nicely from 0 to Length-1, with no holes, then it can be represented by a fast C array. If you have holes in your array, then it will be represented by a much slower hash table. The exception is that if you preallocate an array of size < 100000, then you can have holes in the array and still get backed by a C array:

    var a = new Array(N);

    // If N < 100000, this will not make the array a hashtable:
    a[50000] = "sparse";

    var b = []; // Or new Array(N), with N >= 100000
    // b will be backed by a hash table:
    b[50000] = "Sparse";
    // b.push("Sparse") would behave roughly the same as the line above
    // if you had used new Array(N) with N > 0
    