Big O of JavaScript arrays

执笔经年 2020-11-28 01:40

Arrays in JavaScript are very easy to modify by adding and removing items. This somewhat masks the fact that in most languages arrays are fixed-size and require complex operations to resize. What is the time complexity (Big O) of common operations on JavaScript arrays?

2 Answers
  • 2020-11-28 02:05

NOTE: While this answer was correct in 2012, engines use very different internal representations for both objects and arrays today. It may no longer be accurate.

In contrast to most languages, which implement arrays with, well, arrays, in JavaScript Arrays are objects, and values are stored in a hashtable, just like regular object properties. As such:

    • Access - O(1)
    • Appending - Amortized O(1) (sometimes resizing the hashtable is required; usually only insertion is required)
    • Prepending - O(n) via unshift, since it requires reassigning all the indexes
    • Insertion - Amortized O(1) if the value does not exist; O(n) if existing values must be shifted (e.g., using splice).
    • Deletion - Amortized O(1) to remove a value, O(n) if you want to reassign indices via splice.
    • Swapping - O(1)

    In general, setting or unsetting any key in a dict is amortized O(1), and the same goes for arrays, regardless of what the index is. Any operation that requires renumbering existing values is O(n) simply because you have to update all the affected values.
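    The costs listed above can be sketched with a short, engine-agnostic example (the Big O annotations reflect the claims in this answer, not a measurement):

    ```javascript
    // Illustrative sketch of the listed costs (not engine-specific).
    const arr = [10, 20, 30];

    // Access by index: O(1)
    console.log(arr[1]); // 20

    // Appending: amortized O(1)
    arr.push(40);              // [10, 20, 30, 40]

    // Prepending: O(n) — every existing element gets a new index
    arr.unshift(5);            // [5, 10, 20, 30, 40]

    // Insertion mid-array via splice: O(n) — later indexes shift up
    arr.splice(2, 0, 15);      // [5, 10, 15, 20, 30, 40]

    // Deletion via splice: O(n) — later indexes shift back down
    arr.splice(1, 1);          // [5, 15, 20, 30, 40]

    // Swapping two elements: O(1)
    [arr[0], arr[1]] = [arr[1], arr[0]]; // [15, 5, 20, 30, 40]
    ```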

  • 2020-11-28 02:12

    guarantee

    There is no specified time complexity guarantee for any array operation. How arrays perform depends on the underlying data structure the engine chooses. Engines might also have different representations, and switch between them depending on certain heuristics. The initial array size might or might not be such a heuristic.

    reality

    For example, V8 uses (as of today) both hashtables and array lists to represent arrays. It also has various different representations for objects, so arrays and objects cannot be compared. Therefore array access is always better than O(n), and might even be as fast as a C++ array access. Appending is O(1), unless you reach the capacity of the data structure and it has to be scaled (which is O(n)). Prepending is worse. Deletion can be even worse if you do something like delete array[index] (don't!), as that might force the engine to change its representation.
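    The warning about delete can be seen directly: it leaves a hole rather than removing a slot, turning the array sparse (this snippet only shows the observable behavior; the internal representation change is an engine detail):

    ```javascript
    // delete leaves a hole — the array becomes sparse.
    const a = [1, 2, 3, 4];
    delete a[1];
    console.log(a.length);   // 4 — length is unchanged
    console.log(1 in a);     // false — index 1 no longer exists

    // splice actually removes the slot and renumbers the rest.
    const b = [1, 2, 3, 4];
    b.splice(1, 1);
    console.log(b);          // [1, 3, 4]
    console.log(b.length);   // 3
    ```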

    advice

    Use arrays for numeric data structures. That's what they are meant for. That's what engines will optimize them for. Avoid sparse arrays (or if you have to use them, expect worse performance). Avoid arrays with mixed datatypes (as that makes internal representations more complex).
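    A short sketch of that advice in practice. The comments mention V8's "element kinds" (e.g. packed integers vs. holey generic values), which V8's own documentation describes; the exact transitions are an engine detail and may change:

    ```javascript
    // Dense, numeric-only — the case engines optimize best
    // (V8 keeps this as a packed integer array internally):
    const nums = [];
    for (let i = 0; i < 5; i++) nums.push(i);

    // Sparse — writing far past the end creates holes, which
    // forces slower, hashtable-like access paths:
    const sparse = [];
    sparse[1000] = 1;

    // Mixed types — forces a more generic internal representation:
    const mixed = [1, 'two', { three: 3 }];
    ```

    None of this changes correctness, only performance; the advice is simply to keep arrays dense and homogeneous when speed matters.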

    If you really want to optimize for a certain engine (and version), check its source code for the absolute answer.
