What the hell are JS arrays?
An array is a data structure consisting of a collection of elements (values or variables), of the same memory size, each identified by at least one array index or key (Wikipedia).
As a data structure, the array demonstrates some important traits. Notably, its access time is O(1), i.e. it always takes constant time to access any element by index. This efficiency comes from a simple mathematical operation: given the starting address of the array, the size of each element, and the index, we can directly compute the memory address of any element. Specifically, we calculate it by adding element size × index to the starting address. This is possible because all elements in an array occupy the same amount of memory.
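To make the arithmetic concrete, here is a tiny sketch with made-up numbers (JS has no pointer arithmetic, so the addresses are purely illustrative):

```js
// All values here are hypothetical, just to show the computation.
const baseAddress = 0x1000; // where the array starts in memory
const elementSize = 8;      // bytes per element, e.g. a 64-bit number
const index = 3;

// One multiplication and one addition, regardless of array size:
const elementAddress = baseAddress + elementSize * index;
console.log(elementAddress.toString(16)); // "1018"
```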
Now, let's look at the JS "array":
```js
const arr = [1, 2, 3];
```
So far so good! It looks like an array and behaves like one. How about this one?
```js
const arr2 = [1, 2, "oops", false, {}, 4, ["what ever am I"]];
```
This is... not what arrays are supposed to look like. Its access time is hard to predict, because we can't do the simple multiplication anymore.
When an array in JS contains at least one element of a mismatched type, its behavior changes. In my test, access by index in a uniform array is ~2.5x faster. Why does this happen?
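For reference, here is a minimal sketch of the kind of benchmark I mean. The helper below is my guess at a plausible definition of `createUniformNumericArray`, which is used later in this article; absolute timings will differ across machines and V8 versions:

```js
// A minimal benchmark sketch for Node.js. Assumed size: large enough to measure.
const arraySize = 10_000_000;

function createUniformNumericArray(size) {
  return Array.from({ length: size }, (_, i) => i); // PACKED_SMI_ELEMENTS
}

const uniform = createUniformNumericArray(arraySize);
const mixed = createUniformNumericArray(arraySize);
mixed[0] = "oops"; // a single string demotes the whole array to PACKED_ELEMENTS

function touchAll(arr) {
  let count = 0;
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] !== null) count++; // index access is what we're measuring
  }
  return count;
}

for (const [label, arr] of [["uniform", uniform], ["mixed", mixed]]) {
  const start = process.hrtime.bigint();
  touchAll(arr);
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${elapsedMs.toFixed(1)} ms`);
}
```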
Elements kinds lattice in V8
In this article, I'm mostly talking about the V8 JS engine and its quirks, since it's the default for Node.js.
An array in V8 starts its life in a "fast" state, i.e. it behaves like a usual array. It later gets deoptimized if V8 detects an element of a different type. A deoptimized array behaves essentially like a glorified Object with numeric keys. There are four stages of deoptimization in V8, based on the elements kinds in the array (a demo follows the list):
- PACKED_SMI_ELEMENTS: Array contains only small integers ("SMI" = "small integer", at most 31 or 32 bits depending on the platform)
- PACKED_DOUBLE_ELEMENTS: Only numbers (including floats)
- PACKED_ELEMENTS: Mixed types but densely packed
- HOLEY_ELEMENTS: Arrays with gaps/holes
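You can watch these transitions happen yourself (a sketch; `%DebugPrint` is an internal V8 helper exposed by the `--allow-natives-syntax` flag, and its output format is not stable across versions; the filename is arbitrary):

```js
// Run with: node --allow-natives-syntax kinds.js
const a = [1, 2, 3];   // PACKED_SMI_ELEMENTS
%DebugPrint(a);

a.push(3.14);          // floats appear -> PACKED_DOUBLE_ELEMENTS
%DebugPrint(a);

a.push("oops");        // mixed types -> PACKED_ELEMENTS
%DebugPrint(a);

a[10] = 1;             // indices 5..9 are now holes -> HOLEY_ELEMENTS
%DebugPrint(a);
```

The "elements kind" line in each dump shows the current stage.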
Unfortunately, deoptimization is a one-way ticket: once an array is deoptimized, there is no going back (with a sole exception, more on that later). Let's check together:
```js
// createUniformNumericArray and arraySize are the same as in the sketch above
const uniformArray1 = createUniformNumericArray(arraySize);
const uniformArray2 = createUniformNumericArray(arraySize);

uniformArray2.push("oops"); // transitions the array to PACKED_ELEMENTS...
uniformArray2.pop();        // ...and popping the string does NOT transition back
```
Even though the arrays end up with identical content, access to the second one is significantly slower in my test.
Holey arrays
A holey array in JavaScript is an array that has one or more missing (unset) indices, creating holes. These holes are different from undefined values and occur when elements are omitted or deleted.
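A quick illustration of the ways holes appear (a sketch):

```js
const holey = [1, , 3];          // elision leaves a hole at index 1
const alsoHoley = [1, 2, 3];
delete alsoHoley[1];             // delete leaves a hole too, not undefined

console.log(1 in holey);         // false — the index simply doesn't exist
console.log(holey[1]);           // undefined when read, but it's still a hole
console.log([1, undefined, 3]);  // NOT holey: the slot exists and holds undefined
```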
In the context of optimization/deoptimization, they are a bit tricky. Sometimes V8 _somehow_ still optimizes them (or decides not to deoptimize them in the first place).
```js
const uniformArray1 = createUniformNumericArray(arraySize);
const uniformArray2 = [...uniformArray1]; // a packed copy
delete uniformArray2[10];                 // punch a hole into it
```
To my great surprise, `uniformArray2` access is not slower!
Currently (June '25) I'm aware of a single case where an array can be optimized again: it can happen with `Array.prototype.fill` (link). But `fill` isn't used in my example, so I'm not sure what the reason for this behavior is.
String arrays
String array performance looks pretty bleak: they start as `PACKED_ELEMENTS` by default. In the following test, an array of strings performs even worse than a mixed-values array, because V8 still tries to optimize the latter but bails on the former. In my test, uniform string array access is ~1.5x slower than the mixed one. Explicitly marking the strings as constants doesn't change anything.
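My assumption of what such a test could look like (the setup, sizes, and helper are mine for illustration, mirroring the earlier benchmark sketch):

```js
// Sketch: uniform strings vs mixed values; timings are machine/version dependent.
const size = 10_000_000;
const stringArray = Array.from({ length: size }, (_, i) => "s" + (i % 100));
const mixedArray = Array.from({ length: size }, (_, i) => (i % 2 ? "s" + i : i));

function touchAll(arr) {
  let count = 0;
  for (let i = 0; i < arr.length; i++) if (arr[i] !== null) count++;
  return count;
}

for (const [label, arr] of [["strings", stringArray], ["mixed", mixedArray]]) {
  const start = process.hrtime.bigint();
  touchAll(arr);
  console.log(label, Number(process.hrtime.bigint() - start) / 1e6, "ms");
}
```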
It's interesting how V8's heuristics can lead to unexpected performance effects. The fact that "trying and partially failing" can be faster than "not trying at all" is an example of the complexity hidden beneath JavaScript's simple syntax.
What about other operations?
I didn't do extensive testing for all possible cases, but I have another interesting find: the search operation works faster on mixed arrays. In my test on a 10M-element array, search in a mixed array was roughly 20% faster. It looks counter-intuitive, but actually makes sense: the mixed array is, as mentioned before, basically an Object, and an Object is basically a hash map with constant search time. I wonder how expensive the conversion from a plain array to a hash map is, since hash generation isn't free.
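The original measurement setup isn't shown, so here is my guess at what such a search test looks like, using `includes` (a sketch under those assumptions):

```js
// Sketch: linear search on a uniform vs a mixed array. The needle sits at
// the very end to force a full scan (the worst case).
const size = 10_000_000;
const uniformHaystack = Array.from({ length: size }, (_, i) => i);
const mixedHaystack = [...uniformHaystack];
mixedHaystack[0] = "oops"; // demote the copy to PACKED_ELEMENTS

const needle = size - 1;

for (const [label, arr] of [["uniform", uniformHaystack], ["mixed", mixedHaystack]]) {
  const start = process.hrtime.bigint();
  arr.includes(needle);
  console.log(label, Number(process.hrtime.bigint() - start) / 1e6, "ms");
}
```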
Practical implications
Surprisingly, not as many as it seems! Access, even for deoptimized arrays, is very fast (like, nanoseconds fast), so it's hardly a real bottleneck, as long as you're not working with huge arrays (millions of elements).
There are certain niches where it can still be important, such as binary file handling, WebGL, the Web Audio API, or text encoding/decoding. All of them are relatively recent additions (relative to the age of JS).
It seems that the ECMAScript people are well aware of JS arrays' weirdness, since we've had TypedArrays for quite some time:
```js
const arr = new Uint8Array([1, 2, 3]);
```
Here, `Uint8Array` enforces a fixed 8-bit integer type, preventing the type-mixing that triggers V8's deoptimization. This guarantees consistent performance, as V8 doesn't need to navigate the elements kinds lattice. As an added bonus, TypedArrays allow working with smaller allocations, like the 8 bits in this example, compared to the default 64 bits of a JS Number. They also come in signed and unsigned flavors.
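A quick sketch of both points, element size and signedness:

```js
const unsigned = new Uint8Array([255]);
const signed = new Int8Array([255]);          // same bits, signed view: wraps to -1

console.log(unsigned[0], signed[0]);          // 255 -1
console.log(Uint8Array.BYTES_PER_ELEMENT);    // 1 (byte), vs 8 bytes for a Number
console.log(new Uint8Array(1000).byteLength); // 1000 bytes for a thousand elements
```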
TypedArrays behave pretty much like the textbook arrays from the beginning of this article: they have a fixed length and predictable Big O. Ironically, they are not considered "real" arrays by JS:
```js
const arr = new Uint8Array(10);
console.log(Array.isArray(arr)); // false
```
Consequently, they lack many of the regular JS arrays' convenience methods, such as `push`/`pop`/`shift`/`unshift`. For regular app-level business logic, TypedArrays are generally overkill, but they are useful in the niche areas mentioned above.
Conclusion
...what a mess. And this is just lookup by index, the simplest array operation. Brendan Eich said he created the first version of JS in 10 sleepless nights, and somehow I have no doubts about that.