A response on SO got me thinking: does JavaScript guarantee a particular endianness across OSs and browsers? Or, put another way, are bitwise shifts on integers "safe"?
Some of these answers are dated, because endianness can matter when you use typed arrays. Consider:
var arr32 = new Uint32Array(1);          // one 32-bit unsigned integer
var arr8 = new Uint8Array(arr32.buffer); // view the same 4 bytes one at a time
arr32[0] = 255;
console.log(arr8[0], arr8[1], arr8[2], arr8[3]);
When I run this in Chrome's console, it logs 255 0 0 0, indicating that my machine is little-endian. However, typed arrays use the platform's native byte order by default, so on a big-endian machine you would see 0 0 0 255 instead.