A response on SO got me thinking: does JavaScript guarantee a particular endianness across OSs and browsers?
Or, put another way, are bitwise shifts on integers "safe"?
ECMAScript does have a concept of an integer type, but a value is implicitly coerced to or from a double-precision floating-point representation as necessary (for instance, when the number is too large to represent exactly or when it has a fractional component).
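As a rough illustration (the variable names are just for the example), every one of these values is the same Number type; the "integer-ness" only shows up through coercion rules and precision limits:

    var a = 42;                    // holds an exact integer value, still a double
    var b = 42.5;                  // fractional component, same Number type
    var c = Math.pow(2, 53) + 1;   // too large to keep exact integer precision

    console.log(Number.isInteger(a));   // true
    console.log(Number.isInteger(b));   // false
    console.log(c === Math.pow(2, 53)); // true -- precision is lost beyond 2^53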
Many mainstream JavaScript interpreters (SpiderMonkey, for example) take an implementation shortcut and represent all numeric values as doubles, to avoid checking the actual native type of a value on every instruction. As a result of that shortcut, bit operations are implemented as a cast to an integral type followed by a cast back to a double representation. It is therefore not a good idea to use bit-level operations in JavaScript, and you won't get a performance boost anyway.
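To make that coercion concrete, here is a small sketch of what the spec's ToInt32/ToUint32 steps do to the operands of the bitwise operators. These particular results are defined on the numeric value itself, so they come out the same in every conforming engine, independent of the host machine's byte order:

    console.log(3.7 | 0);                    // 3  -- fractional part dropped by the int32 cast
    console.log((Math.pow(2, 32) + 5) | 0);  // 5  -- value is taken modulo 2^32
    console.log(1 << 31);                    // -2147483648 -- two's-complement sign bit
    console.log((1 << 31) >>> 0);            // 2147483648  -- >>> reinterprets it as unsigned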