What is the accepted way to send 64-bit values over JSON?

死守一世寂寞 2020-12-08 20:05

Some of my data are 64-bit integers. I would like to send these to a JavaScript program running on a page.

However, as far as I can tell, integers in most JavaScript implementations are stored as 64-bit floating-point values, which cannot represent every 64-bit integer exactly.

6 Answers
  • 2020-12-08 20:40

    This happened to me. All hell broke loose when sending large integers as JSON numbers into JSON.parse. I spent days trying to debug it. The problem was solved immediately once I transmitted the values as strings.

    Use { "the_sequence_number": "20200707105904535" } instead of { "the_sequence_number": 20200707105904535 }
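    A quick demonstration of the difference described above (the exact rounded value is engine-independent, since all engines follow IEEE 754):

```javascript
// The numeric form silently loses precision in JSON.parse,
// while the string form survives intact.
const asNumber = JSON.parse('{ "the_sequence_number": 20200707105904535 }');
const asString = JSON.parse('{ "the_sequence_number": "20200707105904535" }');

console.log(asNumber.the_sequence_number); // 20200707105904536 (off by one)
console.log(asString.the_sequence_number); // "20200707105904535" (exact)
```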

    To make it worse, it would seem that JSON.parse is implemented in some library shared between Firefox, Chrome and Opera, because they all behaved exactly the same. Opera's error messages even contain Chrome URL references in them, almost as if the browsers shared WebKit.

    console.log('event_listen[' + global_weird_counter + ']: to be sure, server responded with [' + aresponsetxt + ']');
    var response = JSON.parse(aresponsetxt);
    console.log('event_listen[' + global_weird_counter + ']: after json parse: ' + JSON.stringify(response));
    

    The behaviour I got was the sort of thing you see when pointer arithmetic goes horribly wrong. Ghosts were flying out of my workstation wreaking havoc in my sleep. They are all exorcised now that I have switched to strings.

  • 2020-12-08 20:48

    JavaScript's Number type (a 64-bit IEEE 754 double) only has 53 bits of integer precision.

    But if you don't need to do any addition or multiplication, you could keep a 64-bit value as a 4-character string, since JavaScript strings use UTF-16 (16 bits per code unit).

    For example, 1 could be encoded as "\u0000\u0000\u0000\u0001". This has the advantage that value comparison (==, >, <) works on such strings as expected for unsigned values. It is also straightforward to write bit operations:

    // Bitwise AND of two 64-bit values, each encoded as a 4-character
    // UTF-16 string (16 bits per character, most significant first).
    function and64(a, b) {
        var r = "";
        for (var i = 0; i < 4; i++)
            r += String.fromCharCode(a.charCodeAt(i) & b.charCodeAt(i));
        return r;
    }
    
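    A sketch of the matching encoder, assuming a modern engine with BigInt support (the name toU64String is made up for this example):

```javascript
// Hypothetical helper: encode a 64-bit value (given as a BigInt) into
// the 4-character UTF-16 string representation described above,
// most significant 16 bits first.
function toU64String(v) {
    let s = "";
    for (let i = 3; i >= 0; i--)
        s += String.fromCharCode(Number((v >> BigInt(i * 16)) & 0xffffn));
    return s;
}

console.log(toU64String(1n) === "\u0000\u0000\u0000\u0001"); // true
console.log(toU64String(2n) < toU64String(10n));             // true: string order matches numeric order
```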
  • 2020-12-08 20:53

    There is in fact a limitation at the JavaScript/ECMAScript level: integer precision is limited to 53 bits (integers are stored in the mantissa of a "double-like" 8-byte memory buffer). So big numbers transmitted as JSON won't be deserialized as expected by the JavaScript client, which truncates them to its 53-bit resolution.

    > parseInt("10765432100123456789")
    10765432100123458000
    

    See the Number.MAX_SAFE_INTEGER constant and Number.isSafeInteger() function:

    The MAX_SAFE_INTEGER constant has a value of 9007199254740991. The reasoning behind that number is that JavaScript uses double-precision floating-point format numbers as specified in IEEE 754 and can only safely represent numbers between -(2^53 - 1) and 2^53 - 1.

    Safe in this context refers to the ability to represent integers exactly and to correctly compare them. For example, Number.MAX_SAFE_INTEGER + 1 === Number.MAX_SAFE_INTEGER + 2 will evaluate to true, which is mathematically incorrect. See Number.isSafeInteger() for more information.
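    The quoted behaviour is easy to check in any console:

```javascript
// Verifying the quoted safe-integer limits.
console.log(Number.MAX_SAFE_INTEGER);                                     // 9007199254740991
console.log(Number.MAX_SAFE_INTEGER + 1 === Number.MAX_SAFE_INTEGER + 2); // true
console.log(Number.isSafeInteger(2 ** 53 - 1));                           // true
console.log(Number.isSafeInteger(2 ** 53));                               // false
```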

    Due to the resolution of floats in JavaScript, using "64-bit floating point numbers" as you proposed would suffer from the very same restriction.

    IMHO the best option is to transmit such values as text. It would still be perfectly readable JSON content, and would be easy to work with at the JavaScript level.

    A "pure string" representation is what OData specifies, for its Edm.Int64 or Edm.Decimal types.

    What the Twitter API does in this case is to add a specific ".._str" field in the JSON, like so:

    {
       "id": 10765432100123456789,        // for JSON-compliant clients
       "id_str": "10765432100123456789",  // for JavaScript
       ...
    }
    

    I like this option very much, since it would be still compatible with int64 capable clients. In practice, such duplicated content in the JSON won't hurt much, if it is deflated/gzipped at HTTP level.

    Once transmitted as string, you may use libraries like strint – a JavaScript library for string-encoded integers to handle such values.
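    On engines with BigInt support (ES2020+, so not something this answer could originally assume), the string field can also be converted to an exact native value; a minimal sketch:

```javascript
// Consuming a Twitter-style payload that duplicates the id as a
// string. Assumes BigInt support (ES2020+).
const payload = '{ "id": 10765432100123456789, "id_str": "10765432100123456789" }';
const obj = JSON.parse(payload);

console.log(obj.id);             // 10765432100123458000 -- precision already lost
console.log(BigInt(obj.id_str)); // 10765432100123456789n -- exact
```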

  • 2020-12-08 20:56

    The JS number representation is a standard IEEE 754 double, so you can't represent every 64-bit integer. You get 53 bits of actual integer precision in a double, but all JS bitwise ops reduce to 32-bit precision (that's what the spec requires. yay!), so if you really need a 64-bit int in JS you'll need to implement your own 64-bit int logic library.
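    The 32-bit truncation of the bitwise operators is easy to demonstrate (a quick illustration, not from the answer itself):

```javascript
// Bitwise operators coerce their operands to 32-bit integers (ToInt32),
// silently discarding the high bits of larger values.
console.log(2 ** 40);           // 1099511627776
console.log((2 ** 40) | 0);     // 0 -- high bits discarded
console.log((2 ** 33 + 5) | 0); // 5 -- only the low 32 bits survive
```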

  • 2020-12-08 20:59

    JSON itself doesn't care about implementation limits. Your problem is that JS can't handle your data, not the protocol. In other words, your JS client code has to use one of those imperfect options.

  • 2020-12-08 21:03

    This seems to be less a problem with JSON and more a problem with JavaScript itself. What are you planning to do with these numbers? If it's just a magic token that you need to pass back to the website later, by all means simply use a string containing the value. If you actually have to do arithmetic on the value, you could write your own JavaScript routines for 64-bit arithmetic.

    One way you could represent such values in JavaScript (and hence JSON) would be to split the number into two 32-bit values, e.g.

      [ 12345678, 12345678 ]
    

    To split a 64-bit value into two 32-bit values (server-side, in C here), do something like this:

      output_values[0] = (input_value >> 32) & 0xffffffff;
      output_values[1] = input_value & 0xffffffff;
    

    Then to recombine the two 32-bit values into a 64-bit value:

      input_value = (((int64_t) output_values[0]) << 32) | output_values[1];
    
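    On the JavaScript side, plain Number arithmetic cannot recombine the halves without losing precision, but with BigInt (a later addition to the language, so an assumption here) the round trip is exact. The helper names below are made up for this sketch:

```javascript
// Hypothetical helpers for the [hi, lo] pair scheme, using BigInt so
// that no precision is lost.
function combine64(hi, lo) {
    // ">>> 0" normalizes a possibly-signed 32-bit half to unsigned.
    return (BigInt(hi >>> 0) << 32n) | BigInt(lo >>> 0);
}

function split64(v) {
    return [Number(v >> 32n), Number(v & 0xffffffffn)];
}

const v = combine64(12345678, 12345678);
console.log(split64(v)); // [ 12345678, 12345678 ]
```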