JavaScript - How To Detect Number As A Decimal (Including 1.0)

梦毁少年i  2021-02-10 15:41

It feels like I am missing something obvious here. This has been asked a number of times - and the answer usually boils down to:

var num = 4.5;
num % 1 === 0; // false, so 4.5 is detected as having a decimal part
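
However, presumably the sticking point (given the title) is that 1.0 passes exactly the same test as 1, so the trailing .0 can't be detected this way:

var num2 = 1.0;
num2 % 1 === 0; // true, indistinguishable from the integer 1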


        
5 Answers
  •  滥情空心
    2021-02-10 16:14

    Those numbers aren't actually decimals or integers. They're all floats. The only real difference between 1 and 1.0 is the notation used to write two floats of equal value.


    Edit: to help illustrate, consider:

    1 === 1.0; // true
    parseInt('1') == parseInt('1.0'); // true
    parseFloat('1') === parseFloat('1.0'); // true
    parseInt('1') === parseFloat('1'); // true
    // etc...
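
    Another quick way to see that the trailing .0 is gone as soon as the source text is parsed:

    (1.0).toString(); // '1', the notation is not preserved
    (1).toString() === (1.0).toString(); // true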
    

    Also, to demonstrate that they are really the same underlying data type:

    typeof(1); // 'number'
    typeof(1.0); // 'number'
    

    Also, note that 'number' isn't ambiguous in JavaScript the way it can be in other languages, because JavaScript numbers are always floats.
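
    For the same reason, even the language's own integer check treats 1.0 as an integer; a quick illustration:

    Number.isInteger(1);   // true
    Number.isInteger(1.0); // true, it is the same float as 1
    Number.isInteger(4.5); // false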


    Edit 2: One more addition, since it's relevant. To the best of my knowledge, the only context in JavaScript in which you have "real and true" integers that aren't represented as floats is bitwise operations. Even there, the interpreter converts each operand to a 32-bit integer, performs the operation, and converts the result back to an ordinary (float) number before returning it. Not totally pertinent to this question, but it helps to have a good understanding of how JavaScript handles Numbers in general.
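
    A quick sketch of that conversion in action:

    4.7 | 0; // 4, the operand is truncated to a 32-bit integer
    (4.7 | 0) === 4.0; // true, the result comes back as an ordinary number
    Math.pow(2, 31) | 0; // -2147483648, values wrap at 32 bits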
