It feels like I am missing something obvious here. This has been asked a number of times - and the answer usually boils down to:
var num = 4.5;
num % 1 === 0; // false here, since 4.5 has a fractional part
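For example, here is that idiom wrapped in a small helper (the name isWholeNumber is just illustrative, not a built-in):

// A minimal sketch of the usual check: a finite number is "whole"
// if dividing it by 1 leaves no remainder.
function isWholeNumber(num) {
  return typeof num === 'number' && isFinite(num) && num % 1 === 0;
}
isWholeNumber(4);   // true
isWholeNumber(4.5); // false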
Those numbers aren't actually decimals or integers. They're all floats. The only real difference between 1 and 1.0 is the notation that was used to create floats of equal value.
1 === 1.0; // true
parseInt('1') === parseInt('1.0'); // true
parseFloat('1') === parseFloat('1.0'); // true
parseInt('1') === parseFloat('1'); // true
// etc...
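You can also see it the other way around: once a value exists, the notation it was written with is gone.

(1.0).toString(); // '1', the trailing .0 is not stored anywhere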
Also, to demonstrate that they are really the same underlying data type:
typeof(1); // 'number'
typeof(1.0); // 'number'
Also, note that 'number' isn't ambiguous in JavaScript the way it can be in other languages, because JavaScript numbers are always floats.
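If you want to convince yourself that even integer-looking values are floats, ordinary floating-point behavior shows up as soon as you poke at them (nothing below is special API, just standard Number semantics):

0.1 + 0.2 === 0.3;     // false, classic floating-point rounding
Number.isInteger(1.0); // true, 1.0 holds an integer value but is still stored as a float
9007199254740992 + 1;  // 9007199254740992, precision runs out past 2^53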