It feels like I am missing something obvious here. This has been asked a number of times - and the answer usually boils down to:
var num = 4.5;
num % 1 === 0; // false, because 4.5 is not a whole number
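For what it's worth, that check only tells you whether a value is a whole number; it gives the same result for 1 and 1.0, so it can't separate them:

var a = 1;
var b = 1.0;
a % 1 === 0; // true
b % 1 === 0; // true - same answer for both, so the check doesn't help here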
var num = 1; and var num = 1.0; are the same. You mention that you want to treat them differently when they come from a user. You will need to detect the difference while the value is still a string and then convert it to the appropriate number.
You'll have to do it when parsing the string. If there's a decimal point in the string, treat it as a percentage; otherwise, it's just the integer value.
So, e.g.:
rgb 1 1 1 // same as #010101
rgb 1.0 1 1 // same as #ff0101
Since the rgb prefix is there, you're parsing the string anyway; just look for a . in there as you're doing it.
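A sketch of that approach, assuming the input really does look like "rgb R G B" (the function name and the hex output are illustrative):

// Split the string, then apply the decimal-point test to each component
// while it is still text.
function parseRgb(str) {
  var channels = str.trim().split(/\s+/).slice(1).map(function (part) {
    return part.indexOf('.') !== -1
      ? Math.round(parseFloat(part) * 255) // "1.0" -> 255
      : parseInt(part, 10);                // "1"   -> 1
  });
  return '#' + channels.map(function (c) {
    return ('0' + c.toString(16)).slice(-2);
  }).join('');
}

parseRgb('rgb 1 1 1');   // "#010101"
parseRgb('rgb 1.0 1 1'); // "#ff0101"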
Those numbers aren't actually decimals or integers. They're all floats. The only real difference between 1 and 1.0 is the notation that was used to create floats of equal value.
1 === 1.0; // true
parseInt('1') == parseInt('1.0'); // true
parseFloat('1') === parseFloat('1.0'); // true
parseInt('1') === parseFloat('1'); // true
// etc...
Also, to demonstrate that they are really the same underlying data type:
typeof(1); // 'number'
typeof(1.0); // 'number'
Also, note that 'number' isn't ambiguous in JavaScript the way it would be in other languages, because all numbers are floats.
Well, as far as the JavaScript engine is concerned, there is no difference between 1.0 and 1, and because there is no difference, it is impossible to tell them apart once they have been parsed. You should change the convention from 1.0 to 100 for the percentage thing. That might fix it.
Let your script parse the input as a string; then it is just a matter of checking whether it contains a decimal point, like this:
mystring.indexOf('.');
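indexOf returns -1 when the character isn't found, so the check would be along these lines:

var mystring = '1.0';
if (mystring.indexOf('.') !== -1) {
  // a decimal point was typed: treat the value as a percentage/fraction
} else {
  // no decimal point: treat the value as a plain integer
}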