Question
I have:
var a = 0.0532;
var b = a * 100;
b should be returning 5.32 but instead it's returning 5.319999999999999. How do I fix this?
JSFiddle here: http://jsfiddle.net/9f2K8/
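For reference, a quick way to reproduce this in any JavaScript console (just a sketch, in case the fiddle link goes stale):
var a = 0.0532;
var b = a * 100;
console.log(b);          // 5.319999999999999 (0.0532 has no exact binary representation)
console.log(b === 5.32); // false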
Answer 1:
You should use .toFixed():
var a = 0.0532;
var b = a * 100;
b.toFixed(2); // specify the number of decimal places to display
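Keep in mind that toFixed() returns a string, not a number. A minimal sketch (the variable names are just for illustration) of converting it back when you still need to do arithmetic:
var a = 0.0532;
var b = a * 100;
var display = b.toFixed(2);         // "5.32" (a string, suitable for display)
var asNumber = parseFloat(display); // 5.32 as a number again, if you need to keep calculating
console.log(display, asNumber);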
Answer 2:
This is not an error.
Very simply put, JavaScript is trying to represent 5.32 with as much precision as possible. Since computers don't have infinite precision for binary floating-point numbers, it picks the closest value it can represent: 5.319999999999999.
An error margin of 0.0000000000001 should be more than acceptable, no?
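If all you need is a comparison, a common workaround (sketched here as an addition, not part of the original answer) is to test within a small tolerance instead of expecting exact equality:
var b = 0.0532 * 100;                        // 5.319999999999999
var closeEnough = Math.abs(b - 5.32) < 1e-9; // compare within a small tolerance
console.log(closeEnough);                    // true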
You can always round the number to fewer decimal digits, or use toFixed() to create a rounded string:
var number = 123.4567;
number.toFixed(2); // '123.46'
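If you need the result as a number rather than a string, one alternative (an assumption added here, not from the original answer) is to round with Math.round:
var number = 123.4567;
var rounded = Math.round(number * 100) / 100; // 123.46, still a number rather than a string
console.log(rounded);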
Answer 3:
You can use toFixed()
to round it back:
b = b.toFixed(2); // note: toFixed() returns a string
Source: https://stackoverflow.com/questions/21472828/js-multiplying-by-100-giving-wierd-result