> My function is converting a string to Decimal:
>
> ```swift
> func getDecimalFromString(_ strValue: String) -> NSDecimalNumber {
>     let formatter = NumberFormatter()
>     // ...
> }
> ```
That seems to be a bug in `NumberFormatter`: even with `generatesDecimalNumbers` set to `true`, the formatter produces a binary floating-point number internally, and that cannot represent decimal fractions like 8.2 precisely.

Note also that, contrary to the documentation, the `maximumFractionDigits` property has no effect when parsing a string into a number.
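A minimal sketch demonstrating the problem (the fixed `en_US_POSIX` locale is my addition to make the decimal separator deterministic; the exact printed digits may vary):

```swift
import Foundation

let formatter = NumberFormatter()
formatter.generatesDecimalNumbers = true
formatter.locale = Locale(identifier: "en_US_POSIX") // "." as decimal separator

if let number = formatter.number(from: "8.2") as? NSDecimalNumber {
    // The NSDecimalNumber is built from an intermediate binary Double,
    // so it is not exactly 8.2 (typically something like 8.200000000000001).
    print(number)
}
```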
There is a simple solution: use

```swift
NSDecimalNumber(string: strValue)                          // or
NSDecimalNumber(string: strValue, locale: Locale.current)
```

instead, depending on whether the string is localized or not.
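For example, the question's function could be rewritten along these lines (a sketch; the zero fallback for unparseable input is an assumption, since the original function body isn't shown in full):

```swift
func getDecimalFromString(_ strValue: String) -> NSDecimalNumber {
    // Parses the string as a decimal value directly, avoiding the
    // binary floating-point round trip that NumberFormatter makes.
    let decimal = NSDecimalNumber(string: strValue, locale: Locale.current)
    // NSDecimalNumber(string:) yields NaN for unparseable input;
    // falling back to zero here is an assumption about the desired behavior.
    return decimal.decimalValue.isNaN ? NSDecimalNumber.zero : decimal
}
```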
Or, with the Swift 3 `Decimal` type:

```swift
Decimal(string: strValue)                   // or
Decimal(string: strValue, locale: .current)
```
Example:

```swift
if let d = Decimal(string: "8.2") {
    print(d) // 8.2
}
```
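The locale parameter works the same way for localized input; a small sketch (the German locale identifier is just an illustrative choice):

```swift
// "8,2" uses a comma as the decimal separator, as in German-formatted input.
if let d = Decimal(string: "8,2", locale: Locale(identifier: "de_DE")) {
    print(d) // 8.2
}
```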