Why are doubles printed differently in dictionaries?

南笙 2020-12-03 11:32
let dic: [Double: Double] = [1.1: 2.3, 2.3: 1.1, 1.2: 2.3]

print(dic) // [2.2999999999999998: 1.1000000000000001, 1.2: 2.2999999999999998, 1.1000000000000001: 2…

2 Answers
  • 2020-12-03 11:43

    As already mentioned in the comments, a Double cannot store the value 1.1 exactly. Swift uses (like many other languages) binary floating-point numbers according to the IEEE 754 standard.

    The closest number to 1.1 that can be represented as a Double is

    1.100000000000000088817841970012523233890533447265625
    

    and the closest number to 2.3 that can be represented as a Double is

    2.29999999999999982236431605997495353221893310546875
    
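    This can be checked directly in Swift: the long decimal literal below is the exact expansion of the Double nearest to 1.1, so it parses to the identical bit pattern (a small sketch; the format width of 51 simply covers all stored fractional digits):

    ```swift
    import Foundation

    // The literal is the exact value of the Double nearest to 1.1,
    // so both sides round to the same bit pattern:
    print(1.1 == 1.100000000000000088817841970012523233890533447265625)  // true

    // printf-style formatting with enough fractional digits makes
    // the stored value visible:
    print(String(format: "%.51f", 1.1))
    // 1.100000000000000088817841970012523233890533447265625
    ```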

    Printing that number converts it back to a decimal string, and that conversion is done with different precision depending on how you print the number.

    From the source code at HashedCollections.swift.gyb one can see that the description method of Dictionary uses debugPrint() for both keys and values, and debugPrint(x) prints the value of x.debugDescription (if x conforms to CustomDebugStringConvertible).

    On the other hand, print(x) calls x.description if x conforms to CustomStringConvertible.

    So what you see is the different output of description and debugDescription of Double:

    print(1.1.description) // 1.1
    print(1.1.debugDescription) // 1.1000000000000001
    

    From the Swift source code one can see that both use the swift_floatingPointToString() function in Stubs.cpp, with the Debug parameter set to false and true, respectively. This parameter controls the precision of the number to string conversion:

    int Precision = std::numeric_limits<T>::digits10;
    if (Debug) {
      Precision = std::numeric_limits<T>::max_digits10;
    }
    

    For the meaning of those constants, see std::numeric_limits:

    • digits10 – number of decimal digits that can be represented without change,
    • max_digits10 – number of decimal digits necessary to differentiate all values of this type.

    So description creates a string with less decimal digits. That string can be converted to a Double and back to a string giving the same result. debugDescription creates a string with more decimal digits, so that any two different floating point values will produce a different output.
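    The effect of those two precisions can be reproduced with printf-style formatting, since digits10 is 15 and max_digits10 is 17 for Double (a sketch of the idea, not the actual Stubs.cpp code path):

    ```swift
    import Foundation

    let x = 1.1
    print(String(format: "%.15g", x))  // 1.1                — digits10
    print(String(format: "%.17g", x))  // 1.1000000000000001 — max_digits10

    // 15 significant digits survive a round trip unchanged,
    // but cannot distinguish x from its immediate neighbor:
    print(Double(String(format: "%.15g", x))! == x)  // true
    print(String(format: "%.15g", x.nextUp))         // 1.1 — same string, different Double!
    print(String(format: "%.17g", x.nextUp))         // 1.1000000000000003
    ```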

  • 2020-12-03 11:52

    Yes, Swift stores the numbers as binary floating-point values in the dictionary.

    Use the dictionary as [Double: Any]; if your number is 32-bit, use Float and then upcast to AnyObject.

    See the example below:

        let strDecimalNumber = "8.37"
        var myDictionary: [String: Any] = [:]
        myDictionary["key1"] = Float(strDecimalNumber)! as AnyObject   // 8.369999999999999
        myDictionary["key2"] = Double(strDecimalNumber)! as AnyObject  // 8.369999999999999
        myDictionary["key3"] = Double(8.37) as AnyObject               // 8.369999999999999
        myDictionary["key4"] = Float(8.37) as AnyObject                // 8.37
        myDictionary["key5"] = 8.37                                    // 8.3699999999999992
        myDictionary["key6"] = strDecimalNumber                        // "8.37", a String
        myDictionary["key7"] = strDecimalNumber.description            // "8.37", a String
        myDictionary["key8"] = Float(10000000.01)                      // 10000000.0
        myDictionary["key9"] = Float(100000000.01)                     // 100000000.0
        myDictionary["key10"] = Float(1000000000.01)                   // 1e+09
        myDictionary["key11"] = Double(1000000000.01)                  // 1000000000.01
        print(myDictionary)
    

    myDictionary will be printed as

    ["key1": 8.37, "key2": 8.369999999999999, "key3": 8.369999999999999, "key4": 8.37, "key5": 8.3699999999999992, "key6": "8.37", "key7": "8.37", "key8": 10000000.0, "key9": 100000000.0, "key10": 1e+09, "key11": 1000000000.01]

    As Martin R noted in the answer above, using .description stores a String, not an actual floating-point value.
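
    If the goal is to keep decimal values exact (e.g. for currency), Foundation's Decimal type sidesteps binary rounding entirely. A minimal sketch (not part of the original answer):

    ```swift
    import Foundation

    // Decimal stores the number in base 10, so "8.37" is represented exactly:
    let price = Decimal(string: "8.37")!
    var myDecimalDictionary: [String: Decimal] = ["key1": price]
    print(myDecimalDictionary)  // ["key1": 8.37]
    print(price * 100)          // 837
    ```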
