Why does integer division in C# return an integer and not a float?

Asked 2020-11-21 04:32 by 隐瞒了意图╮

Does anyone know why integer division in C# returns an integer and not a float? What is the idea behind it? (Is it only a legacy of C/C++?)

In C#:

    float x = 13 / 4;
    // x is 3.0, not 3.25: 13 / 4 is evaluated as integer division first,
    // and the int result 3 is then converted to float.


        
8 Answers

    长发绾君心 · 2020-11-21 05:21

    The result will always be of the type that has the greater range of the numerator and the denominator. The exceptions are byte and short, which produce int (Int32).

    var a = (byte)5 / (byte)2;  // 2 (Int32)
    var b = (short)5 / (byte)2; // 2 (Int32)
    var c = 5 / 2;              // 2 (Int32)
    var d = 5 / 2U;             // 2 (UInt32)
    var e = 5L / 2U;            // 2 (Int64)
    var f = 5L / 2UL;           // 2 (UInt64)
    var g = 5F / 2UL;           // 2.5 (Single/float)
    var h = 5F / 2D;            // 2.5 (Double)
    var i = 5.0 / 2F;           // 2.5 (Double)
    var j = 5M / 2;             // 2.5 (Decimal)
    var k = 5M / 2F;            // Not allowed
    

    There is no implicit conversion between floating-point types and the decimal type, so division between them is not allowed. You have to explicitly cast and decide which one you want (Decimal has more precision and a smaller range compared to floating-point types).
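    As a small illustrative sketch (the variable names here are made up, not from the original answer), the cast you choose when mixing decimal and float decides which type the division is performed in:

    // Mixing decimal and float requires an explicit cast; the cast picks the arithmetic type.
    decimal viaDecimal = 5M / (decimal)2F;   // 2.5M — division performed in decimal
    float   viaFloat   = (float)5M / 2F;     // 2.5f — division performed in float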
