Why does integer division in C# return an integer and not a float?

隐瞒了意图╮ 2020-11-21 04:32

Does anyone know why integer division in C# returns an integer and not a float? What is the idea behind it? (Is it only a legacy of C/C++?)

In C#:

float x = 13 / 4;
// x is 3.0, not 3.25: the division is carried out in integer arithmetic before the assignment

8 Answers
  • 2020-11-21 05:24

    As a little trick to know what you are obtaining, you can use var; the compiler will then tell you the type to expect:

    int a = 1;
    int b = 2;
    var result = a/b;
    

    Your compiler will tell you that result is of type int here.
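
    If you want a fractional result with this pattern, a minimal variation (reusing a and b from above) is to cast one operand; the inferred type changes accordingly:

    var result2 = a / (double)b;   // result2 is inferred as double, value 0.5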

  • 2020-11-21 05:27

    Each data type is capable of overloading each operator. If both the numerator and the denominator are integers, the integer type performs the division and returns an integer. If you want floating point division, you must cast one or both of the numbers to a floating point type before dividing them. For instance:

    int x = 13;
    int y = 4;
    float result = (float)x / (float)y;   // 3.25
    

    or, if you are using literals:

    float x = 13f / 4f;
    

    Keep in mind that floating point types are not exact. If you care about precision, use something like the decimal type instead.
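
    For example, a quick sketch contrasting the two (values picked only for illustration):

    float f = 1f / 3f;       // binary float: roughly 0.33333334
    decimal d = 1m / 3m;     // decimal keeps 28-29 significant digits: 0.3333333333333333333333333333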
