Why does integer division in C# return an integer and not a float?

隐瞒了意图╮ 2020-11-21 04:32

Does anyone know why integer division in C# returns an integer and not a float? What is the idea behind it? (Is it only a legacy of C/C++?)

In C#:

float x = 13 / 4;
// x == 3.0, not 3.25: the right-hand side is evaluated with integer division

8 answers
  •  太阳男子
    2020-11-21 05:10

    While it is common for new programmers to make the mistake of performing integer division when they actually meant floating-point division, in actual practice integer division is a very common operation. If you assume that people rarely use it, and that every time you do division you will always need to remember to cast to floating point, you are mistaken.

    First off, integer division is quite a bit faster, so if you only need a whole-number result, you would want to use the more efficient operation.
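
    For instance (a minimal sketch of my own, not part of the original answer; the variable names are just for illustration), paging through a list only ever needs whole numbers:

        // Integer division where only a whole-number result makes sense.
        int totalItems = 130;
        int pageSize = 25;
        int fullPages = totalItems / pageSize;   // 5 (truncated, not 5.2)
        int leftover  = totalItems % pageSize;   // 5 items on a final partial page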

    Secondly, there are a number of algorithms that use integer division, and if the result of division were always a floating-point number you would be forced to round the result every time. One example off the top of my head is changing the base of a number: calculating each digit involves the integer division of a number along with the remainder, rather than floating-point division.
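
    Here is a rough sketch of that base-change idea (my own illustration, not from the original answer; it handles non-negative values and bases up to 16):

        // Convert a non-negative integer to a string of digits in the given base,
        // peeling off one digit at a time with % (remainder) and / (integer division).
        static string ToBase(int value, int radix)
        {
            const string digits = "0123456789ABCDEF";
            if (value == 0) return "0";
            var sb = new System.Text.StringBuilder();
            while (value > 0)
            {
                sb.Insert(0, digits[value % radix]); // remainder is the current digit
                value /= radix;                      // integer division moves to the next digit
            }
            return sb.ToString();
        }

        // ToBase(255, 16) == "FF", ToBase(5, 2) == "101"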

    Because of these (and other related) reasons, integer division results in an integer. If you want the floating-point quotient of two integers, you just need to remember to cast one of them to a double, float, or decimal.
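
    Concretely (a small sketch of that cast):

        int a = 13, b = 4;
        double q = (double)a / b;   // 3.25: the cast promotes the whole division to double
        double r = a / b;           // 3.0:  the int division happens first, then converts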
