In C#, is there any difference in the accuracy of the two decimal rounding strategies MidpointRounding.ToEven and MidpointRounding.AwayFromZero? That is, do both produce an even distribution among the numbers being rounded to, or does one strategy over-represent certain rounded values compared to the other?
From MSDN:
By default, Math.Round uses MidpointRounding.ToEven. Most people are not familiar with "rounding to even"; the alternative, "rounding away from zero", is more commonly taught in school. .NET defaults to "rounding to even" because it is statistically superior: it doesn't share the tendency of "rounding away from zero" to round up slightly more often than it rounds down (assuming the numbers being rounded tend to be positive).
Depending on the data set, symmetric arithmetic rounding can introduce a major bias, since it always rounds midpoint values upward. To take a simple example, suppose that we want to determine the mean of three values, 1.5, 2.5, and 3.5, but that we want to first round them to the nearest integer before calculating their mean. Note that the true mean of these values is 2.5. Using symmetric arithmetic rounding, these values change to 2, 3, and 4, and their mean is 3. Using banker's rounding, these values change to 2, 2, and 4, and their mean is 2.67. Because the latter rounding method is much closer to the true mean of the three values, it provides the least loss of data.
http://msdn.microsoft.com/en-us/library/system.math.round.aspx
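You can reproduce the averaging example above yourself. Here is a minimal sketch (not from the original answer) that rounds 1.5, 2.5, and 3.5 with both MidpointRounding modes and compares the resulting means; the class and variable names are just illustrative:

```csharp
using System;
using System.Linq;

class RoundingBiasDemo
{
    static void Main()
    {
        double[] values = { 1.5, 2.5, 3.5 }; // halves are exactly representable as doubles

        // AwayFromZero pushes every midpoint up, inflating the mean.
        double awayMean = values
            .Select(v => Math.Round(v, MidpointRounding.AwayFromZero)) // 2, 3, 4
            .Average();                                                // 3

        // ToEven (banker's rounding) goes to the nearest even integer at midpoints.
        double evenMean = values
            .Select(v => Math.Round(v, MidpointRounding.ToEven))       // 2, 2, 4
            .Average();                                                // 2.666...

        Console.WriteLine($"True mean:         {values.Average()}"); // 2.5
        Console.WriteLine($"AwayFromZero mean: {awayMean}");         // 3
        Console.WriteLine($"ToEven mean:       {evenMean}");         // ~2.67
    }
}
```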
If your value is 123.45, then:
123.5 <-- MidpointRounding.AwayFromZero
123.4 <-- MidpointRounding.ToEven
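This can be checked with Math.Round; a small sketch follows, using decimal so that 123.45 is stored exactly (as a double the value is only approximate, so the midpoint rule may not even apply):

```csharp
using System;

class MidpointExample
{
    static void Main()
    {
        decimal value = 123.45m; // exact midpoint when rounding to one decimal place

        Console.WriteLine(Math.Round(value, 1, MidpointRounding.AwayFromZero)); // 123.5
        Console.WriteLine(Math.Round(value, 1, MidpointRounding.ToEven));       // 123.4 (4 is even)
    }
}
```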
Source: https://stackoverflow.com/questions/7360432/c-sharp-rounding-midpointrounding-toeven-vs-midpointrounding-awayfromzero