C# float[] average loses accuracy

遇见更好的自我 2021-01-21 08:01

I am trying to calculate the average of an array of floats. I need to use indices because this is inside a binary search, so the top and bottom will move. (Big picture we are trying

3 Answers
  •  囚心锁ツ
    2021-01-21 08:55

    I'm getting 2 places less accuracy than the c# Average()

    No, you are only losing 1 significant digit. The float type can only store about 7 significant decimal digits; anything beyond that is just noise. In a calculation like this you inevitably accumulate round-off error and thus lose precision. Getting the round-off errors to cancel out requires luck.

    The only way to avoid it is to accumulate the result in a floating point type with more precision. That is not a problem here: you have double available, which is why the Linq Average() method looks like this:

       public static float Average(this IEnumerable<float> source) {
           if (source == null) throw Error.ArgumentNull("source");
           double sum = 0;         // <=== NOTE: double
           long count = 0;
           checked {
               foreach (float v in source) {
                   sum += v;
                   count++;
               }
           }
           if (count > 0) return (float)(sum / count);
           throw Error.NoElements();
       }
    

    Use double to reproduce the Linq result with a comparable number of significant digits in the result.
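
    The effect is easy to reproduce. The sketch below (with made-up data, not the question's array) sums a million floats two ways: accumulating in float and accumulating in double, then compares the resulting averages. The true average is 0.1f, so the float accumulator's drift is visible directly.

    ```csharp
    using System;
    using System.Linq;

    class FloatAverageDemo
    {
        static void Main()
        {
            // Hypothetical data: one million copies of 0.1f.
            float[] data = Enumerable.Repeat(0.1f, 1_000_000).ToArray();

            // Accumulating in float: once the running sum grows large
            // relative to each addend, every += rounds away low-order bits.
            float floatSum = 0f;
            foreach (float v in data) floatSum += v;
            float floatAvg = floatSum / data.Length;

            // Accumulating in double, as Linq's Average() does: each float
            // converts exactly to double, so the extra precision absorbs
            // the round-off.
            double doubleSum = 0.0;
            foreach (float v in data) doubleSum += v;
            float doubleAvg = (float)(doubleSum / data.Length);

            Console.WriteLine($"float accumulator:  {floatAvg:R}");
            Console.WriteLine($"double accumulator: {doubleAvg:R}");
            Console.WriteLine($"Linq Average():     {data.Average():R}");
        }
    }
    ```

    The double-accumulator result matches Linq's Average() to the last digit; the float accumulator typically disagrees in the final one or two significant digits, which is exactly the loss described above.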
