I'm curious:
If you do a printf("%f", number); what is the precision of the output? I.e., how many decimal places will show up? Is this compiler dependent?
printf("%f", number);
The book C: A Reference Manual states that if no precision is specified, the default precision is 6 (i.e., six digits after the decimal point).
One caveat: if the value is an infinity (e.g., the result of 1.0/0.0) or a NaN, then C99 specifies that the output is [-]inf or [-]infinity for infinities, and nan (possibly followed by an implementation-defined character sequence) for NaNs; which style is used is implementation-defined.
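As a quick demonstration, here's a minimal C99 sketch (the value of number is arbitrary) showing the default precision, an explicit precision, and the infinity/NaN cases:

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double number = 3.14159265358979;

        /* With no precision specified, %f prints 6 digits after the decimal point */
        printf("%f\n", number);    /* 3.141593 */

        /* An explicit precision overrides the default */
        printf("%.2f\n", number);  /* 3.14 */

        /* Infinity and NaN (C99); exact spelling is implementation-defined */
        printf("%f\n", INFINITY);  /* inf or infinity */
        printf("%f\n", -INFINITY); /* -inf or -infinity */
        printf("%f\n", NAN);       /* nan, possibly with a suffix */

        return 0;
    }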