I have a small question. I have this piece of code:
#include <stdio.h>
int main(){
    printf("%d, %f, %d\n", 0.9, 10, 'C');
}
And I don't understand the output it prints.
The only correct expectation is getting 67 for printing a character using the %d format specifier*. The other two printouts are undefined behavior.
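For comparison, a matching and well-defined call looks like this (67 is the character code of 'C' on an ASCII system):

#include <stdio.h>
int main(){
    /* 'C' printed with %d gives its character code, 67 on an ASCII system */
    printf("%d\n", 'C');
}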
"it looks like the printf searches for the corresponding type in the expressions"

This is only a coincidence. printf has no idea of the types of the actual parameters that you pass. It trusts the format string, and interprets the data sequentially. You can tell what's going on by supplying different numbers, and observing how the output changes.
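A sketch of that experiment is below; keep in mind the calls are still undefined behavior, so the exact output depends on your compiler and platform, and most compilers will warn about the mismatches:

#include <stdio.h>
int main(){
    /* deliberately mismatched: the output is undefined and differs between platforms */
    printf("%d, %f, %d\n", 0.9, 10, 'C');
    printf("%d, %f, %d\n", 0.5, 20, 'C');   /* change the numbers and compare the output */
}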
The numbers that you see are garbage: a double re-interpreted as an int, and an int re-interpreted as a double. Moreover, if the sizes of double and int are different, the first two parameters cross each other's boundaries.
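If you want to see that kind of reinterpretation without relying on undefined behavior, here is a small sketch that uses memcpy to view the bytes of a double as two ints, assuming the common case of a 4-byte int and an 8-byte double:

#include <stdio.h>
#include <string.h>
int main(){
    double d = 0.9;
    int halves[2];   /* assumes sizeof(double) == 2 * sizeof(int), the common case */
    if (sizeof d == sizeof halves) {
        memcpy(halves, &d, sizeof d);
        /* these are the kind of "garbage" values a mismatched %d can show */
        printf("%d %d\n", halves[0], halves[1]);
    }
}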
To produce the output that you want, add a cast to the first two parameter expressions:
printf("%d, %f, %d\n", (int)0.9, (double)10, 'C');
Note that you do not need to cast the last parameter, because char is promoted to an int as part of processing the variable-length argument list of printf.
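For example, a char variable passed to printf is promoted to int by the default argument promotions, so %d is the right specifier for it:

#include <stdio.h>
int main(){
    char c = 'C';
    /* c is promoted to int when passed through printf's variable-length argument list */
    printf("%d\n", c);   /* prints 67 on an ASCII system */
}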
* This produces correct behavior only when there are no mismatches between the other parameters and format specifiers; your program has UB even for the last parameter, which would be correct if used by itself.