My questions stem from using printf to log things when building for platforms with different bit depths (32-bit and 64-bit, for example).
A problem that keeps rearing its ugly head is matching printf format specifiers to types whose sizes differ across those platforms.
bools/_Bools, chars, and shorts are first converted to int (if int can represent all values of the original type, otherwise to unsigned int) when passed to variadic functions like printf(). Similarly, floats are converted to double.
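For example (a minimal sketch with made-up variable names), a short, a char, and a float can all be printed with %d and %f, because each argument is promoted to int or double before printf() ever sees it:

#include <stdio.h>

int main(void)
{
    short s = 42;
    char  c = 'A';
    float f = 3.5f;

    /* s and c are promoted to int, f is promoted to double,
       so %d and %f are the matching specifiers here. */
    printf("short: %d, char: %d, float: %f\n", s, c, f);
    return 0;
}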
So, if you pass something smaller than int, printf() will grab the whole (unsigned) int without any problems (other than this: if the passed value is actually an unsigned int and you print it with %d instead of %u, you get undefined behavior).
Other types, AFAIR, do not undergo such conversions.
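To illustrate that one pitfall (a sketch with a made-up value), an unsigned int that doesn't fit in a signed int has to be printed with %u:

#include <stdio.h>

int main(void)
{
    unsigned int u = 4000000000u;  /* too big for a signed 32-bit int */

    printf("as unsigned: %u\n", u);      /* fine */
    /* printf("as signed: %d\n", u); */  /* mismatched specifier: undefined behavior */
    return 0;
}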
This line:
print (" my int: %ld\n", (long)myInt);
isn't buying you anything over this line:
printf(" my int: %d\n", myInt);
Both are valid and the result will be practically identical. The only difference is that the former might result in bigger code and longer execution time (if sizeof(long) > sizeof(int)).
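Putting the two side by side in a complete program (myInt is a hypothetical variable, as above) shows they print the same thing:

#include <stdio.h>

int main(void)
{
    int myInt = 12345;

    printf(" my int: %ld\n", (long)myInt);  /* cast to long, %ld */
    printf(" my int: %d\n", myInt);         /* plain int, %d */
    return 0;
}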