How does a C compiler interpret the "L" suffix, which denotes a long integer literal, in light of automatic conversion? The following code, when run on a 32-bit platform (32-bit long
It's a hexadecimal literal, so its type can be unsigned. It fits in `unsigned long`, so that's the type it gets. See section 6.4.4.1 of the standard:
> The type of an integer constant is the first of the corresponding list in which its value can be represented.
where the list for hexadecimal literals with a suffix `L` is

- `long`
- `unsigned long`
- `long long`
- `unsigned long long`
Since it doesn't fit in a 32-bit signed `long` but does fit in a 32-bit `unsigned long`, that's what it becomes.
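
Since the original snippet isn't shown, here is a minimal sketch assuming the classic value `0x80000000L` and a platform where `long` is 32 bits; it uses C11's `_Generic` to report the deduced type:

```c
#include <stdio.h>

/* Maps an expression to a string naming its type (C11 _Generic). */
#define TYPE_NAME(x) _Generic((x),            \
    long:               "long",               \
    unsigned long:      "unsigned long",      \
    long long:          "long long",          \
    unsigned long long: "unsigned long long", \
    default:            "other")

int main(void)
{
    /* Assuming 32-bit long: 0x80000000L exceeds LONG_MAX (2147483647),
       so the hex literal falls through to unsigned long. On a platform
       with 64-bit long it would simply be long. */
    printf("0x80000000L -> %s\n", TYPE_NAME(0x80000000L));

    /* 0x7FFFFFFFL fits in a signed 32-bit long, so it stays long. */
    printf("0x7FFFFFFFL -> %s\n", TYPE_NAME(0x7FFFFFFFL));

    return 0;
}
```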
The thing is that the rules for determining the type of an integer literal differ depending on whether it is a decimal number or a hexadecimal (or octal) number. A decimal literal is always signed unless suffixed with `U`. A hexadecimal or octal literal can also become unsigned when the signed type cannot hold its value.
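
A minimal sketch of this difference, again using a C11 `_Generic` type probe and assuming 32-bit `int` and `long`: the decimal literal `2147483648` skips the unsigned types and becomes `long long`, while the equal-valued hexadecimal literal `0x80000000` becomes `unsigned int`:

```c
#include <stdio.h>

#define TYPE_NAME(x) _Generic((x),            \
    int:                "int",                \
    unsigned int:       "unsigned int",       \
    long:               "long",               \
    unsigned long:      "unsigned long",      \
    long long:          "long long",          \
    unsigned long long: "unsigned long long", \
    default:            "other")

int main(void)
{
    /* Decimal literals without a U suffix only try the signed types:
       int -> long -> long long. Assuming 32-bit int and long,
       2147483648 fits in neither, so it becomes long long. */
    printf("2147483648 -> %s\n", TYPE_NAME(2147483648));

    /* Hex literals also try the unsigned type at each size:
       int -> unsigned int -> long -> ... 0x80000000 already fits
       in a 32-bit unsigned int, so that's its type. */
    printf("0x80000000 -> %s\n", TYPE_NAME(0x80000000));

    return 0;
}
```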