I tried

    printf("%d, %d\n", sizeof(char), sizeof('c'));

and got 1, 4 as output. If the size of a character is one, why does 'c' give me 4?
In C, 'a' is an integer constant (!?!), so 4 is correct for your architecture. It is implicitly converted to char when you assign it to a char variable. sizeof(char) is always 1 by definition; the unit is the byte, though how many bits a byte holds (CHAR_BIT) is implementation-defined, with a minimum of 8.
The C standard says that a character literal like 'a' has type int, not type char. It therefore has (on your platform) a sizeof of 4. See this question for a fuller discussion.
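A minimal sketch of the difference, assuming (as the question's output suggests) a platform where int is 4 bytes:

    #include <stdio.h>

    int main(void)
    {
        char c = 'a';   /* the int constant 'a' is implicitly converted to char */

        /* sizeof yields a size_t, so %zu is the matching specifier (C99 and later) */
        printf("%zu\n", sizeof(char)); /* always 1 */
        printf("%zu\n", sizeof 'a');   /* sizeof(int), 4 on this platform */
        printf("%zu\n", sizeof c);     /* 1, because c has type char */

        return 0;
    }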
It is the normal behavior of the sizeof operator (see Wikipedia):

- Applied to a type, sizeof returns the size of the datatype. For char, you get 1.
- Applied to an expression, sizeof returns the size of the type of the variable or expression. As a character literal is typed as int, you get 4.
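A small sketch of the two forms, assuming a typical platform with a 2-byte short and a 4-byte int:

    #include <stdio.h>

    int main(void)
    {
        short s = 1;

        printf("%zu\n", sizeof(short)); /* size of the type itself, e.g. 2 */
        printf("%zu\n", sizeof s);      /* size of the expression's type, also 2 */
        printf("%zu\n", sizeof(s + 1)); /* s is promoted to int, so e.g. 4 */
        printf("%zu\n", sizeof 'c');    /* a character literal is an int, so e.g. 4 */

        return 0;
    }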
This is covered in ISO C11 6.4.4.4 Character constants, though it's largely unchanged from earlier standards. That states, in paragraph 10:
An integer character constant has type int. The value of an integer character constant containing a single character that maps to a single-byte execution character is the numerical value of the representation of the mapped character interpreted as an integer.
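For example, on an ASCII-based execution character set (an assumption; EBCDIC systems map characters differently), the constant 'a' is simply an int holding that character's code:

    #include <stdio.h>

    int main(void)
    {
        /* 'a' is an int whose value is the code of the character 'a'
           in the execution character set -- 97 on ASCII systems */
        printf("%d\n", 'a');        /* prints 97 with ASCII */
        printf("%d\n", 'a' == 97);  /* prints 1 (true) with ASCII */

        return 0;
    }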
According to the ANSI C standard, a char is promoted to an int in contexts where integers are used; here the character literal already has type int, and you used an integer format specifier in the printf, hence the different values. sizeof(char) is always 1 byte, but how many bits that byte holds (CHAR_BIT) is implementation-defined by the compiler and platform, with a minimum of 8.
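A short sketch of that promotion in a variadic call, together with %zu, the specifier that actually matches the size_t result of sizeof (the 4 again assumes a 4-byte int):

    #include <stdio.h>

    int main(void)
    {
        char c = 'c';

        /* In a variadic call, c undergoes the default argument promotions
           and is passed as an int, so %d prints its character code */
        printf("%d\n", c);

        /* sizeof yields size_t; %zu is the matching specifier (C99 and later) */
        printf("%zu, %zu\n", sizeof(char), sizeof 'c');  /* 1, 4 */

        return 0;
    }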