Simply put, in C and its variants (unlike that wuss Java with its virtual machine) the size of the primitive types can vary greatly between targets, and there is really no guarantee about their exact sizes.
From my own experience working on embedded microcontrollers with exotic 'C' compilers, I have seen:
sizeof( uint8 )  return 1
sizeof( uint16 ) return 1
sizeof( uint32 ) return 2
Clearly, I was dealing with a machine where the smallest addressable entity was 16 bits, so those sizeof results do not comply with C89 or C99.
I would say that on mainstream C89- and C99-compliant systems, the accepted answer is correct. Unfortunately, a "C" compiler can still be called a "C" compiler even if it does not comply with a 25-year-old standard. I hope this answer helps put things in perspective for more exotic systems.
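If you are unsure what a given toolchain actually does, a quick probe like the following can tell you. This is only a minimal sketch assuming a hosted compiler with <stdio.h> and <limits.h>; the casts to unsigned long keep the printf formats usable even under C89, where %zu does not exist.

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_BIT is the number of bits in one "byte" as the C standard defines it. */
    printf("CHAR_BIT     = %d\n", CHAR_BIT);
    printf("sizeof(char) = %lu\n", (unsigned long)sizeof(char)); /* always 1 */
    printf("sizeof(int)  = %lu\n", (unsigned long)sizeof(int));
    printf("sizeof(long) = %lu\n", (unsigned long)sizeof(long));
    return 0;
}

On a DSP-style target where the smallest addressable unit is 16 bits, a conforming compiler can legitimately report CHAR_BIT as 16 and sizeof(int) as 1.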
Answer: sizeof returns the size of the type in bytes.
Example: sizeof(char) is 100% guaranteed to be 1, but this does not mean that it's one octet (8 bits).
Proved by the standard, in 6.5.3.4, point 2:
The sizeof operator yields the size (in bytes) of its operand, which may be an expression or the parenthesized name of a type. The size is determined from the type of the operand. The result is an integer. If the type of the operand is a variable length array type, the operand is evaluated; otherwise, the operand is not evaluated and the result is an integer constant.
...
When applied to an operand that has type char, unsigned char, or signed char, (or a qualified version thereof) the result is 1. When applied to an operand that has array type, the result is the total number of bytes in the array. When applied to an operand that has structure or union type, the result is the total number of bytes in such an object, including internal and trailing padding.
Also, in Section 3.6, point 3:
A byte is composed of a contiguous sequence of bits, the number of which is implementation-defined.
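Both of the quoted guarantees can be checked at compile time. Here is a small illustration (the typedef names are mine); it uses the old negative-array-size trick so that it also works on pre-C11 compilers that have no _Static_assert.

#include <limits.h>

/* A negative array size is a constraint violation, so each typedef
   fails to compile if its condition is false. */
typedef char check_sizeof_char_is_one[(sizeof(char) == 1) ? 1 : -1];
typedef char check_byte_has_at_least_8_bits[(CHAR_BIT >= 8) ? 1 : -1];

int main(void)
{
    return 0;
}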
sizeof gives the size in bytes. However, note that "byte" is a technical term in the C standard, and is defined such that sizeof(char) == 1.
sizeof always returns the size as a number of bytes. But according to Wikipedia:
In the programming languages C and C++, the unary operator sizeof is used to calculate the size of any datatype, measured in the number of bytes required to represent the type. A byte in this context is the same as an unsigned char, and may be larger than 8 bits, although that is uncommon.
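So if you actually need a width in bits rather than in bytes, multiply the sizeof result by CHAR_BIT from <limits.h>. A minimal sketch (variable names are mine):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned long bytes_in_int = (unsigned long)sizeof(int);
    unsigned long bits_in_int  = bytes_in_int * CHAR_BIT;

    /* Typically prints "int: 4 bytes, 32 bits"; on a 16-bit-byte target the
       byte count would differ, but the bit count stays correct. */
    printf("int: %lu bytes, %lu bits\n", bytes_in_int, bits_in_int);
    return 0;
}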