Simply put, in C and its variants (unlike Java, which hides machine differences behind its virtual machine), the sizes of the primitive types can vary greatly between targets, and there is no guarantee that sizeof returns the size as a number of 8-bit octets. It returns the size in bytes, where a byte is however wide char happens to be on that platform. As Wikipedia puts it:
In the programming languages C and C++, the unary operator sizeof is used to calculate the size of any datatype, measured in the number of bytes required to represent the type. A byte in this context is the same as an unsigned char, and may be larger than 8 bits, although that is uncommon.
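A minimal sketch you can compile to check this on your own machine: print CHAR_BIT from <limits.h> next to a few sizeof results. The exact numbers below are just what a typical desktop would show, not a guarantee.

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* sizeof counts bytes, where one byte == sizeof(char) == 1 by
           definition, and a byte holds CHAR_BIT bits (at least 8). */
        printf("CHAR_BIT      = %d\n", CHAR_BIT);
        printf("sizeof(char)  = %zu\n", sizeof(char));  /* always 1 */
        printf("sizeof(short) = %zu\n", sizeof(short));
        printf("sizeof(int)   = %zu\n", sizeof(int));
        printf("sizeof(long)  = %zu\n", sizeof(long));
        return 0;
    }

On an ordinary PC this prints CHAR_BIT = 8, but on some DSPs char is 16 or 32 bits wide, in which case sizeof(int) can legitimately come out as 1.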