In C, there appear to be differences between the various values of zero -- NULL, NUL and 0.

The ASCII character NUL is the byte whose value is 0, so it always equals 0. NULL, on the other hand, is a null pointer constant: it is guaranteed to compare equal to 0, but the bit pattern a null pointer actually occupies in memory is not guaranteed to be all zeros -- that is architecture-dependent. Most implementations define NULL as 0 or ((void *)0).
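As a minimal sketch of that distinction (the variable name is just for illustration), comparing a pointer against NULL and against 0 behaves identically in source code, regardless of how the null pointer is represented in memory:

#include <stdio.h>

int main (void) {
    int *ptr = NULL;

    /* Both comparisons are guaranteed to be true: comparing a pointer
       against the constant 0 compares it against the null pointer,
       whatever bit pattern the null pointer actually uses. */
    if (ptr == NULL && ptr == 0) {
        printf ("a null pointer compares equal to both NULL and 0\n");
    }

    return 0;
}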
'\0' will always equal 0, because that is how byte 0 is encoded in a character literal.
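For example (a small sketch, using a made-up string purely for illustration), the terminator that ends every C string is exactly this zero byte:

#include <stdio.h>
#include <string.h>

int main (void) {
    char word[] = "abc";

    /* The array holds 'a', 'b', 'c', '\0'; the terminator is byte 0. */
    printf ("length: %zu\n", strlen (word));         /* prints 3 */
    printf ("terminator == 0: %d\n", word[3] == 0);  /* prints 1 */

    return 0;
}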
C compilers are not required to use ASCII, so '0' is not guaranteed to equal 48 -- in EBCDIC, for example, it is 240. Regardless, it's unlikely you'll ever encounter an alternative character set like EBCDIC unless you're working on very obscure systems.
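A sketch of why this rarely matters in practice: the standard does guarantee that the digits '0' through '9' have consecutive codes, so converting a digit character to its numeric value is portable without assuming ASCII (the helper below is purely illustrative):

#include <stdio.h>

/* Illustrative helper: converts a digit character to its numeric value.
   Works on ASCII and EBCDIC alike, because the C standard requires
   '0'..'9' to be contiguous and increasing. */
static int digit_value (char c) {
    return c - '0';
}

int main (void) {
    printf ("'0' has code %d on this system\n", '0'); /* 48 on ASCII, 240 on EBCDIC */
    printf ("'7' converts to %d\n", digit_value ('7'));
    return 0;
}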
The sizes of the various types differ between 32-bit and 64-bit systems -- pointers are wider than int on most 64-bit platforms -- but the integer values of these zeros are the same.
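For instance (a minimal sketch; the exact numbers depend on the platform's data model), on a typical LP64 system a pointer is wider than an int, even though 0 and NULL still compare equal:

#include <stdio.h>

int main (void) {
    /* Typical LP64 output: int is 4 bytes, long and pointers are 8. */
    printf ("sizeof(int)    = %zu\n", sizeof (int));
    printf ("sizeof(long)   = %zu\n", sizeof (long));
    printf ("sizeof(void *) = %zu\n", sizeof (void *));
    return 0;
}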
Some commenters have expressed doubt that NULL could compare equal to 0 and yet not be represented by all-zero bits. Here is an example program, along with the expected output on such a hypothetical system:
#include <stdio.h>

int main (void) {
    size_t ii;
    int *ptr = NULL;
    /* View the pointer's own storage as raw bytes. */
    const unsigned char *null_bytes = (const unsigned char *)&ptr;

    if (NULL == 0) {
        printf ("NULL == 0\n");
    }

    printf ("NULL = 0x");
    for (ii = 0; ii < sizeof (ptr); ii++) {
        printf ("%02X", null_bytes[ii]);
    }
    printf ("\n");

    return 0;
}
That program could print:
NULL == 0
NULL = 0x00000001