When terminating a string, it seems to me that char c = 0 is logically equivalent to char c = '\0', since the "null" (ASCII 0) byte has the value 0, but people usually write '\0' instead. Is this purely a matter of preference, or is it considered better practice? What is the preferred choice?
EDIT: K&R says: "The character constant '\0' represents the character with value zero, the null character. '\0' is often written instead of 0 to emphasize the character nature of some expression, but the numeric value is just 0."
From the ASCII control code chart (http://en.wikipedia.org/wiki/Ascii#ASCII_control_code_chart):

Binary    Oct   Dec   Hex   Abbr   Unicode   Control char   C escape code   Name
0000000   000   0     00    NUL    ␀         ^@             \0              Null character
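To see the equivalence in practice, here is a minimal sketch (my own example, not from the original question) showing that initializing a char with 0 and with '\0' produces the same value, and that either one terminates a string:

    #include <stdio.h>

    int main(void) {
        char a = 0;     /* numeric zero */
        char b = '\0';  /* character constant for the null character */

        printf("%d\n", a == b);   /* prints 1: same value */

        /* Terminating a buffer either way yields the same string. */
        char buf[4] = {'h', 'i'};
        buf[2] = 0;               /* equivalent to buf[2] = '\0'; */
        printf("%s\n", buf);
        return 0;
    }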
There's no difference, but the more idiomatic one is '\0'.

Putting it down as char c = 0; could mean that you intend to use it as a number (e.g. a counter). '\0' is unambiguous.
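For illustration, a small sketch (a hypothetical example, not part of the original answer) of the two readings: 0 for a counter, '\0' for a terminator:

    #include <stdio.h>

    int main(void) {
        /* Here 0 reads naturally as a starting count... */
        char count = 0;
        count++;

        /* ...while '\0' signals "this is the string terminator". */
        char name[8] = {'B', 'o', 'b'};
        name[3] = '\0';

        printf("%d %s\n", count, name);
        return 0;
    }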
'\0' is just an ASCII character, the same as 'A', '0', or '\n'.

If you write char c = '\0';, it's the same as char c = 0;. If you write char c = 'A';, it's the same as char c = 65.

It's just a character representation, and it's good practice to write it when you really mean the null byte of a string. Since char in C is a one-byte integral type, it doesn't have any special meaning.
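A short sketch (my own, assuming an ASCII execution character set) showing that character constants are just integer values:

    #include <stdio.h>

    int main(void) {
        printf("%d\n", '\0');      /* prints 0 */
        printf("%d\n", '\0' == 0); /* prints 1: same value */

        /* On an ASCII system 'A' has the value 65 (charset-dependent). */
        printf("%d\n", 'A');

        char c = 'A';
        char d = 65;               /* same as c on ASCII platforms */
        printf("%d\n", c == d);
        return 0;
    }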
The preferred choice is the one that lets people reading your code understand how you use your variable: as a number or as a character. Best practice is to use 0 when you mean the variable as a number and '\0' when you mean it as a character.
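As a usage sketch (my own example; the helper name my_strlen is made up), comparing against '\0' in a string loop makes the "walk until the terminator" intent explicit, even though comparing against 0 behaves identically:

    #include <stdio.h>

    /* Counts characters up to, but not including, the terminator. */
    static size_t my_strlen(const char *p) {
        size_t n = 0;
        while (*p != '\0') {   /* same behavior as: while (*p != 0) */
            ++p;
            ++n;
        }
        return n;
    }

    int main(void) {
        printf("%zu\n", my_strlen("hello"));  /* prints 5 */
        return 0;
    }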
Source: https://stackoverflow.com/questions/16955936/string-termination-char-c-0-vs-char-c-0