Question
The C/C++ standard reserves every identifier that begins with an underscore (in the global namespace; if the underscore is followed by an uppercase letter, the name is reserved everywhere) or that contains two or more adjacent underscores. Example:
int _myGlobal;                    // leading underscore at global scope
namespace _mine                   // leading underscore at global scope
{
    void Im__outta__control() {}  // adjacent double underscores
    int _LivingDangerously;       // underscore followed by an uppercase letter
}
But what if I just don't care? What if I decide to live dangerously and use these "reserved" identifiers anyway? Just how dangerously would I be living?
Have you ever actually seen a compiler or linker problem resulting from the use of reserved identifiers by user code?
The answers below, so far, amount to, "Why break the rules when doing so might cause trouble?" But imagine that you already had a body of code that broke the rules. At what point would the cost of trouble from breaking the rules outweigh the cost of refactoring the code to comply? Or what if a programmer had developed a personal coding style that called for wild underscores (perhaps by coming from another language, for instance)? Assuming that changing their coding style was more or less painful to them, what would motivate them to overcome the pain?
Or I could ask the same question in reverse. What is it concretely that C/C++ libraries are doing with reserved identifiers that a user is liable to fall afoul of? Are they declaring globals that might create name clashes? Functions? Classes? Each library is different, naturally, but how in general might this collision manifest?
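To preview one concrete mechanism (an answer below shows a real instance of it): implementation headers are allowed to #define reserved names as macros, and macros ignore scope entirely, so any user identifier with the same spelling is rewritten by the preprocessor before the compiler proper ever sees it. Here is a minimal sketch that deliberately fails to compile; the header contents and the name _Flag are invented for illustration:

// --- imagine this line lives deep inside a vendor or libc header you #include ---
#define _Flag 0x01          // hypothetical macro using a reserved name

// --- your code, written without knowing about that macro ---
struct Options {
    int _Flag;              // the preprocessor rewrites this to "int 0x01;"
};                          // so the compiler reports a syntax error on *your* line

The resulting diagnostic points at your declaration, not at the header that defined the macro, which is what makes these collisions so hard to recognize.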
I teach software students who come to me with these kinds of questions, and all I can tell them is, "It's against the rules." It's a superstitious, hand-waving answer. Moreover, in twenty years of C++ programming, I've never seen a compiler or linker error that resulted from breaking the reserved-identifier rules.
A good skeptic, faced with any rule, asks, "Why should I care?" So: why should I care?
Answer 1:
The results may vary depending on the specific compiler you use. As for the "danger level": every time you hit a bug, you will have to wonder whether it originates in your own logic or in the fact that you are not following the standard.
But that is not all. Suppose someone tells you, "It is perfectly safe!" and you go ahead on that assumption. When you do hit a bug, will that reassurance settle the matter, or will you still wonder whether there is a slight chance they were wrong? :)
So, you see, no matter which answer you get, it can never be a fully reassuring one. (Which is actually what makes me like your question.)
Answer 2:
I now care, because I just encountered a failure with underscores in a large, old codebase, mostly aimed at Windows and compiled with VS2005, though some of it is also cross-compiled to Linux. While analyzing an update to a newer gcc, I rebuilt some of it under cygwin just for ease of syntax checking. I got totally unintelligible errors (to my tiny brain) out of a line like:
template< size_t _N = 0 > class CSomeForwardRef;
That produced an error like:
error: expected ‘>’ before numeric constant
Google on that error turned up https://svn.boost.org/trac/boost/ticket/2245 and https://svn.boost.org/trac/boost/ticket/7203, both of which hinted that a stray #define could get in the way. Sure enough, an examination of the preprocessed source via -E and a hunt thru the include path turned up some bits-related .h (forget which) that defined _N. Later in that same odyssey I encountered a similar problem with _L.
Edit: Not bits-related but char-related: /usr/include/ctype.h -- here are some samples together with how ctype.h uses them:
#define _L 02
#define _N 04
/* ... */
#define isalpha(__c) (__ctype_lookup(__c)&(_U|_L))
#define isupper(__c) ((__ctype_lookup(__c)&(_U|_L))==_U)
#define islower(__c) ((__ctype_lookup(__c)&(_U|_L))==_L)
#define isdigit(__c) (__ctype_lookup(__c)&_N)
#define isxdigit(__c) (__ctype_lookup(__c)&(_X|_N))
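That excerpt explains the original error completely: macros are substituted before parsing, so the compiler never sees the identifier _N at all. Here is a minimal reproduction of the effect, assuming a platform (such as cygwin's newlib) whose ctype.h defines _N as shown above:

#include <ctype.h>   // on cygwin/newlib this pulls in the "#define _N 04" shown above

// What I wrote:
//   template< size_t _N = 0 > class CSomeForwardRef;
// What the compiler actually parses after preprocessing:
//   template< size_t 04 = 0 > class CSomeForwardRef;
// which yields: error: expected '>' before numeric constant
template< size_t _N = 0 > class CSomeForwardRef;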
I'll be scanning the source for all underscored identifiers and weeding out via rename all those we created in error ...
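One cheap way to check, on each target platform, whether a candidate underscored name is already claimed as a macro is to ask the preprocessor directly. A small sketch follows; the included headers and the list of probed names are of course project-specific, and the same information is visible in the -E output mentioned above, but #error makes the clash fail loudly at a known line:

// probe.cpp -- preprocess (or compile) this on each target platform
#include <ctype.h>      // or whichever headers the real codebase pulls in

#ifdef _N
#error "_N is already a macro on this platform"
#endif
#ifdef _L
#error "_L is already a macro on this platform"
#endif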
Jon
Answer 3:
Just how dangerously would I be living?
Dangerously enough to break your code in the next compiler upgrade.
Think of the future: your code might not be portable, and it might break later, because a future release of your implementation might introduce exactly the same symbol name that you are using.
Since the question has a pinch of "it can be wrong, but just how wrong can it be, and when has it ever actually gone wrong?" flavor, I think Murphy's law answers it rather aptly:
"Anything that can go wrong will go wrong (When you are least expecting it)".[#]
[#] The parenthetical "(When you are least expecting it)" is my invention, not Murphy's.
Answer 4:
If you try to build your code somewhere that actually has a conflict, you will see strange build errors, or worse, no build error at all and incorrect runtime behavior.
I have seen someone use a reserved identifier which had to be changed when it caused build problems on a new platform.
It's not all that likely, but there's no reason to do it.
Source: https://stackoverflow.com/questions/9996909/does-using-leading-underscores-actually-cause-trouble