Why is #define bad? [duplicate]


Question


Possible Duplicate:
When are C++ macros beneficial?
Why is #define bad and what is the proper substitute?

Someone told me that #define is bad. Honestly, I don't understand why it's bad. If it is bad, what should I use instead to do this?

#include <iostream>
#define stop() cin.ignore(numeric_limits<streamsize>::max(), '\n');

Answer 1:


#define is not inherently bad. However, there are usually better ways of doing what you want. Consider an inline function:

inline void stop() {
    cin.ignore(numeric_limits<streamsize>::max(), '\n');
}

(Really, you don't even need inline for a function like that. Just a plain ordinary function would work just fine.)
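For example, a complete little program built around that plain function (the extra includes and using directive are assumed here, since the original snippet omits them) could look like this:

#include <iostream>
#include <limits>
using namespace std;

void stop() {
    // discard everything left in the input buffer up to (and including) the next newline
    cin.ignore(numeric_limits<streamsize>::max(), '\n');
}

int main() {
    int n;
    cin >> n;   // e.g. the user types "42 trailing junk"
    stop();     // throw away " trailing junk\n" so the next read starts clean
    cout << n << '\n';
}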




Answer 2:


It's bad because it's indiscriminate: every occurrence of stop() in your code gets textually replaced, whether you meant it or not.

The way you solve it is by putting that code into its own method.




Answer 3:


In C++, using #define is not necessarily bad, although alternatives should be preferred. There are some contexts, such as include guards, in which there is no portable/standard alternative.

It should be avoided because the C preprocessor operates (as the name suggests) before the compiler. It performs simple textual replacement, without regard to other definitions. This means the resulting input to the compiler sometimes doesn't make sense. Consider:

// in some header file.
#define FOO 5

// in some source file.
int main ()
{
    // pre-compiles to: "int 5 = 2;"
    // the compiler will vomit a weird compiler error.
    int FOO = 2;
}

This example may seem trivial, but real examples exist. Some Windows SDK headers define:

#define min(a,b) ((a<b)?(a):(b))

And then code like:

#include <Windows.h>
#include <algorithm>
int main ()
{
    // pre-compiles to: "int i = std::((1<2)?(1):(2));"
    // the compiler will vomit a weird compiler error.
    int i = std::min(1, 2); 
}

When there are alternatives, use them. In the posted example, you can easily write:

void stop() {
    cin.ignore(numeric_limits<streamsize>::max(), '\n');
}

For constants, use real C++ constants:

// instead of
#define FOO 5

// prefer
static const int FOO = 5;

This guarantees that your compiler sees the same thing you do, and gives you the expected name hiding in nested scopes (a local FOO variable will hide the global FOO).
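A minimal sketch of that scoping behaviour (the function names here are just for illustration):

static const int FOO = 5;     // file-scope constant, visible to the compiler and the debugger

int local_value() {
    int FOO = 2;              // legal: this FOO hides the global constant inside the function
    return FOO;               // returns 2
}

int global_value() {
    return FOO;               // returns 5; the global constant is unaffected
}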




Answer 4:


It's not necessarily bad; it's just that most things people have used it for in the past can be done in a much better way.

For example, that snippet you provide (and other code macros) could be an inline function, something like (untested):

static inline void stop (void) {
    cin.ignore(numeric_limits<streamsize>::max(), '\n');
}

In addition, there are all the other cases where code macros force you into "macro gymnastics", such as calling the very badly written:

#define f(x) x * x * x + x

with:

int y = f (a + 1);  // a + 1 * a + 1 * a + 1 + a + 1, i.e. 4a + 2, not (a+1)^3 + (a+1)
int z = f (a++);    // a++ * a++ * a++ + a++

The first of those will totally surprise you with its results due to the precedence of operators, and the second will give you undefined behaviour. Inline functions do not suffer these problems.
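For comparison, here is a sketch of the same computation as an inline function, where the argument is evaluated exactly once before the body runs (the value of a is only illustrative):

inline int f(int x) {
    return x * x * x + x;   // x is a real parameter, so precedence and side effects behave normally
}

int main() {
    int a = 3;
    int y = f(a + 1);       // (3+1)^3 + (3+1) = 68, as intended
    int z = f(a++);         // a is incremented exactly once; f receives the old value 3, so z = 30
    return 0;
}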

The other major thing that macros are used for is providing enumerated values, such as:

#define ERR_OK    0
#define ERR_ARG   1
// ...
#define ERR_MEM  99

and these are better done with enumerations.
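A minimal sketch of the same error codes as an enumeration (the intermediate values elided above are left elided here too):

enum ErrorCode {
    ERR_OK  = 0,
    ERR_ARG = 1,
    // ...
    ERR_MEM = 99
};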

The main problem with macros is that the substitution is done early in the translation phase, and information is often lost because of this. For example, a debugger generally doesn't know about ERR_ARG since it would have been substituted long before the part of the translation process that creates debugging information.

But, having maligned them enough, they're still useful for defining simple preprocessor symbols that can be used for conditional compilation. That's pretty much all I use them for in C++ nowadays.
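That use looks roughly like this; VERBOSE_BUILD is just an illustrative flag name, typically set from the build system rather than in the source:

#include <iostream>

// a simple flag meant only for the preprocessor; typically passed as -DVERBOSE_BUILD=1
#define VERBOSE_BUILD 1

int main() {
#if VERBOSE_BUILD
    std::cout << "verbose diagnostics enabled\n";   // compiled in only when the flag is set
#endif
}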




Answer 5:


#define by itself is not bad, but it does have some bad properties. I'll list a few that I know of:

"Functions" do not act as expected.

The following code seems reasonable:

#define getmax(a,b) (a > b ? a : b)

...but what happens if I call it as such?:

int a = 5;
int b = 2;
int c = getmax(++a,b);    // c equals 7.

No, that is not a typo: c will be equal to 7, because the macro pastes ++a into both the condition and the selected branch, so a gets incremented twice. If you don't believe me, try it. That alone should be enough to scare you.
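For reference, after textual substitution the compiler sees roughly the following, which is why a ends up incremented twice:

int a = 5;
int b = 2;
int c = (++a > b ? ++a : b);   // first ++a makes a 6 (and 6 > 2 is true),
                               // then the second ++a makes a 7, so c == 7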

The preprocessor is inherently global

Whenever you use a #define to create a function-like macro (such as stop()), it applies to ALL code that follows it, in every file included afterwards.

What this means is that you can actually change the behaviour of libraries you did not write: as long as a header you include later uses a function named stop(), the macro rewrites code you never touched and never modified.

Debugging is more difficult.

The preprocessor does symbolic replacement before the code ever makes it to the compiler. Thus if you have the following code:

#define NUM_CUSTOMERS 10
#define PRICE_PER_CUSTOMER 1.10

...

double something = NUM_CUSTOMERS * PRICE_PER_CUSTOMER;

if there is an error on that line, then you will NOT see the convenient variable names in the error message, but rather will see something like this:

double something = 10 * 1.10;

So that makes it more difficult to find things in code. In this example, it doesn't seem that bad, but if you really get into the habit of doing it, then you can run into some real headaches.



Source: https://stackoverflow.com/questions/7562278/why-define-is-bad
