Why are preprocessor macros evil and what are the alternatives?

面向向阳花 2020-11-22 02:46

I have always asked this but I have never received a really good answer; I think that almost any programmer, before even writing the first "Hello World", has encountered a phrase like "macros should never be used" or "macros are evil". My question is: why? With the new C++11, is there a real alternative after so many years?

8 Answers
  • 2020-11-22 02:48

    The saying "macros are evil" usually refers to the use of #define, not #pragma.

    Specifically, the expression refers to these two cases:

    • defining magic numbers as macros

    • using macros to replace expressions

    with the new C++11 is there a real alternative after so many years?

    Yes, for the items in the list above: magic numbers should be defined as const/constexpr constants, and expressions should be defined as (normal/inline/template/inline template) functions.
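
    As a minimal sketch of the first half of that (BUFFER_SIZE and kBufferSize are illustrative names, not from the question):

    #include <cstddef>

    // Macro version: no type, no scope, invisible to the debugger.
    #define BUFFER_SIZE 4096

    // C++11 alternative: a typed, scoped constant.
    constexpr std::size_t kBufferSize = 4096;

    int main()
    {
        static_assert(kBufferSize == BUFFER_SIZE, "same value, but kBufferSize carries a type");
        return 0;
    }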

    Here are some of the problems introduced by defining magic numbers as macros and replacing expressions with macros (instead of defining functions to evaluate those expressions):

    • when defining macros for magic numbers, the compiler retains no type information for the defined values. This can cause compilation warnings (and errors) and confuse people debugging the code.

    • when defining macros instead of functions, programmers using that code expect them to work like functions and they do not.

    Consider this code:

    #define max(a, b) ( ((a) > (b)) ? (a) : (b) )
    
    int a = 5;
    int b = 4;
    
    int c = max(++a, b);
    

    You would expect a and c to be 6 after the assignment to c (as they would be if you used std::max instead of the macro). Instead, the code performs:

    int c = ( ((++a) > (b)) ? (++a) : (b) ); // after this, c = a = 7
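
    For contrast, a small runnable sketch of the same snippet using std::max, whose arguments are evaluated exactly once:

    #include <algorithm>
    #include <iostream>

    int main()
    {
        int a = 5;
        int b = 4;
        int c = std::max(++a, b);           // ++a is evaluated exactly once
        std::cout << a << ' ' << c << '\n'; // prints "6 6"
    }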
    

    On top of this, macros do not respect namespaces, which means that defining a macro in your code limits the names that client code can use.

    This means that if you define the macro above (for max), you will no longer be able to #include <algorithm> in any of the code below, unless you explicitly write:

    #ifdef max
    #undef max
    #endif
    #include <algorithm>
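
    As an aside, if you cannot remove the clashing macro, a commonly used workaround is to parenthesize the function name at the call site, because a function-like macro only expands when its name is immediately followed by a parenthesis. A small sketch:

    #include <algorithm>

    // The problematic macro, defined after the header so the header itself still parses.
    #define max(a, b) ( ((a) > (b)) ? (a) : (b) )

    int main()
    {
        int x = (std::max)(3, 5); // extra parentheses keep the function-like macro from expanding
        return x - 5;             // 0
    }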
    

    Having macros instead of variables / functions also means that you cannot take their address:

    • if a macro-as-constant evaluates to a magic number, you cannot pass it by address

    • for a macro-as-function, you cannot use it as a predicate or take the function's address or treat it as a functor.
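
    A brief sketch of what becomes possible once the macro is replaced by a typed constant and a real function (kLimit and clamp_to_limit are illustrative names):

    #include <algorithm>
    #include <vector>

    constexpr int kLimit = 1024;        // typed constant instead of a #define

    inline int clamp_to_limit(int n)    // real function instead of a macro
    {
        return n > kLimit ? kLimit : n;
    }

    int main()
    {
        const int* p = &kLimit;         // a constant has an address; a macro value does not
        std::vector<int> v{100, 5000, 200};
        std::transform(v.begin(), v.end(), v.begin(),
                       clamp_to_limit); // a function can be passed by name as a functor/predicate
        return (*p == 1024 && v[1] == 1024) ? 0 : 1;
    }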

    Edit: As an example, the correct alternative to the #define max above:

    template<typename T>
    inline T max(const T& a, const T& b)
    {
        return a > b ? a : b;
    }
    

    This does everything the macro does, with one limitation: if the types of the arguments are different, the template version forces you to be explicit (which actually leads to safer, more explicit code):

    int a = 0;
    double b = 1.;
    max(a, b);
    

    If this max is defined as a macro, the code will compile (with a warning).

    If this max is defined as a template function, the compiler will point out the ambiguity, and you have to say either max<int>(a, b) or max<double>(a, b) (and thus explicitly state your intent).
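
    A short, self-contained sketch of those explicit forms (restating the template max from above):

    template<typename T>
    inline T max(const T& a, const T& b) { return a > b ? a : b; }

    int main()
    {
        int    a = 0;
        double b = 1.;
        // max(a, b);                   // error: deduction is ambiguous (T = int vs. T = double)
        double c = max<double>(a, b);   // a is converted to double
        int    d = max<int>(a, b);      // b is truncated to int, visibly and deliberately
        return (c == 1.0 && d == 1) ? 0 : 1;
    }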

  • 2020-11-22 02:50

    I don't think that there is anything wrong with using preprocessor definitions or macros as you call them.

    They are a (meta-)language feature of C/C++ and, like any other tool, they can make your life easier if you know what you are doing. The trouble with macros is that they are processed before your C/C++ code is compiled, and the code they generate can be faulty and cause compiler errors that are anything but obvious. On the bright side, they can help you keep your code clean and save you a lot of typing when used properly, so it comes down to personal preference.
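
    A small sketch of how such an error shows up (SQUARE is an illustrative macro, not from the answer): the diagnostic refers to the expanded text at the call site, not to the macro definition.

    #define SQUARE(x) ((x) * (x))

    int main()
    {
        int n = SQUARE(1 + 2);   // expands to ((1 + 2) * (1 + 2)) == 9: fine here
        // const char* s = "hi";
        // int bad = SQUARE(s);  // would expand to ((s) * (s)); the resulting compiler error
        //                       // is reported at this line, in expanded form, not at the macro
        return n - 9;            // 0
    }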

  • 2020-11-22 02:52

    Macros in C/C++ can serve as an important tool for version control. The same code can be delivered to two clients with a minor change in macro configuration. I use things like:

    #define IBM_AS_CLIENT
    #ifdef IBM_AS_CLIENT 
      #define SOME_VALUE1 X
      #define SOME_VALUE2 Y
    #else
      #define SOME_VALUE1 P
      #define SOME_VALUE2 Q
    #endif
    

    This kind of functionality is not so easily achieved without macros. Macros are actually a great software configuration management tool, not just a way to create shortcuts for reusing code. Defining functions as macros for the sake of reusability, on the other hand, can definitely create problems.
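
    A concrete sketch of the same idea, with the client selection supplied on the compiler command line (flag and values are illustrative):

    // Build for the IBM client:   g++ -DIBM_AS_CLIENT -o app main.cpp
    // Build the default release:  g++ -o app main.cpp
    #include <iostream>

    #ifdef IBM_AS_CLIENT
      #define MAX_CONNECTIONS 128
      #define GREETING "IBM build"
    #else
      #define MAX_CONNECTIONS 32
      #define GREETING "default build"
    #endif

    int main()
    {
        std::cout << GREETING << ", max connections: " << MAX_CONNECTIONS << '\n';
    }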

  • 2020-11-22 02:54

    I think the problem is that macros are expanded before the compiler ever sees them, which makes them "ugly" to read and hard to debug.

    Often a good alternative is a generic (template) function and/or an inline function.
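
    A minimal sketch of that alternative (AREA_MACRO and area are illustrative names):

    // Macro: expanded textually, not type-checked, hard to step through in a debugger.
    #define AREA_MACRO(w, h) ((w) * (h))

    // Inline function: type-checked, debuggable, and equally cheap once the compiler inlines it.
    inline int area(int w, int h) { return w * h; }

    int main()
    {
        int a = area(3, 4);       // 12
        int b = AREA_MACRO(3, 4); // also 12, but only after textual expansion
        return a - b;             // 0
    }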

  • 2020-11-22 03:00

    A common pitfall is this:

    #define DIV(a,b) a / b
    
    printf("25 / (3+2) = %d", DIV(25,3+2));
    

    It will print 10, not 5, because the preprocessor will expand it this way:

    printf("25 / (3+2) = %d", 25 / 3 + 2);
    

    This version is safer:

    #define DIV(a,b) (a) / (b)
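
    The fully parenthesized form is ((a) / (b)); without the outer parentheses, something like 100 / DIV(10, 5) still expands to 100 / (10) / (5), which is 2 instead of 50. A small sketch of the function alternative, which removes the problem entirely:

    #include <cstdio>

    // A function evaluates its arguments as ordinary expressions,
    // so no defensive parentheses are needed in the body or at the call site.
    inline int div_int(int a, int b) { return a / b; }

    int main()
    {
        std::printf("25 / (3+2) = %d\n", div_int(25, 3 + 2)); // prints 5
        return 0;
    }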
    
  • 2020-11-22 03:01

    Preprocessor macros are not evil when they are used for intended purposes like:

    • Creating different releases of the same software using #ifdef-style constructs, for example releases of Windows for different regions.
    • Defining values related to code testing.

    Alternatives: one can use configuration files in INI, XML, or JSON format for similar purposes, but reading them affects the program at run time, which a preprocessor macro avoids.
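
    A small sketch of that trade-off: the compile-time choice is fixed in the binary, while the run-time alternative pays a small cost for reading the configuration (flag, file name, and values are illustrative):

    #include <fstream>
    #include <iostream>
    #include <string>

    // Compile-time choice: fixed when the binary is built, e.g. with g++ -DEU_RELEASE.
    #ifdef EU_RELEASE
      #define DEFAULT_LOCALE "de_DE"
    #else
      #define DEFAULT_LOCALE "en_US"
    #endif

    // Run-time alternative: read the value from a configuration file on startup.
    std::string locale_from_config()
    {
        std::ifstream cfg("app.cfg");
        std::string locale;
        if (cfg >> locale)
            return locale;     // value chosen at run time, at a small I/O cost
        return DEFAULT_LOCALE; // fall back to the compile-time default
    }

    int main()
    {
        std::cout << "locale: " << locale_from_config() << '\n';
    }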
