Why does #define not require a semicolon?

无人及你 2020-11-27 21:03

I was writing some test code in C. By mistake I had inserted a ; after a #define, which gave me errors. Why is a semicolon not required for #define statements?

7 Answers
  • 2020-11-27 21:52
    #define MAX_STRING 256;
    

    means:

    whenever the preprocessor finds MAX_STRING, replace it with 256;. In your case it turns method 2 into:

    #include <stdio.h>
    #include <stdlib.h>
    #define MAX_STRING 256;
    
    int main(void) {
        char buffer [256;];
    }
    

    which isn't valid syntax. Replace

    #define MAX_STRING 256;
    

    with

    #define MAX_STRING 256
    

    The difference between your two snippets is that in the first method you declare a constant equal to 256, while in the second you define MAX_STRING to stand for the text 256; in your source file.
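    As an illustrative sketch of that difference (the name MAX_STRING_CONST is invented here just for the comparison):

    ```c
    #include <stdio.h>

    /* Method 1: a typed constant -- the semicolon ends a declaration. */
    const int MAX_STRING_CONST = 256;

    /* Method 2: a preprocessor macro -- no semicolon, pure text substitution. */
    #define MAX_STRING 256

    int main(void) {
        char buffer[MAX_STRING];  /* expands to: char buffer[256]; */
        printf("%zu %d\n", sizeof buffer, MAX_STRING_CONST);  /* prints: 256 256 */
        return 0;
    }
    ```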

    The #define directive is used to define values or macros that are used by the preprocessor to manipulate the program source code before it is compiled. Because preprocessor definitions are substituted before the compiler acts on the source code, any errors that are introduced by #define are difficult to trace.

    The syntax is:

    #define CONST_NAME VALUE
    

    If there is a ; at the end, it is considered part of VALUE.

    To understand how exactly #defines work, try defining:
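    A minimal sketch of that substitution (the macro names WITH_SEMI and NO_SEMI are invented for illustration): the trailing semicolon simply travels along with the replacement text.

    ```c
    #include <stdio.h>

    #define WITH_SEMI 10;   /* VALUE is the text "10;" */
    #define NO_SEMI   10    /* VALUE is the text "10"  */

    int main(void) {
        int a = NO_SEMI;    /* expands to: int a = 10; */
        int b = WITH_SEMI   /* expands to: int b = 10; -- note: no ; written here */
        printf("%d\n", a + b);  /* prints: 20 */
        return 0;
    }
    ```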

    #define FOREVER for(;;)
    ...
        FOREVER {
             /* perform something forever. */
        }
    
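    As a runnable sketch of the FOREVER macro (a break and counter are added here so the program actually terminates):

    ```c
    #include <stdio.h>

    #define FOREVER for(;;)   /* expands to an infinite loop header */

    int main(void) {
        int count = 0;
        FOREVER {             /* becomes: for(;;) { */
            if (++count == 3)
                break;        /* escape the loop so the program ends */
        }
        printf("%d\n", count);  /* prints: 3 */
        return 0;
    }
    ```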

    Interesting remark by John Hascall:

    Most compilers will give you a way to see the output after the preprocessor phase, this can aid with debugging issues like this.

    In gcc it can be done with the -E flag.
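    For example (assuming the buggy define lives in a file called demo.c), a sketch of inspecting the preprocessor output:

    ```shell
    # Write a tiny file containing the buggy define.
    cat > demo.c <<'EOF'
    #define MAX_STRING 256;
    char buffer[MAX_STRING];
    EOF

    # -E stops gcc after the preprocessing stage and prints the expanded source.
    gcc -E demo.c
    # The expansion makes the stray semicolon visible:
    #   char buffer[256;];
    ```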
