Why doesn't GCC optimize this call to printf?

Asked by 萌比男神i on 2021-01-17 10:12
#include <stdio.h>
int main(void) {
    int i;
    scanf("%d", &i);
    if(i != 30) { return(0); }
    printf("i is equal to %d\n", i);
}
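
For contrast, GCC is known to fold a printf call whose format string contains no conversion specifiers and ends in a newline into puts; a minimal sketch of that case (assuming a typical GCC build with the printf builtin enabled and optimization on):

#include <stdio.h>

int main(void) {
    /* No conversion specifiers and a trailing newline:
       GCC typically rewrites this call as puts("hello world"). */
    printf("hello world\n");
    return 0;
}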

3 answers
  •  说谎 (2021-01-17 10:48)

    Not sure if this is a convincing answer, but I would expect that compilers shouldn't optimize the printf("%d\n", 10) case to puts("10").

    Why? Because this case could be more complicated than you think. Here are some of the problems I can think of at the moment:

    1. Converting binary numbers to ASCII increases the size of the string literal, and thus overall code size. This is irrelevant for small numbers, but with something like printf("some number: %d", 10000), i.e. 5 digits or more (assuming int is 32-bit), the growth of the string literal outweighs the space saved for the integer argument, and some people could consider that a drawback. Yes, the conversion saves a "push to stack" instruction, but how many bytes that instruction takes and how much is saved is architecture-specific. It's non-trivial for a compiler to decide whether the trade-off is worth it.

    2. Padding, if used in the format, can also increase the size of the expanded string literal. Example: printf("some number: %10d", 100)

    3. Sometimes the developer shares a single format string among several printf calls, precisely for code-size reasons:

      printf("%-8s: %4d\n", "foo", 100);
      printf("%-8s: %4d\n", "bar", 500);
      printf("%-8s: %4d\n", "baz", 1000);
      printf("%-8s: %4d\n", "something", 10000);
      

      Expanding each call into its own string literal could lose that size advantage.

    4. For %f, %e, and %g there is a further problem: the decimal point "." is locale-dependent, so the compiler cannot expand such calls into string constants for you; see the sketch below. Although we are only discussing %d, I mention this here for completeness.
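
    A minimal sketch of the locale issue; the locale name "de_DE.UTF-8" is only an assumed example and may not be installed on a given system:

      #include <locale.h>
      #include <stdio.h>

      int main(void) {
          printf("%.2f\n", 3.14);           /* "C" locale: prints 3.14 */
          /* Switch the numeric locale; whether this name exists varies by system. */
          if (setlocale(LC_NUMERIC, "de_DE.UTF-8") != NULL)
              printf("%.2f\n", 3.14);       /* may now print 3,14 */
          return 0;
      }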
