Does using large libraries inherently make slower code?

情深已故 2021-02-06 20:40

I have a psychological tic which makes me reluctant to use large libraries (like GLib or Boost) in lower-level languages like C and C++. In my mind, I think:

17 Answers
  •  醉酒成梦
    2021-02-06 21:23

    It depends on how the linker works. Some linkers are lazy and will include all the code in the library. More efficient linkers will extract only the needed code from a library. I have had experience with both types.

    Smaller libraries cause fewer worries with either type of linker. The worst case with a small library is a small amount of unused code. On the other hand, many small libraries may increase the build time. The trade-off is build time vs. code space.

    An interesting test of the linker is the classic Hello World program:

    #include <stdio.h>   /* printf */
    #include <stdlib.h>  /* EXIT_SUCCESS */
    int main(void)
    {
      printf("Hello World\n");
      return EXIT_SUCCESS;
    }
    

    The printf function has a lot of dependencies, due to all the formatting it may need to do. A lazy but fast linker may include an entire "standard library" to resolve all the symbols. A more efficient linker will include only printf and its dependencies, which makes linking slower.

    The above program can be compared to this one, which uses puts:

    #include <stdio.h>   /* puts */
    #include <stdlib.h>  /* EXIT_SUCCESS */
    int main(void)
    {
      puts("Hello World"); /* no "\n" needed: puts appends one itself */
      return EXIT_SUCCESS;
    }
    

    Generally, the puts version should be smaller than the printf version, because puts does no formatting and thus has fewer dependencies. A lazy linker will produce the same code size for both programs.
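
    Taking the comparison one step further: if the program bypasses stdio entirely, even a lazy linker has very little left to pull in. A minimal sketch, assuming a POSIX system where the write() system call is available:

    #include <unistd.h>  /* write() -- POSIX, not standard C */
    #include <stdlib.h>
    int main(void)
    {
      /* Write straight to file descriptor 1 (stdout): none of
         stdio's buffering or formatting machinery is required. */
      write(1, "Hello World\n", 12);
      return EXIT_SUCCESS;
    }

    With static linking, this version typically drags in the least library code of the three, although the exact savings depend on the C library.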

    In summary, the cost of using a library depends mostly on the linker, specifically on how efficiently it extracts code. When in doubt, many small libraries rely less on the linker's efficiency, but they make the build process more complicated and slower.
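
    If you want to check this on your own toolchain, one approach (assuming GCC on Linux) is to link each program statically, e.g. gcc -static -fno-builtin hello.c -o hello, and compare the results with size. The -fno-builtin flag stops GCC from quietly rewriting the printf call as puts itself, and running nm on the static binaries shows which library objects each one dragged in.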
