How should I detect compile-time bottlenecks in a large C++ project?

暖寄归人 2021-02-18 15:41

I want to reduce the compile time of a large C++ project. I have tried precompiled headers, interfaces, etc. But before I move on, I want to know whether there is any tool that helps detect such compile-time bottlenecks.

4 answers
  • 2021-02-18 15:55

    You could look into unity builds.
    Basically, you include all of your .cpp files into one .cpp file and compile only that single file. I've tested it on a big project and it was really effective.
    It works because it does far less I/O and redundant parsing: all of your headers are included once for the whole build instead of once per .cpp file.
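
    As a minimal sketch (the file names are hypothetical, and in practice such a file is usually generated by the build system -- recent CMake can do this via the UNITY_BUILD target property), a unity build boils down to:

        // unity.cpp -- the only file handed to the compiler; it pulls in
        // every project source so shared headers are parsed a single time.
        #include "foo.cpp"   // hypothetical project sources
        #include "bar.cpp"
        #include "baz.cpp"

    You then compile unity.cpp alone instead of compiling foo.cpp, bar.cpp and baz.cpp separately.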

    Now we don't use unity builds anymore because we all got an SSD hardware upgrade, and those are just awesome.

    Here's a related SO question about Unity builds: #include all .cpp files into a single compilation unit?

  • 2021-02-18 15:57

    I am not aware of any tool for improving compile time, but I can suggest a few manual remedies (consider this a comment):

    1. Have #include guards in every header file, so that multiple inclusions don't cause any issues (see the sketch after this list)
    2. Avoid putting member function and inline function bodies directly into header files; just have their declarations there
    3. Check that there are no unnecessary template functions and classes; remember that templates are implicitly inline. Too much template code/metaprogramming causes huge compilation times.
    4. If the number of #defines is unnecessarily high, they lengthen the preprocessing stage, which ultimately increases the compilation time
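
    A minimal sketch of point 1 (the guard macro and header name are hypothetical; #pragma once is a widely supported, though non-standard, alternative), which also shows the declaration-only style of point 2:

        // widget.h -- hypothetical header, protected against multiple inclusion
        #ifndef MYPROJECT_WIDGET_H
        #define MYPROJECT_WIDGET_H

        class Widget {
        public:
            void draw();   // declaration only; the body lives in widget.cpp
        };

        #endif // MYPROJECT_WIDGET_H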
  • 2021-02-18 16:10

    C++ not being modular (yet), compilation bottlenecks are often due to include issues; that is, including too many files that are not needed. It is also possible that those includes are needed at the moment, but could become superfluous with some simple reengineering.

    • To detect superfluous includes, you can check out include-what-you-use; the only issue you'll have is that it works on top of Clang, so you'll need some setup there.
    • Otherwise, you need to review your code, and specifically the headers.

    Since the tool is self-sufficient and documented, let me expand a bit on the review process.

    1. Any header that has more than a couple of #include directives is highly suspicious.
    2. Conversely, if a source file is chock-full of various types and functions but only has a couple of includes, it probably means that one of those headers brings in too much.

    If you have trouble knowing what is required, what is not, and how to remove superfluous headers, I recommend reading Pimpls - Beauty Marks You Can Depend On; if you do not know what a Pimpl is, read Compilation Firewalls. I would advise caution though: Pimpl has a runtime and maintenance cost, so only use it where it really is necessary. Personally, I would absolutely recommend it in the public headers of a library you deliver to third parties (ABI compatibility), and otherwise try to avoid it.
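
    To make the idea concrete, here is a minimal Pimpl sketch (class and member names are hypothetical); the header only forward-declares the implementation, so the expensive includes stay out of every client's translation unit:

        // engine.h -- hypothetical public header; clients only ever see this part
        #include <memory>

        class Engine {
        public:
            Engine();
            ~Engine();                    // defined where Impl is complete
            void run();
        private:
            struct Impl;                  // forward declaration only
            std::unique_ptr<Impl> pimpl_;
        };

        // engine.cpp -- the heavy includes and the real state live here
        #include <vector>                 // stand-in for an expensive dependency

        struct Engine::Impl {
            std::vector<int> state;
        };

        Engine::Engine() : pimpl_(std::make_unique<Impl>()) {}
        Engine::~Engine() = default;
        void Engine::run() { pimpl_->state.push_back(1); }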

    If manual inspection is not your forte, you can generate the preprocessor output for each header (do not worry too much about source files) and check the bigger outputs; with gcc or clang, the -E flag gives you that output.

  • One approach I like is to review the preprocessor output of a few of your sources -- just read some of it from the compiler's perspective rather than through the somewhat abstracted representation that #inclusion gives you. Chances are you will find some large chunks of includes/libraries that you don't need and whose presence (or need) you weren't necessarily aware of. From there, decide which dependencies can be removed. Even if your dependencies are all correct, large outputs can also suggest how you might divide larger modules into smaller pieces.
