Why not include all the standard headers always?

Asked by 攒了一身酷 on 2021-01-03 19:02

I am reading Herb Sutter's More Exceptional C++, and Item 37 on forward declarations says:

Never #include a […]

7 Answers
  • 2021-01-03 19:34

    Slower compilation time

    This is the main reason.

    Even with precompiled headers the compiler has to do lots more work to include every single declaration in the standard library in every single translation unit of your project.

    If you have a large project consisting of hundreds of files, the compiler will be invoked hundreds of times, and each invocation must reload the entire standard library into memory.

    The compiler will use more memory to store all the declarations and will have to check against a larger set of names when doing name lookup (although a decent hash table implementation in the compiler should mean that doesn't significantly affect lookup time, only memory usage.)

  • 2021-01-03 19:36

    On your system, it might not cause much of a slowdown, but someone else might have a different experience.

    In the long run, computers will keep getting faster and compilers more efficient. In most small projects, the developer time spent obsessing over header files outweighs the incremental compile time it saves.

    But for an implementation that doesn't precompile or cache headers, that cost is multiplied across all the source files, which hurts the speed of non-incremental (full) builds.

    So for a library that is used from many sources or distributed across different platforms, it may still be a good idea to prune includes every so often, and certainly before making a public release.

  • 2021-01-03 19:37

    This advice is some ten years old, and somewhat dated by now. Computers are hundreds of times faster, storage has gone from gigabytes to terabytes, and vast amounts of memory sit around idle.

    So advice from the past can degrade. You're on a good track asking for reasons, and on a better one still if you eventually run some experiments and form your own opinion.

    Herb's item is far more general than your question. C++ (somewhat unfortunately) uses a file (translation unit) model for compilation -- not a source-code repository/database. (Those who tried IBM's Visual Age C++ know how cool that was. ;) ) A consequence is that a lot of stuff gets packed together: include files are not one-liners.

    So when you need to include a header for a single declaration of something, you happen to drag in a lot of other things.

    And those other things, just to compile, may drag in yet more, and so on recursively. So if you can avoid one inclusion, you save not one line but thousands, and not one file but maybe a few dozen. A good economy. Also, a rebuild is needed whenever any of those files changes, even though the changes are likely irrelevant to your code.

    Say your header uses pointers to ten different classes, and you include all ten headers defining them instead of just forward-declaring them with 'class'. Then any client that uses only a few of them still gets all ten dragged in as dependencies. Not economic. In a project I worked on a few years ago, gtk++ was used: the .cpp files had just a few hundred lines each, but the preprocessor output was 800k to over a million lines. No kidding. You do pay a price in a little redundancy: the thing may be a class today but something else tomorrow (say, a typedef for a template). The _fwd.h idea mitigates that, but it really just centralizes the redundancy. In practice we seek some balance in the tradeoffs.
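
    A minimal sketch of that forward-declaration pattern (all names here are hypothetical):

```cpp
// manager.h (hypothetical): Widget is used only through a pointer,
// so a forward declaration suffices -- clients of this header do not
// inherit whatever widget.h itself would drag in.
class Widget;   // forward declaration instead of #include "widget.h"

class Manager {
public:
    void attach(Widget* w) { current_ = w; }      // storing a pointer needs no definition
    Widget* current() const { return current_; }  // returning one doesn't either
private:
    Widget* current_ = nullptr;
};
```

    Only the .cpp files that actually call into Widget's members need to include its full definition.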

    But none of these ideas apply to things that are "stable" and ubiquitously used. In projects that put std:: to heavy and natural use, you will see <vector> and many other headers included in every single source file. Because they are used. And if some refactoring removed the last use of vector today, it would likely grow back tomorrow.

    Here the choice is really just about where the inclusion happens, and the economy works the other way around. Setting up a "common" header used by everything removes a lot of noise from the other files -- especially if the system has support for that case. In VS you have:

    • precompiled headers, which let the common material be compiled once and the result shared across the other TUs
    • forced include, which lets the common header be specified once in the project rather than in every source file
    • property sheets, which you include in the project file instead of applying those settings manually

    With such support it may be perfectly feasible to put many, even most, of the standard headers into that common file, along with some using-declarations for vector and other common names.
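
    A sketch of what such a common header could look like (the name common.h and the exact contents are made up for illustration; it would be precompiled, or force-included via /FI in MSVC or -include in GCC/Clang):

```cpp
// common.h (hypothetical name): compiled once as a precompiled header,
// or force-included into every translation unit by the build system.
#ifndef COMMON_H
#define COMMON_H

#include <string>
#include <vector>
#include <map>
#include <memory>

// using-declarations for ubiquitous names, as discussed above:
using std::string;
using std::vector;

#endif // COMMON_H
```

    Every source file can then use string and vector unqualified without any local includes.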

    Those who say you drag in many names that may cause conflicts are right -- but for practice they are also wrong, because eventually someone will include that extra header anyway, and if the conflict exists it will topple the boat. Unless using std:: is forbidden in a project, I say it is simply bad practice to reuse its common names for a different purpose. If someone wants to check in code with his own class string, claiming it is perfectly distinct from std::string thanks to the prefix, I call 'over my dead body'. For rare names, though, sorting out an accidental clash is no big deal.
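
    A small sketch of the kind of clash meant here (myproj is a made-up name):

```cpp
#include <string>

// Hypothetical project type that reuses a standard name -- the
// practice argued against above:
namespace myproj { class string {}; }

using namespace std;
using namespace myproj;

// string s;               // would NOT compile here: 'string' is ambiguous
std::string ok = "fine";   // once the clash exists, every use must be qualified
myproj::string own;        // ...on both sides
```

    The code compiles only as long as nobody writes the unqualified name -- exactly the boat-toppling scenario described above.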

    And what counts as a good balance changes from project to project, and even within a project as time passes.

  • 2021-01-03 19:43

    When a header changes, the programs affected by it change. Changed programs need to be re-tested. Being selective minimizes exposure, and hence testing.

  • 2021-01-03 19:47

    There is in principle nothing against it.

    The only thing that will happen is that your compile times will increase -- unless, of course, you create a precompiled header out of that standard_library.h, in which case the impact will be minimal.

    Note that most people prefer to minimize their header dependencies anyway. This applies mostly to your own header files: a small change in an unused but included header can trigger an unnecessary recompile of the source file for no reason whatsoever, slowing down incremental builds a lot.

  • 2021-01-03 19:49

    Oh! I know a good one.

    I have a proprietary library for creating nice zip archive files out of in-memory data. It was designed to be multiplatform, but apparently not tested well enough on every platform, Windows included.

    It works great on Linux and other POSIX systems, but as I tried to adopt it in my project, I stumbled upon this: How to suppress a #define locally?

    Both the library and winbase.h (included via the ubiquitous windows.h) have a CreateFile entity. And since in winbase.h it's just a macro, the compiler doesn't see any problem -- until you actually try to use CreateFile in your code.
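
    The mechanism can be reproduced without windows.h itself: winbase.h effectively does #define CreateFile CreateFileA (or CreateFileW in a Unicode build), so the preprocessor silently rewrites any identifier spelled CreateFile before the compiler ever sees it. A self-contained sketch:

```cpp
#include <cstring>

// Simulate what <winbase.h> does in an ANSI build:
#define CreateFile CreateFileA

// Stringify through a second macro so the argument is macro-expanded first:
#define STR_(x) #x
#define STR(x) STR_(x)

// Any identifier spelled CreateFile -- even a method of your own,
// completely unrelated class -- has already been rewritten here:
const char* spelled = STR(CreateFile);  // "CreateFileA"
```

    This is why the clash only surfaces when you use the name: the macro rewrites your library's CreateFile into a call to a function that does not exist in the library.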

    So yes, keeping your namespace clean might be a good idea.
