Does using large libraries inherently make slower code?

情深已故 2021-02-06 20:40

I have a psychological tic which makes me reluctant to use large libraries (like GLib or Boost) in lower-level languages like C and C++. In my mind, I think:

17 Answers
  •  粉色の甜心
    2021-02-06 21:37

    Technically, the answer is that yes, they do. However, these inefficiencies are very seldom practically important. I'm going to assume a statically compiled language like C, C++, or D here.

    When an executable is loaded on a modern OS, its pages are simply mapped into the process's address space. This means that, no matter how big the executable is, entire page-sized blocks of code that are never executed will never touch physical memory. You will waste address space, though, and occasionally that can matter a little on 32-bit systems.
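    To make the demand-paging point concrete, here is a minimal, Linux-specific sketch (the mmap flags and the suggestion to watch /proc/self/status are assumptions about a glibc/Linux environment): reserving a large block of address space costs essentially no physical memory until individual pages are touched, which is the same mechanism that keeps unused pages of a large executable out of RAM.

        /* Illustrative only: address space is cheap, physical memory is paid
         * for on first touch.  A file-backed executable mapping behaves the
         * same way under demand paging. */
        #define _DEFAULT_SOURCE
        #include <stdio.h>
        #include <string.h>
        #include <sys/mman.h>

        int main(void) {
            size_t len = (size_t)1 << 30;        /* reserve 1 GiB of address space */
            char *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                           MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
            if (p == MAP_FAILED) { perror("mmap"); return 1; }

            /* VSZ has grown by 1 GiB, but RSS has barely moved: no physical
             * pages back the mapping yet. */
            memset(p, 0xAB, 4096);               /* touch one page -> one physical page */

            printf("compare VmSize vs VmRSS in /proc/self/status\n");
            getchar();                           /* pause so the numbers can be inspected */
            munmap(p, len);
            return 0;
        }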

    When you link against a library, a good linker will usually discard the object code you never reference, though this doesn't always happen, especially with template instantiations. Thus your binaries might be a little bigger than strictly necessary.
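    As a sketch of that dead-stripping, the flags below are GCC/GNU-ld specific (-ffunction-sections plus -Wl,--gc-sections); other toolchains have their own equivalents. A helper that is never called can be dropped at link time:

        /* In a real project these would be separate files (lib.c and main.c);
         * they are shown together here as one compilable sketch.
         *
         *   gcc -O2 -ffunction-sections -fdata-sections -c lib.c main.c
         *   gcc -Wl,--gc-sections -o app main.o lib.o
         *
         * Compare the binary size (or the output of `nm app`) with and
         * without -Wl,--gc-sections: unused_helper disappears. */

        /* lib.c */
        int used_helper(int x)   { return x * 2; }
        int unused_helper(int x) { return x * 3; }   /* never referenced */

        /* main.c */
        int used_helper(int x);
        int main(void) { return used_helper(21); }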

    If code you don't use is heavily interleaved with code you do use, you can end up wasting space in your CPU cache. However, since cache lines are small (usually 64 bytes), this will seldom matter to a practically important extent.
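    One way to keep rarely executed code from being interleaved with the hot path is to mark it explicitly. The sketch below assumes GCC or Clang, whose non-standard hot/cold function attributes let the compiler group cold functions away from hot code (typically into a .text.unlikely section), so they don't share cache lines or pages with the code you actually run:

        #include <stdio.h>
        #include <stdlib.h>

        /* Rarely executed: optimized for size and placed with other cold code. */
        __attribute__((cold, noinline))
        static void report_fatal(const char *msg) {
            fprintf(stderr, "fatal: %s\n", msg);
            abort();
        }

        /* The hot path stays compact because the error handling lives elsewhere. */
        __attribute__((hot))
        static int process(int value) {
            if (value < 0)
                report_fatal("negative input");
            return value * 2;
        }

        int main(void) { return process(21) == 42 ? 0 : 1; }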
