I thought the book was a good read back in the late 1990s. It was out of date even then; now it's about as relevant as a telephone book from the 1950s.
I worked with Lakos for several years while he tried to implement these ideas. I also built my own modern C++ project of about 2000 source files-- large scale enough-- and I've built a few more since. Build times are easily reduced by parallel distributed builds, using programs like icecream. Every compiler will cache opened headers, so unless you do something really outrageous, any modern build tool like scons will scale to thousands of translation units without any special effort; a clean build, including tests, takes a couple of minutes.
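(For anyone who hasn't seen the header-caching part in action: it depends on nothing more exotic than ordinary include guards. Compilers such as GCC recognize the guard pattern and skip re-reading the file when it is included again-- GCC documents this as its multiple-include optimization. A minimal sketch, with a file name of my choosing:

    // point.h -- the include guard lets the compiler skip
    // re-reading this file on repeated inclusion within a
    // translation unit (GCC's "multiple include optimization").
    #ifndef POINT_H
    #define POINT_H

    struct Point {
        double x;
        double y;
    };

    #endif // POINT_H

No heroics required; the tooling has handled this for decades.)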
Limiting your library's interface to a few "vocabulary types" (not in the OO sense; he means using only int, float, string, char, and a couple of others I don't recall) is extremely poor advice for scaling anything. You want the compiler to catch type mismatches at build time; flattening everything to primitives pushes those errors to runtime, just to make it easier to write a makefile.
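To make that concrete (my example, not the book's): a dedicated type turns a unit mix-up into a compile error, where an interface restricted to raw doubles would silently accept the wrong argument:

    #include <iostream>

    // Hypothetical strong types; an interface taking two raw
    // doubles would accept the arguments in either order.
    struct Meters  { double value; };
    struct Seconds { double value; };

    double speed(Meters distance, Seconds time) {
        return distance.value / time.value;
    }

    int main() {
        std::cout << speed(Meters{100.0}, Seconds{9.58}) << '\n';
        // speed(Seconds{9.58}, Meters{100.0});  // rejected at build time
    }

That rejected call is exactly the class of bug you want the build to find for you.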
And that's largely the gist of the book-- how do I write C++ so that it works with the tools of 1993? On top of that, the Mentor Graphics project the book describes was by all accounts a disaster; even the book itself alludes to it.
Large-scale C++ is delivered in source-code form, and those source and header files are what the book means by "physical dependencies"-- not link libraries. The book never mentions other, more important dependencies like databases, sockets, or processor architecture.
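For readers who haven't met the term: the book's "physical" view is about which headers #include which, and its main remedy is the forward declaration, which removes a compile-time dependency. A minimal sketch, with file and class names of my own invention:

    // widget.h -- a forward declaration suffices for a pointer
    // member, so clients of widget.h no longer pull in
    // display.h transitively.
    #ifndef WIDGET_H
    #define WIDGET_H

    class Display;  // instead of #include "display.h"

    class Widget {
      public:
        explicit Widget(Display* display);
        void draw();

      private:
        Display* d_display;  // needs only the declaration
    };

    #endif // WIDGET_H

widget.cpp still includes display.h, but the dependency stays out of every translation unit that merely uses Widget. Sensible enough-- it just isn't the scaling bottleneck it was in 1993.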
The book is full of ideas that sound good, but there are much better ways to scale up in practice. If you want to study large C++ libraries, take a look at Boost or Xerces and see what they did. Their authors really are implementing large projects, not writing about them.