The standardization of the language in the late 1990s was the first step. It allowed compiler makers to focus on the "standard" set of features, and it also let the language fix some of the rough edges that surfaced through the standardization process.
This in turn allowed frameworks to be developed on top of the standard features of the language rather than on features provided by a particular compiler implementation. The Boost libraries are notable in this regard. It also meant that new development could build on previous work, making solutions to more complex problems feasible.
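As a small illustration of building on standard features only, here is a sketch using boost::shared_ptr, which is implemented in portable standard C++ and therefore behaves the same on any conforming compiler:

```cpp
#include <iostream>
#include <boost/shared_ptr.hpp>

int main() {
    // shared_ptr is written purely in standard C++, so it needs no
    // vendor-specific extensions to work across compilers.
    boost::shared_ptr<int> p(new int(42));
    boost::shared_ptr<int> q = p;  // reference count is now 2

    std::cout << *q << std::endl;  // prints 42
    return 0;
}  // last owner goes out of scope here; the int is deleted automatically
```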
A notable change here is that frameworks used to be built around the notion of base classes and derived classes (a run-time feature), whereas most advanced features now lean heavily on "recursive" templates (a compile-time feature).
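To make the contrast concrete, here is a minimal sketch of a "recursive" template: the factorial is computed entirely by the compiler through template instantiation, with an explicit specialization serving as the base case.

```cpp
#include <iostream>

// Each instantiation refers to the instantiation for N - 1,
// "recursing" at compile time until the specialization for 0 stops it.
template <unsigned N>
struct Factorial {
    static const unsigned long value = N * Factorial<N - 1>::value;
};

// Base case: ends the compile-time recursion.
template <>
struct Factorial<0> {
    static const unsigned long value = 1;
};

int main() {
    // The result is computed by the compiler, not at run time.
    std::cout << Factorial<10>::value << std::endl;  // prints 3628800
    return 0;
}
```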
The STL has its pros and cons, but it has survived the test of time; if you want something simple that works, the STL surely has something to help you get started. There is no point in reinventing the wheel (except for didactic reasons).
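For instance, a growable array and a tuned sort come ready out of the box, no hand-rolled containers or sorting routines required:

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    // std::vector manages its own memory and grows on demand.
    std::vector<int> values;
    values.push_back(42);
    values.push_back(7);
    values.push_back(19);

    // std::sort works on any random-access range.
    std::sort(values.begin(), values.end());

    for (std::vector<int>::const_iterator it = values.begin();
         it != values.end(); ++it) {
        std::cout << *it << '\n';  // prints 7, 19, 42
    }
    return 0;
}
```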
Computer hardware has also made great leaps since the 1990s, so memory and CPU are no longer a constraint for the compiler. Most of the theoretical optimizations from the books are now possible in practice.
The next step for the language is support for multi-core programming, which is part of the C++0x standardization effort.
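A minimal sketch of what that threading support looks like, assuming the draft std::thread interface ships as proposed (the worker function is just illustrative; you need a compiler and library that already provide the draft &lt;thread&gt; header):

```cpp
#include <iostream>
#include <thread>

// Illustrative worker; each thread runs this function independently.
void worker(int id) {
    std::cout << "hello from thread " << id << std::endl;
}

int main() {
    // Launch two threads that run in parallel with main().
    std::thread t1(worker, 1);
    std::thread t2(worker, 2);

    // Wait for both to finish before exiting.
    t1.join();
    t2.join();
    return 0;
}
```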