This advice is some ten years old, and somewhat dated by now. The speed of computers is up hundreds of times, storage went from gigabytes to terabytes, and insane amounts of memory sit around idle.
So advice from the past may degrade. You're on the right track asking for the reasons, and on an even better track if you eventually run some experiments and form your own opinion.
Herb's item is way more general than your question. C++ (somewhat unfortunately) uses a file (translation unit) model for compilation -- not a source code repository/database. (Those who tried IBM's Visual Age C++ know how cool that was. ;) One consequence is that you pack a lot of stuff together: include files are not one-liners.
So when you need to include a header to get a single declaration of something, you happen to drag in a lot of other things.
And those other things, just to compile, may drag in yet more stuff, and so on recursively. So if you can avoid an inclusion, it saves not one line but thousands, and not one file but maybe a few dozen. A good economy. Also, a rebuild is needed whenever any of those files changes, even though the changes are most likely irrelevant to your stuff.
Say your header uses pointers to 10 different classes, and you include all 10 headers defining them instead of just prefixing each use with 'class'. Any client that uses only a few of them still gets all ten dragged in as dependencies. Not economic. In a project I worked on a few years ago, gtk++ was used. The .cpp files had just a few hundred lines, but the preprocessor output was 800k to over a million lines. No kidding.
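A minimal sketch of that economy (all names here are hypothetical, just for illustration):

```cpp
// widget.h -- uses other classes only through pointers and references,
// so one-line declarations are enough; no full headers needed here.
class Texture;   // instead of #include "texture.h"
class Renderer;  // instead of #include "renderer.h"

class Widget {
public:
    void draw(Renderer& r) const;  // reference parameter: declaration suffices
private:
    Texture* skin_;                // pointer member: declaration suffices
};
```

Only widget.cpp, which actually calls into those classes, has to include the real headers; every client of widget.h is spared them.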
Though you pay a price in a small redundancy: the thing may be a class today but something else tomorrow (say, a typedef to a template). The _fwd.h idea mitigates that, but it really just centralizes the redundancy. In practice we seek some balance in the tradeoffs.
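The standard library itself took that route with &lt;iosfwd&gt;. A hypothetical project version might look like:

```cpp
// mylib_fwd.h -- the one place that knows what each name really is.
namespace mylib {
    template<class T> class Handle;       // a class template today...
    class Widget;
    typedef Handle<Widget> WidgetHandle;  // ...published under a stable name
}
```

If Handle changes shape later, only this file is touched; clients that included mylib_fwd.h instead of writing their own 'class ...' lines keep compiling.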
But none of these ideas apply to things that are "stable" and ubiquitously used. In projects that put std:: to heavy and natural use you can see &lt;vector&gt; and many other headers included in every single source file. Because they are used. And if some refactoring removed the last vector today, it would likely grow back tomorrow.
Here the choice is really just about where the inclusion happens, and the economy works the other way around: setting up a "common" header used by everything removes a lot of noise from the other files. Especially if the system has support for that case. In VS you have
- precompiled headers, which let the common material be compiled once and the result shared with the other TUs
- forced include, which lets the common header be specified in the project rather than in every source file
- property sheets, which you include in the project file instead of applying those settings manually
With such support it may be perfectly feasible to put many, even most, of the standard headers in that common file, along with some using declarations for std::vector and other common names.
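A sketch of such a common header -- which headers and names belong in it is of course a per-project judgment, not a recipe:

```cpp
// common.h -- compiled once as the precompiled header and/or force-included
// into every TU (in VS: the /FI option), so no source file has to mention it.
#pragma once
#include <algorithm>
#include <map>
#include <memory>
#include <string>
#include <vector>

using std::string;   // the "common names" pulled into scope for everyone
using std::vector;
```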
Those who say you drag in many names that may cause conflicts are right -- but at the same time they are wrong in practice, because eventually someone will include that extra header anyway, and if the conflict exists it will topple the boat. Unless using std:: stuff is forbidden in a project, I say it is simply bad practice to reuse its common names for a different purpose. If someone wants to check in code with his own class string, claiming it is certainly distinct from std::string thanks to the prefix, I say 'over my dead body'. For rare names, though, sorting out an accidental clash is no big deal.
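To make that concrete, a contrived sketch of how the boat topples:

```cpp
#include <string>    // arrives indirectly, sooner or later
using std::string;   // say, via the project-wide common header

class string { };    // "my own string, clearly distinct from std::string"
                     // ill-formed: conflicts with the using declaration above
```

The class may even build for a while in TUs that never see both declarations at once; it collapses the day they meet.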
And what counts as a good balance differs between projects, and changes even within a project as time passes.