Herb Sutter's C++ Coding Standards says to avoid premature optimization and premature pessimization. But I feel both amount to the same thing.
I'd tend to think that premature pessimization is simply the misinterpretation of performance requirements that leads to premature optimization. That is, you incorrectly assume your code will not perform fast enough or will use too many resources (pessimism), so you optimize where it is not necessary.
With the advent of ever larger datasets, I tend to see the reverse more often: a lack of sufficient pessimism leading to the selection of algorithms that will not scale to meet user requirements. This is often coupled with the belief that compiler optimization is some kind of substitute for poor algorithm selection.