Herb Sutter's C++ coding standards says to avoid Premature optimization and Premature pessimization. But I feel both are doing the same thing.
There are both small-scale and large-scale choices to be made when programming.
Pessimisation is when you write code in a way that "prevents the compiler from doing a good job". A typical example is not placing functions where they can be inlined, when the function is REALLY small and simple (a {s,g}etter, for example). This can make the function take 10x the time it should take, and it's such a simple thing to "get right".
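As a rough sketch (the class and file names here are invented for illustration), defining a trivial getter out-of-line in a .cpp file hides its body from callers in other translation units, so without link-time optimisation every call pays full function-call overhead:

```cpp
// --- widget.h (hypothetical) ---
class Widget {
public:
    int id() const;                       // defined out-of-line in widget.cpp; callers in
                                          // other translation units can't inline it (no LTO)
    int idInline() const { return id_; }  // body visible to every caller; trivially inlined
private:
    int id_ = 0;
};

// --- widget.cpp ---
// #include "widget.h"
int Widget::id() const { return id_; }    // a full call + return just to load one int
```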
A pessimisation that I've found a few times on this site is to use "a /= 2;" when "a >>= 1;" is equally suitable. If we know that a is not negative, then shift right and divide have the same effect, but even when the compiler is optimising the divide, it nearly always produces more code to cope with the "it may be negative" situation - and that extra code can be a real performance hit in some cases.
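To see why, compare these functions (a minimal sketch; compile with optimisations on and inspect the generated code, e.g. with a tool like godbolt.org):

```cpp
// Signed division must round toward zero, so for "a / 2" the compiler
// typically emits extra instructions to fix up negative values of a.
int div_signed(int a)   { return a / 2; }

// A single arithmetic shift - but it only matches "a / 2" when a >= 0,
// because the shift rounds toward negative infinity instead.
int shift_signed(int a) { return a >> 1; }

// Making the non-negativity explicit in the type lets the compiler
// generate a single shift for the division as well.
unsigned div_unsigned(unsigned a) { return a / 2; }
```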
Premature optimisation is when you unroll loops or otherwise make the code more complicated simply because you don't trust the compiler to do a good job - typically with no evidence that it won't.
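As a contrived sketch of that kind of unrolling (function names are made up), both versions below compute the same sum; the second is harder to read and, without measurements, there is no reason to believe it beats what the optimiser already does with the first:

```cpp
#include <cstddef>

// The straightforward version - modern compilers will happily
// vectorise and unroll this themselves.
long sum(const int* data, std::size_t n) {
    long total = 0;
    for (std::size_t i = 0; i < n; ++i)
        total += data[i];
    return total;
}

// The hand-"optimised" version - more code, easy to get wrong
// at the edges, and unproven to be any faster.
long sum_unrolled(const int* data, std::size_t n) {
    long total = 0;
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4)
        total += data[i] + data[i + 1] + data[i + 2] + data[i + 3];
    for (; i < n; ++i)   // handle the leftover elements
        total += data[i];
    return total;
}
```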
Another example would be not using std::vector but rolling your own expandable array because "vector is too slow", without even having tested the code using std::vector.
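The honest first step there is to measure; a minimal timing sketch (the workload size is arbitrary, and this times just one std::vector operation) might look like:

```cpp
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    constexpr int kCount = 10'000'000;  // arbitrary workload for illustration

    auto start = std::chrono::steady_clock::now();

    std::vector<int> v;
    v.reserve(kCount);                  // avoid timing reallocation unless that's the question
    for (int i = 0; i < kCount; ++i)
        v.push_back(i);

    auto elapsed = std::chrono::steady_clock::now() - start;
    std::printf("push_back of %d ints: %lld ms\n", kCount,
                static_cast<long long>(
                    std::chrono::duration_cast<std::chrono::milliseconds>(elapsed).count()));
}
```

Only if numbers like these show the container actually dominating a real profile is replacing std::vector worth the extra complexity.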