Premature optimization and Premature pessimization related to C++ coding standards

夕颜 2021-01-12 04:12

Herb Sutter's C++ Coding Standards says to avoid premature optimization and premature pessimization. But I feel both are doing the same thing.

5 Answers
  • 2021-01-12 04:41

    What Herb means is that, when you are faced with two equally readable options, always choose the more efficient one.

    Using std::vector::reserve() or the most appropriate standard container or algorithm is not premature optimization. However, not using them would be premature pessimization.
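    For instance, a minimal sketch of the reserve() point (the function and element count are illustrative, not from the answer): reserving up front is just as readable as not doing so, and avoids repeated reallocation as the vector grows.

    ```cpp
    #include <cstddef>
    #include <vector>

    std::vector<int> make_squares(std::size_t n) {
        std::vector<int> v;
        v.reserve(n);                           // one allocation up front instead of repeated growth
        for (std::size_t i = 0; i < n; ++i)
            v.push_back(static_cast<int>(i * i));
        return v;
    }
    ```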

    Premature optimization is when you sacrifice readability for the sake of some "optimization" that might not even be worth it. Use a profiler to find out where optimization actually pays off.

  • 2021-01-12 04:51

    I'd tend to think that premature pessimization is simply the misinterpretation of performance requirements that leads to premature optimization, i.e. you incorrectly assume your code will not run fast enough or will use too many resources (pessimism), so you optimize where it is not necessary.

    With the advent of ever larger datasets, I tend to see the reverse more often: a lack of sufficient pessimism leading to the selection of algorithms that will not scale to meet user requirements. This is often coupled with the belief that compiler optimization is some kind of substitute for poor algorithm selection.
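    As an illustration of that kind of scaling failure (the duplicate-check task and names are my own, not from the answer): a linear scan inside a loop is quadratic, a hash set keeps it roughly linear, and no amount of compiler optimization closes that gap on large inputs.

    ```cpp
    #include <string>
    #include <unordered_set>
    #include <vector>

    // Quadratic: each element re-scans everything seen so far.
    bool has_duplicates_slow(const std::vector<std::string>& items) {
        std::vector<std::string> seen;
        for (const auto& s : items) {
            for (const auto& t : seen)          // O(n) scan per element -> O(n^2) overall
                if (t == s) return true;
            seen.push_back(s);
        }
        return false;
    }

    // Roughly linear: hashed lookups.
    bool has_duplicates_fast(const std::vector<std::string>& items) {
        std::unordered_set<std::string> seen;
        for (const auto& s : items)
            if (!seen.insert(s).second)         // insert fails -> already present
                return true;
        return false;
    }
    ```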

  • 2021-01-12 05:00

    What he means by premature pessimisation, I think, is just the opposite of premature optimisation: a fundamental disregard of which data structures and algorithms to use.

    Premature optimisation is often concerned with minute details of algorithms that can well be tweaked later and don’t need to be paid attention to at the beginning.

    Premature pessimisation, by contrast, concerns the high-level design of code architecture: a fundamentally inefficient interface for your library, for instance, cannot be fixed later by optimising, since the public interface is pretty much cast in stone.
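    A hypothetical sketch of such an interface-level pessimisation (the class and member names are invented for illustration): an accessor that returns its container by value forces a copy on every call, and once clients rely on that signature, no later optimisation of the implementation can remove the copy.

    ```cpp
    #include <string>
    #include <vector>

    class AddressBook {
    public:
        // Pessimised interface: every call copies the whole vector,
        // and that cost is now part of the public contract.
        std::vector<std::string> entries_by_value() const { return entries_; }

        // Cheaper interface: callers can read without copying.
        const std::vector<std::string>& entries() const { return entries_; }

    private:
        std::vector<std::string> entries_;
    };
    ```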

  • 2021-01-12 05:02

    Defining pass-by-value parameters when pass-by-reference is appropriate

    is one of the simplest premature pessimizations to avoid: passing by (const) reference instead costs nothing, quickly becomes second nature, and can save you some real performance pitfalls (see the sketch below).
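    A minimal sketch of that guideline (function names are illustrative, not from the book):

    ```cpp
    #include <cstddef>
    #include <string>

    // Premature pessimization: copies the string on every call.
    std::size_t count_spaces_copy(std::string s);

    // Costs nothing extra to write and avoids the copy.
    std::size_t count_spaces(const std::string& s) {
        std::size_t n = 0;
        for (char c : s)
            if (c == ' ') ++n;
        return n;
    }
    ```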

    Assuming you are referring to C++ Coding Standards: 101 Rules, Guidelines, and Best Practices (October 2004, ISBN 0321113586), items 9 and 25 give a few examples:


    9. Don't pessimize prematurely

    25. Take parameters appropriately by value, (smart) pointer, or reference

  • 2021-01-12 05:06

    There are both small and large scale choices to be made when programming.

    Pessimisation is when you write code in a way that "prevents the compiler from doing a good job". A typical example is not placing a function where it can be inlined, even though the function is REALLY small and simple (a getter or setter, for example). This can make the function take 10x the time it should, and it is such a simple thing to get right.
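    A small sketch of the inlining point (names are illustrative; actual codegen depends on the compiler and on link-time optimisation):

    ```cpp
    // widget.h
    class Widget {
    public:
        int id() const { return id_; }      // defined in the header: trivially inlinable
        void set_id(int id) { id_ = id; }
    private:
        int id_ = 0;
    };

    // If instead only `int id() const;` were declared here and the one-line body
    // lived in widget.cpp, every call would (without LTO) be an out-of-line call
    // just to perform a single load.
    ```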

    A pessimisation that I've found a few times on this site is to use "a /= 2;" when "a >>= 1;" is equally suitable. If we know that a is not negative, then shifting right and dividing have the same effect, but even when the compiler is optimising the divide, it nearly always produces more code to cope with the "it may be negative" situation, and that extra code can be a real performance hit in some cases.
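    A minimal sketch of that point (this is what typical optimising compilers do; check your own compiler's output to confirm):

    ```cpp
    // For signed int the compiler must add a fix-up so that, e.g., -7 / 2 == -3
    // (truncation toward zero); a bare arithmetic shift would give -4.
    int half_div(int a)   { return a / 2; }

    // A single shift instruction, but only correct if a is known to be non-negative.
    int half_shift(int a) { return a >> 1; }

    // Making the "cannot be negative" knowledge part of the type removes the
    // problem: for unsigned, a / 2 and a >> 1 compile to the same single shift.
    unsigned half_unsigned(unsigned a) { return a / 2; }
    ```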

    Premature optimisation is when you unroll loops or otherwise make the code more complicated simply because you don't trust the compiler to do a good job - typically with no evidence that it won't do a good job.

    Another example would be not using std::vector, but writing your own expandable array because "vector is too slow", without even having measured the code that uses std::vector.
