Performance testing best practices when doing TDD?

轻奢々 2021-02-04 12:30

I'm working on a project which is in serious need of some performance tuning.

How do I write a test that fails if my optimizations do not improve the speed of the program?

9 Answers
  •  死守一世寂寞
    2021-02-04 13:33

    In many server applications (this might not be your case) performance problems manifest only under concurrent access and under load. Measuring the absolute time a single routine takes and trying to improve it is therefore not very helpful. This method has problems even in single-threaded applications: measuring absolute routine time relies on the clock the platform provides, and those clocks are not always very precise. You are better off relying on the average time a routine takes over many runs.
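    To illustrate the "average over many runs" idea, here is a minimal sketch (not from the original answer); `someRoutine()` is a hypothetical stand-in for whatever code you want to measure, and the run count is an arbitrary assumption:

    ```java
    public class AverageTiming {
        // Hypothetical stand-in for the code under measurement.
        static void someRoutine() {
            Math.sqrt(12345.678); // placeholder work
        }

        public static void main(String[] args) {
            final int runs = 10_000; // assumed; pick enough runs to smooth out clock jitter
            long total = 0;
            for (int i = 0; i < runs; i++) {
                long start = System.nanoTime();
                someRoutine();
                total += System.nanoTime() - start;
            }
            System.out.printf("average: %d ns over %d runs%n", total / runs, runs);
        }
    }
    ```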

    My advice is:

    • Use profiling to identify the routines that execute most often and take the most time.
    • Use a tool like JMeter or Grinder to build representative test cases, simulate concurrent access, put your application under stress, and measure (most importantly) throughput and average response time. This will give you a better idea of how your application behaves as seen from the outside; a minimal hand-rolled sketch of this kind of measurement follows this list.
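    The sketch below is an assumption-laden illustration, not JMeter or Grinder itself: `handleRequest()` is a hypothetical stand-in for one request to the system under test, and the thread and request counts are made up. It shows the two numbers the answer recommends watching, throughput and average response time, under concurrent load:

    ```java
    import java.util.concurrent.*;
    import java.util.concurrent.atomic.AtomicLong;

    public class LoadSketch {
        // Hypothetical stand-in for one request to the system under test.
        static void handleRequest() throws Exception {
            Thread.sleep(5); // placeholder work
        }

        public static void main(String[] args) throws Exception {
            final int threads = 50, requests = 2_000; // assumed load parameters
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            AtomicLong totalLatency = new AtomicLong();
            CountDownLatch done = new CountDownLatch(requests);
            long wallStart = System.nanoTime();
            for (int i = 0; i < requests; i++) {
                pool.submit(() -> {
                    long start = System.nanoTime();
                    try {
                        handleRequest();
                    } catch (Exception ignored) {
                        // a real harness would record failures separately
                    } finally {
                        totalLatency.addAndGet(System.nanoTime() - start);
                        done.countDown();
                    }
                });
            }
            done.await();
            pool.shutdown();
            double wallSeconds = (System.nanoTime() - wallStart) / 1e9;
            System.out.printf("throughput: %.0f req/s, avg response time: %.1f ms%n",
                    requests / wallSeconds,
                    totalLatency.get() / 1e6 / requests);
        }
    }
    ```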

    While you could use unit tests to pin down some non-functional aspects of your application, I think the approach above will give better results during the optimization process. When you place time-related assertions in your unit tests, you have to pick fairly arbitrary thresholds: execution time varies with the environment the tests run in, and you don't want tests to fail just because some of your colleagues are using slower hardware.
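    For example, here is what such a brittle assertion might look like with JUnit 5's `assertTimeout`; the 200 ms budget and the `search()` routine are assumptions for illustration, not values from the original question:

    ```java
    import static org.junit.jupiter.api.Assertions.assertTimeout;

    import java.time.Duration;
    import org.junit.jupiter.api.Test;

    class SearchPerformanceTest {

        // Hypothetical stand-in for the routine being optimized.
        private void search() {
            // ... code under test ...
        }

        // The 200 ms budget is arbitrary: on a slower CI box or a colleague's
        // laptop this test can fail even though the code did not regress.
        @Test
        void searchCompletesWithinBudget() {
            assertTimeout(Duration.ofMillis(200), this::search);
        }
    }
    ```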

    Tuning is all about finding the right things to tune. You already have functioning code, so adding performance-related assertions after the fact, without first identifying the critical sections of the code, may lead you to waste a lot of time optimizing non-essential parts of your application.
