Does anyone ever use stopwatch benchmarking, or should a performance tool always be used? Are there any good free tools available for Java? What tools do you use?
I ran a program today that searched through and collected information from a bunch of dBase files; it took just over an hour to run. I took a look at the code, made an educated guess at what the bottleneck was, made a minor improvement to the algorithm, and re-ran the program. This time it completed in 2.5 minutes.
I didn't need any fancy profiling tools or benchmark suites to tell me the new version was a significant improvement. If I had needed to optimize the running time further, I probably would have done some more sophisticated analysis, but that wasn't necessary. I find that this sort of "stopwatch benchmarking" is an acceptable solution in quite a number of cases, and resorting to more advanced tools would actually be more time-consuming.
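For what it's worth, the "stopwatch" was nothing more sophisticated than wall-clock timing around the whole run, something along these lines (processFiles() is just a stand-in here for the actual dBase processing, not the real method name):

    public class StopwatchTimer {
        public static void main(String[] args) {
            long start = System.currentTimeMillis();
            processFiles();   // stand-in for the real work being measured
            long elapsedMs = System.currentTimeMillis() - start;
            System.out.printf("Run took %.1f minutes%n", elapsedMs / 60_000.0);
        }

        // Placeholder; replace with the code under test.
        private static void processFiles() {
        }
    }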
After all, it's probably the second most popular form of benchmarking, right after "no-watch benchmarking" - where we say "this activity seems slow, that one seems fast."
Usually what's most important to optimize is whatever interferes with the user experience - which is most often a function of how frequently you perform the action, and whatever else is going on at the same time. Other forms of benchmarking often just help zero in on these.
I don't think stopwatch benchmarking is too horrible, but if you can get onto a Solaris or OS X machine you should check out DTrace. I've used it to get some great information about timing in my applications.
A profiler gives you more detailed information, which can help to diagnose and fix performance problems.
In terms of actual measurement, stopwatch time is what users notice, so if you want to validate that things are within acceptable limits, stopwatch time is fine.
When you want to actually fix problems, however, a profiler can be really helpful.
Stopwatch is actually the best benchmark!
The real end-to-end user response time is the time that actually matters.
It is not always possible to obtain this time with the available tools. For instance, most testing tools do not include the time it takes for a browser to render a page, so an over-complex page with badly written CSS will show sub-second response times to the testing tools but a five-second-plus response time to the user.
The tools are great for automated testing and for problem determination, but don't lose sight of what you really want to measure.
You need to test a realistic number of iterations, as you will get different answers depending on how you time things. If an operation is only ever performed once, taking the average of many iterations could be misleading. If you want to know the time it takes after the JVM has warmed up, you might first run many (e.g. 10,000) iterations that are not included in the timings.
I also suggest you use System.nanoTime(), as it's much more accurate. If the operation you're timing takes around 10 microseconds or less, you don't want to call it too often or the timing calls themselves can change your result. (For example, if I'm testing for about 5 seconds and I know an iteration is very quick, I only check nanoTime() every 1,000 iterations to see whether the time is up.)
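Putting those two points together, here is a rough sketch of the kind of harness I mean (doSomething() is a placeholder for the operation under test, and the warm-up count and check interval are just plausible values, not magic numbers):

    public class MicroBenchmark {
        private static double sink;   // consume the result so the JIT can't discard the work

        public static void main(String[] args) {
            // Warm-up: let the JIT compile the hot path; these iterations are not timed.
            for (int i = 0; i < 10_000; i++) {
                doSomething();
            }

            // Timed run: use nanoTime(), but only check the clock every 1,000 iterations
            // so the timing calls themselves don't distort a very short operation.
            final long runNanos = 5_000_000_000L;   // run for roughly 5 seconds
            long start = System.nanoTime();
            long iterations = 0;
            long elapsed;
            do {
                for (int i = 0; i < 1_000; i++) {
                    doSomething();
                }
                iterations += 1_000;
                elapsed = System.nanoTime() - start;
            } while (elapsed < runNanos);

            System.out.printf("%,d iterations, average %.1f ns each%n",
                    iterations, (double) elapsed / iterations);
        }

        // Placeholder for the operation under test.
        private static void doSomething() {
            sink += Math.sqrt(12345.6789);
        }
    }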