"Healthy Noise": Introducing Automated Performance Testing

Saturday, May 10, 2008

Starting with the current release cycle I have introduced automated performance testing for my project team. Not that we didn't test performance in the past, or didn't assess performance-related items during a release cycle. But the fact that it is now part of the automated development environment creates a number of interesting collateral effects. I'd like to highlight a few.
Reduced Latency
First of all there is the performance testing itself. I strongly recommend not waiting until the last few weeks or days before a release. That may be too late: if you discover a performance-related issue at that point, you may be forced to take shortcuts so that your product still meets its performance requirements. And indeed, in the past we did assess the performance of the product throughout the release cycle.
What is different, then? In the past we had to get performance engineers to set up the tests, maintain test scripts, run the tests, analyze the results, identify root causes, and suggest solutions. Now the tests are written and maintained by the developers, integrated into the test suite, and executed immediately after integration. There is no need to wait until a time slot with a performance engineer becomes available; the performance engineers' time is freed up, and they are available to consult with the developers instead. The feedback loop is much shorter: a few hours instead of a few days or weeks. This also helps reduce risk, since stories that may have a performance impact can be played earlier.
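To make this concrete, here is a minimal sketch of what such a developer-written performance test might look like. It assumes JUnit 4; the sorting operation is just a placeholder for the real component under test, and the budget and run counts are made-up illustrative values, not our actual numbers.

    import static org.junit.Assert.assertTrue;

    import java.util.Arrays;
    import java.util.Random;

    import org.junit.Test;

    // A sketch of a developer-written performance test that runs as part of
    // the ordinary automated test suite. The measured operation and all
    // numbers are illustrative placeholders.
    public class SortPerformanceTest {

        private static final int WARMUP_RUNS = 5;      // let the JIT warm up first
        private static final int MEASURED_RUNS = 20;   // average over several runs to reduce noise
        private static final long BUDGET_MILLIS = 200; // the agreed performance budget

        // Stand-in for the real operation under test.
        private void operationUnderTest() {
            int[] data = new Random(42).ints(1_000_000).toArray();
            Arrays.sort(data);
        }

        @Test
        public void staysWithinPerformanceBudget() {
            for (int i = 0; i < WARMUP_RUNS; i++) {
                operationUnderTest();
            }
            long start = System.nanoTime();
            for (int i = 0; i < MEASURED_RUNS; i++) {
                operationUnderTest();
            }
            long avgMillis = (System.nanoTime() - start) / MEASURED_RUNS / 1_000_000L;

            assertTrue("average run took " + avgMillis + " ms, budget is " + BUDGET_MILLIS + " ms",
                    avgMillis <= BUDGET_MILLIS);
        }
    }

Because it is just another test, it runs on every integration, and a change that blows the budget fails the build right away.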
General Benefits of Automation
You also get the obvious benefits of automation, such as repeatability, lower cost, and higher quality. Not that the performance engineers made mistakes deliberately, but as humans we are not fail-safe, whether we like it or not. So the major drivers for the automation were cost, time, and quality.
Impact on Behavior
But then there are less obvious effects caused by the introduction of automated performance testing. For instance, I am observing that the thinking of the entire team is influenced by it. The performance aspect has been promoted and is now playing a much more important role in the considerations of the cross-functional teams. Performance is built in instead of bolted on. Should a particular design or implementation cause a performance issue, it can be dealt with immediately. Bad practices no longer proliferate.
With our customers I am also observing an improved willingness to consider non-functional aspects such as performance when deciding on backlog priorities.
And although it was not the main driver, and I hadn't thought of this aspect at all, there is also the impact on the developers. I sense that improving the development environment with automated testing has a positive effect on the morale of the engineering team. We continue to improve the development environment, giving everybody an opportunity to increase velocity while improving quality at the same time.
"Healthy Noise"
Certainly this change wasn't painless. The development teams had to negotiate with their customers the right amount of performance-related work to accommodate in the backlogs. The automated build environment had to be extended; one way such an extension might look is sketched below.
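As a sketch of what that extension might involve, again assuming JUnit 4: the performance tests can be collected into a dedicated suite that the build server runs as its own stage, after the functional tests have passed. The class name refers to the hypothetical example shown earlier.

    import org.junit.runner.RunWith;
    import org.junit.runners.Suite;

    // A dedicated suite so the automated build can run the performance tests
    // as a separate stage, after the functional tests have passed.
    @RunWith(Suite.class)
    @Suite.SuiteClasses({
        SortPerformanceTest.class // add further performance tests here
    })
    public class PerformanceTestSuite {
        // intentionally empty - the annotations carry the configuration
    }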
We may have to purchase additional licenses for the performance toolset. I had hoped I could get away with just the floating licenses we already had, but that doesn't seem to pan out! However, the bottleneck is no longer the performance engineer; now it is the number of licenses. And when I compare the "price" of an engineer with the price of an additional license, it becomes apparent that the license is definitely cheaper.
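For illustration only, here is a back-of-envelope version of that comparison. All figures are invented round numbers, not our actual rates:

    // Back-of-envelope engineer-vs-license comparison. All figures are
    // invented round numbers for illustration only.
    public class LicenseBreakEven {
        public static void main(String[] args) {
            double engineerCostPerDay = 800.0;  // hypothetical fully loaded day rate
            double licenseCostPerYear = 5000.0; // hypothetical annual license fee

            // Days of engineer time one license must save per year to pay for itself.
            double breakEvenDays = licenseCostPerYear / engineerCostPerDay;
            System.out.printf("License pays for itself after %.1f engineer-days saved per year%n",
                    breakEvenDays);
        }
    }

With numbers like these, a license pays for itself after little more than a week of saved engineer time per year.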
Introducing automated performance testing caused some issues, but I would call this "healthy noise". All participants - customers, user experience experts, performance engineers, developers - are working with great focus to iron out these hiccups, and they have made a lot of progress.
Wrapping Up
Introducing a significant change like this requires adaptation by everybody; processes, tools, and more need to be adapted as well. The result, however, is that you move closer to a holistic approach to software engineering, one that treats performance engineering and testing as an integral part of the process. In particular, I am very pleased with how all the people involved have grown as part of this process. The team and the product will be better for it. Well done, folks!
Labels: automated build, change, customer, performance testing, productivity, quality, velocity