The Number of Touches:
Improving Software Quality by Actually Testing

During (North American) football games, television analysts sometimes say that a running back improves with the number of times he touches the ball: the more chances he gets to advance the ball over the course of the game, the more he will advance it each time he gets it. They attribute this to his getting a feel for the contact of the defensive players trying to tackle him, or to his learning the defense's plans and tendencies well enough to know which way to turn when running. Although the player has spent the week or two before the game attending meetings, planning, and watching film of his own performance and of his opponents, this type of running back develops his feel for the game in the game itself.

In the software testing world, we can often find ourselves slaves to the Agile software development process and the drive to “move quality assurance to the left.” I don’t want to dismiss the value of inserting quality assurance into design, requirements gathering, story elaboration, and other places where QA’s insight can avert later crises or surface simple misconceptions and unshared assumptions. We certainly get value from asking the dumb questions in story elaboration meetings. Football teams spend weeks preparing and practicing for a game (read Vince Lombardi’s Run to Daylight to get a sense of how much preparation goes into one game), but at the end of the week, you still have to play the game.

If your quality process has your test professionals spending most of their time on “research” that means sitting in meetings for most of the day with business analysts, developers, and sometimes customer, client, or business stakeholders (who might not be the end users of the software), artisanally crafting a set of test cases to run once, you might develop well-planned software full of bugs. Your quality assurance efforts before the code is written can minimize defects, but they won’t eliminate them. Fire-and-forget, one-and-done test cases probably won’t find them, either.

Helmuth von Moltke the Elder said, “No plan of operations extends with certainty beyond the first encounter with the enemy's main strength.” He said it first only because he was born before Vince Lombardi, who would have said it better. We have all heard it as the aphorism that no plan survives first contact with the enemy. Mike Tyson said, “Everybody has a plan until they get punched in the mouth,” but I’m going with a football metaphor here. The same goes for your test plans: if you insist that your quality assurance professionals stick with the program and test only what’s planned, you’re only going to get what you programmed.

Your testers’ plans and understanding of the application will change, sometimes radically, between the black-and-white text of the requirements and the colorful-but-Section-508-compliant Web site. They will think of new possible workflows or actions to try that they could not have gleaned from mere bullet points in Agile lifecycle management tools or requirements documents. One or more of them might feel a bit chagrined not to have thought, back in the planning meetings, of trying to add a regular user with the same email address as an administrative user, but they might want to try it now. Like running backs who pick up on physical cues and use their instincts and experience to better advance the football, your testers will use their experience and insight to test new things on the fly based on what they see not on the field but in the fields.
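To make that duplicate-email scenario concrete, here is a minimal sketch in Python of the check a tester might improvise on the spot. The `UserStore` class, its `register` method, and `DuplicateEmailError` are hypothetical stand-ins for a real application's registration back end, invented here for illustration only.

```python
# Hypothetical sketch: UserStore stands in for a real registration
# back end; every name here is invented for illustration.

class DuplicateEmailError(Exception):
    """Raised when a new account reuses an email already on file."""

class UserStore:
    def __init__(self):
        self._users = {}  # maps email -> role

    def register(self, email: str, role: str) -> None:
        # The written requirements might only demand unique emails
        # within one user type; checking across roles is the kind of
        # case a tester thinks of only after seeing the real screens.
        if email in self._users:
            raise DuplicateEmailError(email)
        self._users[email] = role

store = UserStore()
store.register("pat@example.com", role="admin")

# The exploratory test: reuse the admin's email for a regular user.
try:
    store.register("pat@example.com", role="user")
    result = "BUG: duplicate email accepted across roles"
except DuplicateEmailError:
    result = "duplicate email rejected"
print(result)
```

Whether the real system should reject the duplicate or allow it is exactly the kind of question the bullet points in the backlog rarely answer; either way, trying it tells you something the plan did not.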

Like a good coach, you can make your plans and you can coach your team members, but ultimately you have to let them play the game. Plan for, and allow, your testers to conduct exploratory, clicking-around sorts of tests on the application in addition to the test cases. I call this just messing around, and I once used that phrase to explain how I found a particular defect outside the anticipated happy-path test plan. Even now, when I run the smoke tests with every build or the complete (manual) regression tests, I go off or beyond the script, doing things in a different order or with slightly different settings to see what shakes out. Unfortunately, sometimes something does: some sequence of clicks, states, and saves leads to something unexpected.

“Luck is what happens when preparation meets opportunity,” Vince Lombardi once said. “Bugs are what happens when planning meets testing,” I say. Give your testers the chance, and the time, to test your software, and they will find bugs before your users do. Otherwise, you can play the role of the losing coach in the press conference after the game, explaining how your test plans and documentation alone did not bring you success.

More articles by Brian J. Noggle
