A Game Plan for Rapid Test Planning

Summary:

Rapid test planning is a process of cultivating test ideas within minutes rather than days. It stands in contrast to rigorous test planning, which may strive to document every test possibility. The mission is to get testing underway now: find the critical things to test and get important information to stakeholders quickly. In this article, Jon Bach explains how easy it can be to tackle a rapid test plan once you've got a strategy in mind.

Management almost always wants test results quickly, even when we haven't seen the product yet. When faced with a time crunch, we can freeze, get angry, or see what we can do.

I've experienced paralysis and anger many times in my twelve years of testing, but the longer I stay in this business, the more I find myself "seeing what I can do." These days, I consider it a challenge when someone says "you have three days to test."

But I have to be careful of rushing right to the keyboard. There are all kinds of assumptions I'll need to check; otherwise, I'll make a lot of progress in the wrong direction. Enter rapid test planning. For me, it involves a few exploratory testing "sessions" that result in lists of test ideas, issues (or questions), and risks. Exploratory testing is test design and execution happening at the same time, sometimes in the same second, with an emphasis on learning about the product and the project. An exploratory session could last thirty minutes to three hours, but the mission of the first session is usually: What can I test right now?

In my work for a Seattle software testing lab, clients usually don't give us a lot of testing time (because they often don't have a lot of it to give), so our promise to them is to focus more on delivering information than creating written test cases that we plan to run later. That's why I often use rapid test planning to frame exploratory testing.

Think of an American-style football game. You have four quarters, or roughly one hour of clock time, to beat the other team by designing plays that advance the ball down one hundred yards of turf while the other team tries to stop you. Plays can involve throwing or running the ball until you reach the end of the field for a touchdown. When the other team has the ball, it's your turn to stop them from scoring. Each team creates plans for offense and defense, but when the game starts, those plans might change based on how the plays are going. That's exploratory testing in action.

But let's say my football team is the thing we're testing. In other words, how do I know if my team is any good? If I'm a coach being interviewed, I may be put on the spot to say something valuable about it. The sports reporter might ask a question designed to catch me off guard and get a good sound bite for his story:

"If you were playing the Seahawks today, what would be your game plan?"

Wanting to provide a great quote while thinking quickly, I might answer in terms of the following five elements:

  • Structure: The team's composition
  • Function: The team's capabilities
  • Data: What the team might "process" (i.e., what we do with the ball)
  • Platform: What the team depends upon
  • Operations: How I'll use my team's skills

Using a mnemonic like Structure/Function/Data/Platform/Operations (SFDPO) is considered a heuristic: a fallible method for solving a problem. It's not meant to be exhaustive or foolproof, just enough to start you quickly on the road to a useful solution.

So if, an hour before a spontaneous project meeting, a project stakeholder asks you to be prepared to answer questions about your plan to find bugs, you might take the same approach (a quick sketch of this checklist in code follows the list below):

  • Structure: What is the software project comprised of? (Code, files, specs, wireframes, user docs, media, etc.)
  • Function: What is it that the software can do? What are its features and subfeatures?
  • Data: What kinds of data can the software process and how might it process it?
  • Platform: What does the software depend upon? (Plug-ins, operating systems and related service packs, language and locales, etc.)
  • Operations: What are the different ways the software's features can be used? What are the patterns and sequences of input, and who is likely to use it?
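
One way to see those five prompts as a working checklist is a minimal sketch like the one below, assuming you want a plain-text skeleton to seed each session's notes. The SFDPO_PROMPTS dictionary and new_charter function are illustrative names for this sketch, not part of any formal tool.

    # Hypothetical sketch: the five SFDPO prompts captured as a reusable
    # planning checklist. Names and prompt wording are illustrative only.
    SFDPO_PROMPTS = {
        "Structure": "What is the project comprised of? (code, specs, docs, media)",
        "Function": "What can the software do? Features and subfeatures?",
        "Data": "What kinds of data can it process, and how?",
        "Platform": "What does it depend on? (OS, plug-ins, locales)",
        "Operations": "How might it be used? Patterns and sequences of input? Users?",
    }

    def new_charter(product: str) -> str:
        """Build a plain-text planning skeleton for one exploratory session."""
        lines = [f"Rapid test-planning charter: {product}", ""]
        for element, prompt in SFDPO_PROMPTS.items():
            lines.append(f"{element}: {prompt}")
            lines.append("  test ideas / issues / risks:")
        return "\n".join(lines)

    # "build 3.2" is a placeholder product name, not from the article.
    print(new_charter("build 3.2"))

Running this prints a five-section skeleton you can paste into a text file and fill in as the session unfolds.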

Let me bring this scenario back to the coach in my football example.

I may answer the sports reporter's question like this:

"I've got a deep bench (Structure), so I'm confident that our past performance on special teams against the Seahawks has been great (Function) when we pass the ball instead of run it (Data). Given my star players aren't that healthy right now (Platform), I'd start them in the third quarter, especially if that wind picks up out there (Operations)."

I know it's a bit abstract, but this answer might give you an idea of how a coach could inventory ideas about five major team dynamics on the fly. As coach, I might be wrong about this plan; it's more of a gut check than an analysis. But it's meaningful enough for now to answer the reporter's question, and I can always adapt the plan once the game starts and I get more information.

Execution

Here's how I do it:

  1. The gauntlet is thrown down: "Test this! You have three days."
  2. I ask questions to get more context for my mission or information to enhance my plan. (What does "test" mean? Is this a new version? Has it been tested before? Three days until what? Am I the only one on the project?)
  3. I give myself a mission for my first of many exploratory sessions. Using the heuristic SFDPO, I'll review anything structural that I can get my hands on:
    • the software
    • specs or product docs
    • project docs
    • marketing literature
    • names of knowledgeable project staff who can help me later
    • a projector and a PC with a Web connection (to do on-the-fly research)
  4. I keep track of my notes, bugs I happen to stumble across, and any issues I want to escalate. (One way to structure these notes is sketched just after this list.)
  5. When I think I've got enough to go on, I declare an end to the session and go over my plan with a few stakeholders.
  6. I go back to step 2 with a new mission for the next session to frame my planning, perhaps taking a Function or a Platform perspective this time.
  7. Then after the first three or four sessions (by the end of Day One), I'll likely have a good idea of what's there to test, some issues to get closure on, and a reasonably good test plan covering the five SFDPO elements, designed to expose important bugs in the next two days.
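
The note-keeping in step 4 can be as light as a text file, but here is a minimal, hypothetical sketch of one structure for it, so each session ends with something concrete to debrief. The SessionNotes class and its field names are illustrative, not a prescribed format.

    # Hypothetical sketch: one record per exploratory session, so test
    # ideas, stumbled-upon bugs, and issues to escalate don't get lost.
    from dataclasses import dataclass, field

    @dataclass
    class SessionNotes:
        mission: str          # e.g., "What can I test right now?"
        perspective: str      # the SFDPO element framing this session
        test_ideas: list[str] = field(default_factory=list)
        bugs: list[str] = field(default_factory=list)
        issues: list[str] = field(default_factory=list)

        def debrief(self) -> str:
            """One-line summary to share with stakeholders at session's end."""
            return (f"{self.perspective}: {len(self.test_ideas)} test ideas, "
                    f"{len(self.bugs)} bugs, {len(self.issues)} open issues")

    session = SessionNotes(mission="What can I test right now?",
                           perspective="Structure")
    session.test_ideas.append("Inventory the install files against the spec")
    session.issues.append("Three days until what, exactly?")
    print(session.debrief())  # Structure: 1 test ideas, 0 bugs, 1 open issues

The one-line debrief is the point: it gives stakeholders a quick read on what a session produced before you choose the next session's mission.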

One measure of success with rapid test planning is that, after a few stakeholders review your plan, they help you decide that it has sufficient benefits and no critical problems, and that further planning would be a waste of time and resources. It may be "good enough for now" to get you started. Good enough in this context does not mean mediocre; it means "not providing more quality than is necessary."

Sometimes striving for too much quality is just as risky as striving for too little. The aim here is to get you down that field, gaining yardage on that first play.

Further Rapid Test Planning Resources

  1. James Bach at his testing consulting company Satisfice, most famous for offering the Rapid Software Testing course.
  2. Michael Bolton, principal consultant at DevelopSense, who also gives the Rapid Software Testing course.
  3. Robert Sabourin, principal consultant at Amibug.
  4. James Lyndsay, principal consultant at Workroom Productions.
  5. More rapid test planning heuristics like SFDPO.
