Transitioning to Agile Testing

Summary:
Your developers are already working feature-by-feature in iterations, but your testers are stuck with manual tests. How do you make the leap to agile testing when the nature of agile's iterative releases challenges testers to test working segments of a product instead of the complete package? In this column, Johanna Rothman explains that the key challenge resides in bringing the whole team together to work towards the completion of an iteration. Only then will the testers—and the entire team—know how to transition to agile.

Some test teams may be stumped on how to transition to agile. If you're on such a team, you probably have manual regression tests, either because you have never had the time to automate them or because you are testing from the GUI and it doesn't make sense to automate them. You probably have great exploratory testers who can find problems inside complex applications, yet they tend not to automate their testing and need a final product before they start testing. You know how to plan the testing for a release, but now everything has to be done inside a two-, three-, or four-week iteration. How do you make it work? How do you keep up with development?

This is a common problem. In many organizations, developers think they have transitioned to agile while testers are still stuck in manual testing efforts, unable to "keep up" by the end of the iteration. When I explain to these people that they are receiving only partial benefit from their agile transition, both the developers and the testers tell me that the testers are just too slow.

The problem isn't that the testers are too slow but that the team does not own "done," and, until the team owns "done" and works together to achieve it, the testers will appear too slow.

Know What "Done" Means
Agile teams can release a working product every iteration. They may not have to release, but the software is supposed to be good enough to release. That means that testing—which is about managing risk—is complete. After all, how can you release if you don't know the risks of release?

Testing provides information about the product under test. The tests don't prove that the product is correct or that the developers are great or terrible, but rather that the product does or doesn't do what we thought it was supposed to do.

That means the tests have to match the product. If the product includes calls to another system, some set of tests has to call that other system. If the product includes a GUI, the tests—at some point—have to use the GUI. But there are many ways to test a system. Testing under the GUI, building the tests as you proceed, means you don't have to test only end to end, and you still receive valuable information about the product under test.
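To make testing under the GUI concrete, here is a small sketch in Python. Everything in it is hypothetical: OrderService stands in for the service layer your GUI calls, and TaxGateway stands in for the call to another system. The point is that a test can drive the same code the GUI drives and still give you the information you need about the feature, without touching a single screen.

# OrderService and TaxGateway are hypothetical stand-ins for product code;
# the tests drive the same service layer the GUI would call.

class TaxGateway:
    """Stands in for a call to another system (an external tax service)."""
    def tax_cents_for(self, subtotal_cents):
        return subtotal_cents // 10          # flat 10% tax, in integer cents

class OrderService:
    """The layer the GUI calls; the tests call it directly."""
    def __init__(self, tax_gateway):
        self.tax_gateway = tax_gateway

    def total_cents(self, line_item_cents):
        subtotal = sum(line_item_cents)
        return subtotal + self.tax_gateway.tax_cents_for(subtotal)

def test_total_includes_tax_from_the_other_system():
    service = OrderService(TaxGateway())
    assert service.total_cents([1000, 500]) == 1650

def test_total_with_the_other_system_stubbed_out():
    # The same behavior checked with the external call stubbed out,
    # so the test still runs when the other system is unavailable.
    class StubGateway:
        def tax_cents_for(self, subtotal_cents):
            return 0
    assert OrderService(StubGateway()).total_cents([1000, 500]) == 1500

if __name__ == "__main__":
    test_total_includes_tax_from_the_other_system()
    test_total_with_the_other_system_stubbed_out()
    print("both tests passed")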

If the developers are only testing from the unit-level perspective, they don't know if a feature is done. If the testers can't finish the testing from the system-level perspective, they don't know if a feature is done. If no one knows if a feature is done, how can you call it done for an iteration? You can't. That's why it's critical for the team to have a team-generated definition of done. Is a story done if the developers have tested it? Is a story done if the developers have integrated and built it into an executable? What about installation? How much testing does a feature need in order to know if it's done or not?

There is no one right answer for every team. Each team needs to look at its product, customers, and risks and say, "OK, we can call it done if all the code is checked in and either reviewed by someone or written in a paired way; all the developer tests are done; and all the system tests for this feature have been created and run under the GUI. We'll address GUI-based checking every few days, but we won't test through the GUI."

I have no idea if that is a reasonable definition of done for your product. You need to assess the risks of not running frequent GUI tests for your product. Or maybe you don't have a GUI at all, but you have a database. Do the developer tests need to access the database? Maybe, or maybe not. Do the system tests need to access the database? I would think so, but maybe you have a product I can't imagine, and maybe they don't need to all the time. Maybe you need more automated tests that exercise schema upgrades or transitions before anything else. "Done" depends on your product and its risks. Look at the risks of releasing the product without certain kinds of tests, and you'll see what you need in an iteration to get to a releasable product.
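As an illustration only, here is what a minimal schema-upgrade test might look like in Python, using an in-memory SQLite database. The customer table and the migration script are hypothetical; your own schema and migration tooling will differ, but the shape of the test is the same: apply the upgrade, then check that the new structure exists and the old data survived.

import sqlite3

# Hypothetical schema and migration script, for illustration only.
V1_SCHEMA = "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);"
V1_TO_V2_MIGRATION = "ALTER TABLE customer ADD COLUMN email TEXT;"

def test_v1_to_v2_upgrade_keeps_data_and_adds_the_column():
    conn = sqlite3.connect(":memory:")
    conn.executescript(V1_SCHEMA)
    conn.execute("INSERT INTO customer (name) VALUES (?)", ("Ada",))

    conn.executescript(V1_TO_V2_MIGRATION)      # the upgrade under test

    columns = [row[1] for row in conn.execute("PRAGMA table_info(customer)")]
    assert "email" in columns                   # the new column exists
    assert conn.execute("SELECT name FROM customer").fetchone() == ("Ada",)  # old data survives

if __name__ == "__main__":
    test_v1_to_v2_upgrade_keeps_data_and_adds_the_column()
    print("upgrade test passed")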

Create a Just-Enough Test Framework
Once you know what you need in an iteration, you'll probably need testing. Then, you'll encounter the "Give a Mouse a Cookie" problem. In the delightful children's book If You Give a Mouse a Cookie, the mouse who gets a cookie wants a glass of milk to go with it. Then, he needs a napkin to wipe the milk off his mouth and a broom to sweep the crumbs off the floor. The need for more and more continues until the mouse is tired and wants another cookie, which starts the whole cycle again.

This is what happens when a test group wants a "perfect" test framework for its product. It's a reasonable desire. Unfortunately, you can't always tell what the perfect framework is until the product is complete. And if you wait until the product is complete, the testing is tacked onto the end of the project, which is too little, too late.

Instead of a perfect test framework, try developing a just-good-enough test framework for now, and plan to refactor it as you proceed. That gives the test team enough automation to get started and growing comfort with the automation as the iteration and the project proceed. It also doesn't lock you into a framework that no longer fits just because you've spent so much money and time developing it.
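What might just-good-enough look like? Possibly no more than this sketch: a way to register system tests, run them, and report failures. Nothing here comes from a particular tool, and authenticate is a hypothetical stand-in for a call into your product's service layer. When this stops fitting, refactor it or swap in a real test runner and keep the tests.

registry = []
failures = []

def system_test(fn):
    """Register a test function with the just-good-enough framework."""
    registry.append(fn)
    return fn

def run_all():
    """Run every registered test and report pass/fail."""
    for test in registry:
        try:
            test()
            print("PASS", test.__name__)
        except AssertionError as exc:
            failures.append(test.__name__)
            print("FAIL", test.__name__, "-", exc)
    return not failures

def authenticate(user, password):
    # Hypothetical stand-in for a call into the product's service layer.
    return bool(user) and bool(password)

@system_test
def login_rejects_a_blank_password():
    assert authenticate("ada", "") is False, "blank password should be rejected"

@system_test
def login_accepts_a_valid_password():
    assert authenticate("ada", "s3cret") is True

if __name__ == "__main__":
    raise SystemExit(0 if run_all() else 1)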

Remember, testers are just like product users. Just as your product users can't always tell what they want or need until they see the product, the testers can't tell what test framework they want or need until they start using one.

Everyone Works on a Story Until the Entire Story Is Done
Once you know what "done" means and you have a just-good-enough test framework, how does the test team "keep up" with development? By making sure the entire team works on a story until the story is done.

Say you have a story that requires three developers and one tester. The developers work together to create the feature. At the same time, the tester refines the tests and creates enough automation or installs the test into the existing automation framework. But, what happens if you are transitioning to agile and have no framework? Then, one (or more) of the developers works with the tester to install a reasonable framework and add the tests for this feature to that framework.

There is no rule that says developers can't help testers install test frameworks, write test frameworks, or even write tests to help a story get to done. Since you have a team definition of "done," doesn't it make sense that the team members help each other get to done?

Transitioning Is a Team Activity
Once you know what "done" means for a story and the whole team is committed to getting the story to done, you can create an environment in which the test team can transition to agile. As long as the developers assist with test frameworks, the business analysts assist with refining the stories, and the testers provide information about the product under test, the cross-functional project team can transition to agile.

Transitioning to agile is a cross-functional project team activity, not something just the developers do. If you have testers who can't "keep up," it's not the test team's problem. It's the project team's problem. Fix it with the project team. Even if you start with just-barely-good-enough test frameworks, you can refactor them into something wonderful. You'll find that testers will keep up with the developers.
