Low-Tech Tools for the Thinking Tester

Summary:
Contrary to what some vendors may lead you to believe, a whole-team approach to quality doesn't require a lot of complex, integrated tools and services. Paul Carvalho explains how to perform good testing with readily available, low-tech tools, all of which are either free or cheap.

Over the past two decades, I have seen many advances in technology and processes for software development, from the evolution of computers, storage, and programming languages to new social, organizational, and development models for businesses to consider. Of all these advances, I like how the Agile Manifesto returns the business focus to providing value quickly through human interaction and collaboration. Sometimes throwing more technology, processes, and tools at the problem won't help you develop better solutions.

Quality requires a whole-team approach. Everyone tests, all the time. You may have some specialists on your team with testing expertise, but they aren't the only ones allowed to ask good questions and provide actionable information on quality. Contrary to what some vendors may lead you to believe, a whole-team approach to quality doesn't require a lot of complex, integrated tools and services. You can do some good testing with readily available, low-tech tools.

The best, most widely available, and disappointingly least-often-used tool in testing is, of course, your brain. If you aren’t thinking, you aren’t testing. If you aren’t testing, you aren’t paying attention to what you are developing. If you aren’t paying attention to what you are developing, you aren’t being agile, because you aren’t really sure of what value you are providing.

When you use your brain in testing, things like creativity, models, and deduction can help boost your effectiveness to a new level. You may discover that you don't have to go high-tech to do a great job. In keeping with the agile spirit, the three low-tech tools I use most often in my agile and exploratory testing toolkit are whiteboards, Sneakernet, and paint applications.

Whiteboards
As a visual learner, I find whiteboards to be invaluable tools. I use them to brainstorm, share information, track risks, and radiate project information.

I haven’t used a test plan document in almost a decade. People don’t read them, teams don’t use or maintain them, and you aren’t being agile if you use them to communicate important information. Note that I am not against capturing records or artifacts of the work performed on projects. Documents are records, not conversations. You don’t build relationships or develop an understanding between stakeholders or project team members when you use a document to relay important information.

By comparison, whiteboards let everyone on the team know what’s going on in the team members’ heads. Ideas, observations, explanatory diagrams, schedule milestones, progress charts, and more, are exposed and not buried in documents where they are less likely to be noticed, if at all. You never know when a piece of information will be critical to someone else’s understanding of the project or system. Put the information up on a board and leave it there for a while; anyone can come along and learn from or add to the information shown.

Whiteboards located in high-traffic areas, or where we gather for meetings, are prime candidates to use as information radiators. [1] There are many types of information radiators to help communicate different views of development progress. For testing purposes, I sometimes use a testing dashboard. [2] This allows you to organize descriptions of the testing status, effort, and quality in a table format so that all team members can quickly understand what's going on at any given moment without having to ask questions. Depending on the project, I may customize the dashboard with additional or different information, or I may not use one at all. Whiteboards are where brainpower meets team power.
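
As one illustrative sketch (the columns, areas, and assessments shown here are examples only, not a fixed format), a dashboard drawn on a whiteboard might look something like this:

    Product Area    Test Effort    Coverage    Quality Assessment    Comments
    Checkout        High           2/3         Shaky                 Intermittent payment timeouts
    Reporting       Blocked        0/3         Unknown               Waiting on user permissions
    User Profile    Low            3/3         Good                  No new issues this iteration

At a glance, anyone walking by can see where the testing effort is going, how much of each area has been covered, and where the risks appear to be.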

Sneakernet
When I train developers and testers in agile and exploratory testing practices, one of the first rules I tell them is “when in doubt, ask someone.” This involves getting up from your chair and physically walking over to the person who can help you solve the problem facing you. Using the sneakers on your feet to get answers in person can greatly reduce the amount of time required to understand something by avoiding the broken telephone game often played through email or some other electronic system.

When we test, we navigate a sea of doubt. Other team members are the beacons and buoys that help us find the correct path for the journey. Certainty is the fool’s path and there is much for us to be unsure of in any given project.

Some examples of opportunities to seek clarity from other team members include:

    • You are uncertain as to how a feature or story should work from the customer’s perspective.
    • You are uncertain as to what the expected system behavior should be while testing. Sometimes there may be competing ideas of what the “right” behavior should be, so ask someone before you jump to conclusions and report a bug.
    • You need help with a blocking issue that is preventing testing. For example, you might need to access a new report in a system and when you do so, you might discover that the designated users don’t have the security permissions to view it. If you have a short amount of time to test a particular feature, find someone to help you get up and running as quickly as possible.

To be respectful of the time of the person you wish to speak with, I recommend the following before travelling through Sneakernet:

    • When possible, reproduce the observed problem before you ask someone for help. If you can’t reproduce a problem, is it still worth interrupting someone else about it?
    • Check oracles: Check references (such as requirements and specifications), logs (e.g. additional information to support your observations), and comparable functionality (within the system or from competitive products). You may also want to ask a team member near you first. Sometimes the answer is nearer than you think.
    • Consider alternatives: Use the Rule of Three to think of at least three possible causes or explanations for the unexpected system behavior or problem facing you.
    • Bring data: Collect whatever information you need to help explain the problem to the person you are interrupting.

In distributed teams, where Sneakernet is impractical or inefficient, I choose tools that allow for the most human interaction first. My first runner-up choice is to use a telephone (or Skype or similar application) to contact someone for help. If that doesn’t work, I will then try an instant messaging program of some sort. Good communication involves more than just words and we pick up more of the underlying meaning and context when we see and hear someone speak. For this reason, email is almost never an option for me, unless it is to schedule a meeting to discuss the problem in person.

The amount of time saved by focusing on direct human interaction and verbal communication is tremendous; time saved is money saved. I have seen failed communications and misunderstandings between development team members that have gone on for days and weeks through tools like email and bug tracking systems. I can’t begin to estimate the impact to the project and costs when things like this happen.

I am OK with using email or creating bug reports after a conversation has taken place to summarize the key points we’ve already discussed. However, as with test plans, you should never bury important, timely information in a tool or document repository. Email is where knowledge goes to die.

Simple Paint Applications
One of my favorite testing tools for reporting bugs is the default Paint or Paintbrush program that comes with most computer operating systems. It’s a little high-tech in that you need a computer, but it’s relatively low-tech when you consider feature set and complexity. I like these tools because they’re simple, readily available, and good enough to help you create a powerful message.

Years ago, I realized that I needed to capture more information in my bug reports to help explain the problem details. I tried writing more text, but that made the reports too wordy, making it hard to pick out the important parts. In a page full of text, how do you know what the key points are? How do you separate the signal from the noise? Then it dawned on me: “A picture is worth a thousand words.” If I include a good picture, I should be able to communicate more information than by written text alone.

With this in mind, my bug-reporting process now goes something like this:

1. Identify and repeat the problem in the system under test.
2. Take a screen capture. Use a mobile phone camera if you need to.
3. Open the computer’s default paint program.
4. Annotate the picture to identify things like:

a. Where in the app you are (e.g. highlight the page title, web URL, menu navigation, or breadcrumb trail).
b. What data is important to reproduce the issue, if visible on screen.
c. The problem (put a circle or a box around it or highlight it with arrows, and add a line or two of text to clarify the problem or expected behavior).

5. Save the marked-up screen capture to an image file (e.g., PNG, GIF or JPG).
6. Attach the image to the bug report.
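
A paint program is all you need for these steps, but if you find yourself repeating the capture-and-annotate routine many times a day, the same idea can be scripted. The following is a minimal sketch in Python, assuming the Pillow imaging library is installed; the coordinates, annotation text, and file name are placeholders to adapt to the problem you are documenting, and this is an optional convenience rather than a replacement for the steps above.

    # Minimal sketch: capture the screen, mark the problem area, and save an annotated PNG.
    # Assumes the Pillow library is installed (pip install Pillow); coordinates and text are placeholders.
    from PIL import ImageGrab, ImageDraw

    # Step 2: take a screen capture (ImageGrab works on Windows and macOS; Linux may need extra setup).
    screenshot = ImageGrab.grab()

    draw = ImageDraw.Draw(screenshot)

    # Step 4a: highlight where in the app you are (e.g., the page title or breadcrumb region).
    draw.rectangle([20, 10, 600, 60], outline="blue", width=3)

    # Step 4c: box the problem itself and add a line of text about the expected behavior.
    draw.rectangle([350, 400, 700, 480], outline="red", width=4)
    draw.text((350, 490), "Total shows $0.00 - expected the sum of the line items", fill="red")

    # Step 5: save the marked-up capture to an image file to attach to the bug report.
    screenshot.save("bug-annotated.png")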

Sometimes I speak with another team member (via Sneakernet) before I log the bug in step six, and sometimes I don’t need to, depending on the complexity, amount of doubt, and potential impact of the issue found. Regardless of whether or not I log a bug, I have a habit of creating annotated screen captures when I encounter unexpected things. I find them helpful as reminders of what I saw, what I did, and what I thought at the time.

Annotated screen captures are useful for more than simple user-interface issues like spelling mistakes or element misalignments. A good picture may capture the essence of a problem and clarify its meaning without adding a lot of text to a bug report. Of course, I will also attach log files and any other relevant information to the bug report as required.

Laziness alert! I have seen testers skip the annotation step and add unmarked screen captures to bug reports. This is a mistake. As a tester, you want to provide clear and actionable information. When you provide an unmarked screen capture, you raise the cognitive effort required to decipher the problem and thereby decrease the value of the image itself. It becomes a guessing game. I have seen developers repeatedly ignore these kinds of images and focus solely on the text descriptions to try to identify the problems.

If you want to increase the likelihood that someone will pay attention to your bug report and work on the issue, take a moment to annotate the image and help others see what you are thinking. No one has time for mind-reading in development.

Summary
Although I have presented these low-tech tools as testing tools, they also function as communication tools that help facilitate rapid learning between project team members, whether you choose to display your thoughts openly, talk to team members and stakeholders in person, or use annotated screen captures. Talk with your team members about other ways you can share useful, timely information to help stakeholders make good decisions.

I have more low-tech tools in my toolkit to help me test quickly and efficiently, including things like text editors (for blink tests and capturing notes), spreadsheets (for analysis and presentation), and sticky notes (for risk assessment and strategy development). They are all readily available and cost little or nothing to acquire, although some skill and creativity are required to use them well. As long as you remember to use your most important tool, your brain, you may find simple solutions all around you. That sounds agile to me.

References

[1] Alistair Cockburn, "Information Radiator," http://alistair.cockburn.us/Information+radiator

[2] For more information on testing dashboards, see my blog post "Radiating Testing Information," http://swtester.blogspot.ca/2011/03/radiating-testing-information-part-1.html
