Developing software correctly is a detail-oriented business. George Dinwiddie writes on how using the Three Amigos strategy can help you develop great user stories. Remember, the goal is to have the work done just in time for planning and development. It should be complete enough to avoid stoppages to build more understanding, but not so far in advance that the details get stale.
I remember a NASA project I worked on as a programmer. I worked for a subcontractor on a subsystem of a satellite science system. For this small slice of the mission, I studied voluminous written requirements documentation. Frequently, I found items I didn’t understand. I would ask the local project manager and colleagues who’d worked on numerous NASA projects. Many times they could infer what the requirements probably meant. Sometimes the core of a requirement was too garbled to guess. I remember one in particular whose text duplicated, word for word, text from the previous requirement in the document. The problem was that this text didn’t relate to the title of the requirement and didn’t even form a complete sentence in context. When such problems came up, we would schedule a meeting with the civil servant who’d written the document. In a few days, we’d meet, talk, and straighten out the misunderstanding.
These conversations allowed me to continue working. They did not, however, necessarily result in making the requirements document easier to understand. The most obvious problems, such as the copy-and-paste error, were edited to look more reasonable, but that didn’t mean a reader would necessarily understand the intent. This hit home the day a tester working for another subcontractor called me up to ask how my code was supposed to work. I explained to the best of my ability. As I did so, I worried that I might be wrong. Maybe I was only explaining how it did work.
Agile software development has not completely solved this problem. Many times, in the middle of developing a user story, the programmer discovers a question about how it's intended to work. Or the tester, looking at the functionality that's been developed, questions whether it's really supposed to work that way. I once worked with a team that too often found, when the programmer picked up the card, questions that hadn't been thought out. The team created a new column on the sprint board, "Needs Analysis," to the left of "Ready for Development," for cards that had been planned without being well understood.
It was that problem the Three Amigos meeting was invented to address. Rather than wait until a story was in the sprint to fully understand it, members of the team took time to discuss stories that were on deck for the next sprint. The product owner or an analyst, as representative of the business, would ask if a programmer and a tester had time to look at a story with him. The three of them made sure all their questions were answered before considering the story ready for development. Sometimes an analyst had to go back to the business stakeholders for more detail. Sure, the team still raised an occasional question, but the epidemic had been stemmed.
These three roles were selected as the Three Amigos because they provide mutually orthogonal viewpoints on almost anything you may be building. The business representative thinks about what the business hopes to accomplish by building it. The programmer thinks about the details needed to implement it, such as what information is needed when and what technologies can accomplish the goals. The tester thinks about what might go wrong, either within the system or in the external context upon which it depends. Sometimes these three viewpoints are enough, but sometimes others are also needed. I’ve seen the beneficial results when a user experience expert joins the discussion to advocate for the customer’s interactions with the system. A security expert might be present to notice holes where a malevolent intruder could get the system to do undesirable things. The name Three Amigos was not intended to limit the discussion to three people, but to encourage at least three different viewpoints.
These concerns affect each other. A security concern may trigger a change in technology. A change in technology may offer different options for the user experience. The technology change may also alter the ways in which external events could cause problems for the system. Changes in the user experience certainly adjust the ways in which user input might trigger errors in the implementation. Throughout all these changes, the business goals need to be kept in mind, but it’s also valuable to keep an eye open for even more extensive benefits that the business might not have thought feasible.
Examining the desired system from so many different viewpoints provides an edge in readiness for development. Examining those viewpoints simultaneously and discussing the tradeoffs among them provides further benefits, both in readiness and in making better choices.
Despite all the discussion, it’s possible the group still has a fuzzy understanding of what is to be built. They may not be sure which of several alternatives they should choose. Worse, they may have different ideas about which alternative was chosen. Even if it seems clear at the moment of discussion, that clarity might fade over time—even a short amount of time.
How could we be more sure that we’re saying the same things? How could we record our decisions in a way that we won’t interpret differently later, or that other people will understand without repeating the entire discussion?
I find that examples help people communicate abstract ideas. It’s easier to know what someone intends if they can describe an example that illustrates it. I look for the list of essential examples or acceptance scenarios that the completed story will satisfy as a key output from the Three Amigos discussion. We can each know if one of our concerns has not been heard if it’s left out of the examples. We can later remember subtleties by including an example showing how that case is different from similar ones. Example scenarios provide a crispness to the understanding of the story that's hard to achieve any other way.
There are fringe benefits to going to this level of detail. When the team is planning their work, they don't need to spend a lot of time understanding what the story means. These discussions don't go round-and-round finding the boundaries of the story. If the scenario isn't listed, it's part of another story (or it's been overlooked). In fact, dividing the scenarios into groups is a simple way to split a story into smaller ones.
Another benefit is that the scenarios can be automated as acceptance tests prior to the development of the functionality. Having a clear picture of the outcome before starting helps keep the development on track, minimizing premature speculation of future needs and maximizing attention to current edge cases that might otherwise be overlooked.
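To make this concrete, here is a minimal sketch of what one such scenario might look like once automated, written as an executable Given/When/Then test. Everything in it is invented for illustration: the free-shipping story, the `free_shipping()` rule, and the $50 threshold are hypothetical, not from the article. Teams commonly use a BDD tool such as Cucumber or behave for this; plain assertions are used here only to keep the sketch self-contained.

```python
# Hypothetical acceptance scenario from a Three Amigos discussion,
# automated before the feature is built. The failing test captures the
# agreed behavior, including the edge case the tester raised.

def free_shipping(order_total):
    """Business rule under discussion: orders of $50 or more ship free."""
    return order_total >= 50.00

def test_order_at_threshold_ships_free():
    # Given an order totaling exactly $50
    total = 50.00
    # When shipping is calculated
    result = free_shipping(total)
    # Then the order ships free
    assert result is True

def test_order_just_below_threshold_pays_shipping():
    # Given an order totaling $49.99 (the boundary case the tester raised)
    total = 49.99
    # When shipping is calculated
    result = free_shipping(total)
    # Then the order does not ship free
    assert result is False

test_order_at_threshold_ships_free()
test_order_just_below_threshold_pays_shipping()
```

Written this way, the test fails until the rule is implemented as agreed, and the boundary scenario preserves the subtlety the discussion surfaced.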
In a development process that uses sprints or timeboxes, you've got the whole sprint to get the next sprint's worth of stories refined prior to planning. If you're practicing a single-piece pull process, you've got the length of time a story spends in the development input queue to do so. Either way, refining the backlog is a necessary overhead activity that should be done a little at a time, all the time.
The goal is to have the work done just in time for planning and development. It should be complete enough to avoid stoppages to build more understanding, but not so far in advance that the details get stale. We want our scenarios to take advantage of the most knowledge we can bring to bear. If done too early, we may have to revisit the scenarios to see if we need to alter them according to what we've learned since we created them.
More often than creating too many acceptance scenarios too early, I find teams spending this effort too late. It seems a lot of work to go to such detail when we know we've got a mountain of work to accomplish.
Developing software correctly is a detail-oriented business. We're going to have to get to that detail sooner or later. Leaving it until the programmer has a question causes interruptions in development, delays, lost effort, and, sometimes, unwarranted assumptions that give us results we don't want. Don't look at the mountain of work. Look at the little bit of work we've decided to tackle next. Do a great job on that, and things will go more smoothly.
Terrific article. I think the ideas apply to any methodology, even non-agile shops. In particular, giving examples as an output of the Three Amigos conversation can be used anywhere. Thanks for sharing.
Thanks, Mara. You're absolutely right.
George, enjoyed the article. The PO, BA and myself (QA) saw this as a problem. We run 2 week sprint cycles and now mid second week run a session where we walk through next sprint stories and develop acceptance criteria. Some interesting questions arise out of these sessions and get dealt with before they become inefficiencies or blockers. Development seems a lot more efficient and I get a much clearer view of how testing should look and where it should focus (both the main and peripheral impacts). I'd really recommend anyone not utilising sessions such as this to give it a shot.
Great to hear, Paul. I notice you're not including the programmer mindset. Do you ever run into situations where something isn't feasible? Or is it harder to do what's specified than it is to do something else that might be better? Are the programmers ever slow to get up to speed on the stories? Inquiring minds want to know! ;-)
Thought you might ask. We phase that separately. Our Devs want to go straight to solution, so we focus on the user side first, then involve our team Devs. I know it seems like an extra step, but we have found it works well. That's not to say that on the odd occasion a Dev hasn't raised a curly one. I think we have also gotten pretty good at recognising the stories that are likely to go down this path. For these we do involve a Dev earlier. I wouldn't necessarily tell people to follow the process as we have implemented it. We know it works for us, for this project, or at least for the sprints to date. I don't believe in "best practice" as a wholesale principle (not saying here that you are branding this best practice - just realised my statement sounded more forthright than it was meant to be). I believe in finding what practice works best for the group, so I would never expect people not to challenge an approach I'm involved in. I will bring this up in our next meeting, though. Maybe it is time for revalidation of the approach.
Paul, I quite agree with you about "best practices." That's why I was interested in the results you were getting with the way you were doing it. And it sounds to me like you're paying attention to those results, and making decisions accordingly. That's excellent!
And reevaluating from time to time? That's golden! Even what's "best" in a particular situation will change over time.
One of the problems with writing an article like this is that I've got to describe a fairly generic situation. That can make it sound like there's "one true way" to do it. As you note, there's danger in telling people "to follow the process as we have implemented."