
April 2018

We Built the Wrong Thing – From Ambiguity to Stability

Let’s set the scene. You’re out to lunch with your team, celebrating the successful launch of a new feature. Your product owner interrupts the conversation to relay an email from a disappointed stakeholder.

From: Stakeholder, Mary (mary.stakeholder@nickkorbel.com)
Sent: Thursday, April 19, 2018 11:51 AM
To: Owner, Product (product.owner@nickkorbel.com)
Subject: Can we talk?

Thank you for all of your work, but this doesn’t do what I thought it would. Can we talk?

– Mary

The discussion around the table quietly shifts to how nobody ever knows what they want. “We followed all the agile best practices,” a senior developer quips in frustration. “How did we build the wrong thing?”

What went wrong?

When you get back to the office, you huddle up with Mary and pull up the acceptance criteria.

Story: Adding events to a calendar
As a user
I want to enter events into a calendar
So that everyone knows when people are available
Scenarios:
Given I've entered an event into the calendar
When I view the calendar
Then I can see that event

“This is what you asked for, right?”

Mary replies, “Yes – but it’s not what I wanted.”

“What do you mean?”

“Look at this. I want to set up a 3-day training session, but I only have one date picker. And every new event is the same color, so it’s really hard to see who is booked when. And I have no way to know when a new event is created. And…”

“Oh,” you interrupt. “We didn’t know you wanted that. You had all of those meetings with our PO. Why didn’t you ask?”

Mary, now frustrated with the amount of time seemingly wasted, responds, “I thought we were all on the same page!”

Specification by Example

Is this a familiar story? Even using the de facto acceptance criteria format so popular in agile, it’s very easy to build ambiguous expectations. Ambiguity leads to disappointed customers and frustrated developers.

Years ago, I read Gojko Adzic’s Specification by Example and it changed the way I view user stories. I cannot possibly do justice to all of the incredible advice and ideas from the book in this blog post, but I’ll try to summarize.

Instead of a PO or BA working with customers to capture the stories and later reviewing those stories with developers, Gojko recommends running specification workshops. We follow a simple workflow for this:

Derive scope from goals > Specify collaboratively > Illustrate requirements using examples > Refine specifications > Frequently validate the application against the specifications

Deriving scope from goals is probably the biggest change a team will need to make. Instead of being presented with a set of acceptance criteria, the team is presented with a goal: for example, the goal of knowing people’s availability rather than the scope of building a calendar.

Working with the stakeholders, the team collaboratively identifies the acceptance criteria. Maybe a calendar is what is built. Maybe it’s a simple list. Maybe it’s a search. The point is that we start with the goal in mind, and collectively identify the scope. This eliminates the translation layer from stakeholder to product owner to development team.

Ambiguity--

The next couple steps are iterative. We extract real-world examples from the scenarios, and illustrate the acceptance criteria using those examples.

Instead of

Given I've picked a date
When I book that date
Then that date is booked

We have something like

Given Mary has selected 10:00 am on April 18th, 2018
When she completes the booking
Then the calendar indicates that Mary is unavailable on April 18th, 2018 between 10:00 am and 10:30 am

It’s only a slight change, but it has massive effects. Using real examples leads to real questions. What if Mary is already busy at that time? What kind of indication should we show? Is the default event length 30 minutes? Can that be changed?

Ambiguity--

And here’s where it gets fun

Most teams write automated end-to-end tests for their applications, but a lot of the time these tests are defined and written after the functionality is built. We end up simply validating that what we built works how we built it. Even if the tests are built based on more traditional acceptance criteria, the person writing the test has to make some assumptions about how to make the application behave in the way that meets the criteria.

If we have a Cucumber feature file that looks like this:

Story: Adding events to a calendar
As a user
I want to enter events into a calendar
So that everyone knows when people are available
Scenarios:
Given I've entered an event into the calendar
When I view the calendar
Then I can see that event

The person implementing the tests has no choice but to make up some dates to pick, and the validation will likely be something generic.

When writing automated acceptance tests based on real-world examples, the tests can match the acceptance criteria 1:1. Not only does this enhance the clarity of how to test the application, it also brings gaps in the shared understanding of a story to light early.

Story: Adding events to a calendar
As an event organizer
I want to be able to indicate any events I'm participating in
So that everyone knows when I am available
Scenarios:
Given Mary has selected 10:00 am on April 18th, 2018
When she completes the booking
Then the calendar indicates that Mary is unavailable on April 18th, 2018 between 10:00 am and 10:30 am

Ambiguity--
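To make that 1:1 mapping concrete, here is a minimal, self-contained sketch of how each line of the example-based scenario can bind directly to executable code. In a real project a tool like Cucumber or behave does this wiring against the real application; the Calendar class, step functions, and regex patterns below are invented purely for illustration.

```python
import re
from datetime import datetime, timedelta

class Calendar:
    """Stand-in for the application under test: a tiny in-memory calendar."""
    def __init__(self):
        self.bookings = []                        # (person, start, end)
        self.pending = None                       # (person, start)
        self.default_length = timedelta(minutes=30)

    def select(self, person, start):
        self.pending = (person, start)

    def complete_booking(self):
        person, start = self.pending
        self.bookings.append((person, start, start + self.default_length))

    def is_unavailable(self, person, start, end):
        return any(p == person and s <= start and end <= e
                   for p, s, e in self.bookings)

def parse_date(text):
    # "April 18th, 2018" -> date (strip the ordinal suffix before parsing)
    return datetime.strptime(re.sub(r"(\d+)(st|nd|rd|th)", r"\1", text),
                             "%B %d, %Y").date()

def parse_time(day, text):
    # "10:00 am" on a given day -> datetime
    return datetime.combine(day, datetime.strptime(text, "%I:%M %p").time())

# One step definition per concrete line of the scenario.
def given_selected(cal, person, time, date):
    cal.select(person, parse_time(parse_date(date), time))

def when_completes(cal):
    cal.complete_booking()

def then_unavailable(cal, person, date, start, end):
    day = parse_date(date)
    assert cal.is_unavailable(person, parse_time(day, start), parse_time(day, end))

STEPS = [
    (re.compile(r"Given (\w+) has selected (.+) on (.+)"), given_selected),
    (re.compile(r"When she completes the booking"), when_completes),
    (re.compile(r"Then the calendar indicates that (\w+) is unavailable "
                r"on (.+) between (.+) and (.+)"), then_unavailable),
]

def run_scenario(cal, lines):
    for line in lines:
        for pattern, step in STEPS:
            match = pattern.fullmatch(line.strip())
            if match:
                step(cal, *match.groups())
                break

cal = Calendar()
run_scenario(cal, [
    "Given Mary has selected 10:00 am on April 18th, 2018",
    "When she completes the booking",
    "Then the calendar indicates that Mary is unavailable on April 18th, 2018 "
    "between 10:00 am and 10:30 am",
])
print("scenario passed")  # the Then step's assertion did not fire
```

Notice that the test reads exactly like the specification: there are no made-up dates or generic validations, because the concrete example supplied them.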

Automating the Acceptance Criteria

One common frustration of test automation is maintenance and fragility. Features change and evolve over time. When tests are driven from an interpretation of the specifications rather than the actual specifications, maintenance becomes a challenge. It’s difficult to trace a specification change to an associated test (or set of tests). So minor changes in specifications tend to have major impacts on tests.

If the specifications are automated, instead of translated into automated tests, you know exactly what test is affected. In changing the specification, you are forced to change the test and underlying code. You can make micro changes and receive instant feedback that the application still works.
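Here is a rough sketch of why that traceability falls out for free: each specification line matches exactly one step pattern, so a changed line points directly at the step implementation it exercises. The step patterns and implementation names below are hypothetical, invented for illustration.

```python
import re

# Hypothetical index from step patterns to the step implementations
# behind them (all names invented for this sketch).
STEP_INDEX = {
    r"Given \w+ has selected .+ on .+": "given_selected_step",
    r"When she completes the booking": "when_completes_step",
    r"Then the calendar indicates that \w+ is unavailable on .+": "then_unavailable_step",
}

def affected_steps(changed_line):
    """Trace a changed specification line to the step code it exercises."""
    return [impl for pattern, impl in STEP_INDEX.items()
            if re.fullmatch(pattern, changed_line)]

# Changing the expected end time in the spec immediately identifies the
# step (and therefore the test and code) that must be revisited.
print(affected_steps(
    "Then the calendar indicates that Mary is unavailable on April 18th, 2018 "
    "between 10:00 am and 11:00 am"))
# -> ['then_unavailable_step']
```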

Stability++

No silver bullets

This isn’t an overnight change. Like most things, it takes deliberate practice. Practice facilitating specification discussions with non-technical people. Practice finding the right type and number of examples.

The return on this investment can be huge. Specification workshops often lead to significant reduction in scope because technical people and business people are speaking the same language and understand the problem in the same way.

The resulting specifications are free of ambiguity, so everyone has a shared understanding of the exact behaviors they should expect from the application. Validating the application against the specifications in an automated way ensures the application is always working the way everyone understands and expects.

Eliminating the specification ambiguity builds a shared understanding between everyone involved, which leads to long term application stability. And that’s good for everyone.

Have you tried this?

I’m interested in hearing from readers about their experiences. Have you tried this or something similar? How did it go?