Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Tue, 11 Oct 2005

Hoist by my own petard

I started my PNSQC talk by asking for three volunteers. I handed each a Snickers bar and told them to eat it. After they did, I asked whether they were confident their bodies would be successful at converting that food into glucose and replenished fat cells. Then I gave them part of the CSS specification. I asked them whether they thought they could be successful at converting that information into a conformant implementation. Unsurprisingly, they thought digestion would work and programming wouldn't. How odd, I said, that digestion and absorption work so much better than the simpler process of programming.

The idea here was to set the stage for an attack on the idea that (1) we can adequately represent the world with words or concepts, and (2) we can transmit understanding by encoding it into words, shooting it over a conduit to another person, and having them decode it into the same understanding.

Things did not go exactly as planned. After I gave them the Snickers bars, I was surprised when they balked and asked me all kinds of questions about eating it. I thought they were deliberately giving me a hard time, but one of them (Jonathan Bach) later told me that he was honestly confused. He said something like, "it would have been much clearer if you'd shown us what you wanted by eating one yourself."

... if I hadn't tried to transmit understanding down the conduit...

... if I'd explained ambiguous words with an example. In a talk about the importance of explaining with examples.

I'm glad Jonathan was clever enough to catch that, because otherwise the irony of it all would have forever escaped me.

P.S. It now occurs to me that another problem was that they didn't know why they were to do it. That's something I also covered in the talk: "justify the rules" from the list of tactics. I don't mind not telling them why, since telling them would have spoiled the effect, but not using an example just makes me slap my head.

Posted at 07:39 in category /agile

Communication between business and code

In a few hours, I'll be giving a presentation at PNSQC. It's on communication between the business experts and the development team. After some audience participation involving Snickers® bars, trapezes, and Silly Putty® (actually, only Snickers bars) and some airy-fairy theorizing, I get down to discussion of 16 tactics. Here they are.

When it comes to teaching programmers and testers about a domain, examples matter more than requirements. It's fine to have statements like "always fill stalls up starting with the lowest number, except for the sand stalls and the bull stall". But when it comes time to be precise about what that means, use examples (aka tests). I think of requirements statements as commentaries on, or annotations of, examples.
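As a sketch of what "examples as tests" might look like for the stall rule above (everything here — the `Barn` class, the stall numbering, which stalls are sand stalls — is hypothetical, invented to illustrate the tactic, not taken from any real project):

```ruby
# Hypothetical barn model: stalls are numbered 1..n. Stall 1 is the
# bull stall and stalls 8-9 are sand stalls, so ordinary cases skip them.
class Barn
  BULL_STALL  = 1
  SAND_STALLS = [8, 9]

  def initialize(stall_count)
    @stalls   = (1..stall_count).to_a
    @occupied = []
  end

  # The rule: fill stalls starting with the lowest number,
  # except for the sand stalls and the bull stall.
  def assign(animal)
    stall = @stalls.find do |n|
      n != BULL_STALL && !SAND_STALLS.include?(n) && !@occupied.include?(n)
    end
    @occupied << stall
    stall
  end
end

# The requirement statement is a commentary on concrete examples like these:
barn = Barn.new(10)
raise "expected stall 2" unless barn.assign("cow")  == 2  # stall 1 is the bull stall
raise "expected stall 3" unless barn.assign("mare") == 3  # next lowest free stall
```

The executable examples at the bottom are the precise part; the comment restating the rule is the annotation.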

It's best to build examples with talk around a whiteboard. There, a business expert explains an example to a small audience of programmers, testers, technical writers, etc. The conversation includes these elements:

  • People should ask questions about details. If the business expert casually says, "So we have a cow in stall 1", ask why it's in stall 1. The answer might be, "well, actually, it probably wouldn't be in stall 1, because that's the bull stall" - which now alerts everyone that there are rules surrounding which animals go in what stalls. Those rules might not matter soon, but it doesn't hurt to be aware of them.

  • Turn stories into rules. If the business expert says things like "well, since the bull stall is reserved for dangerous animals, we'd put an ordinary case in the next available stall," you have a rule that stalls are allocated in increasing order. That rule is something that will probably be found, in some form, in the code.

  • Still, favor complete examples over complete rules. The rules don't have to be precise; they're mainly a reminder to write precise examples. Expect the real heavy lifting of creating rules to be part of the programming process; the programmers will discover rules that cover the examples. (See my old favorite, the Advancer story.)

    Nevertheless, some early attention to rules helps shift the emphasis from procedural examples to declarative examples and from UI gestures to business logic.

  • Participants should ask the business expert to justify the rules. Why is it that stalls are allocated in increasing order? There might be no particular reason, but it might be that stalls are numbered counterclockwise, so by housing cases in numerical order, a student working on her cases in stall order would walk directly from case to case instead of having to plan a route.

    What's happening here is that the development team is learning facts about the domain. Any set of requirements, examples, or other kinds of instructions to the team will leave them underconstrained. At some point, they'll make decisions that are not forced by anything the business expert said. If they understand the "why" behind statements, they're more likely to make sensible decisions.

  • People should ask about exceptions: "when is it not done like that?" It's the exceptions that make rules tricky, and the exceptions that will drive the creative part of programming.

    Now, it's awfully easy to ask an expert for exceptions to the rules, much harder for the expert to think of them. So there are tactics for eliciting exceptions (as well as new rules and new domain knowledge).

    • Ask for stories from different points of view. The most natural point of view is probably that of a user of the system. So find opportunities to ask for the story of a medical case from the first call to get an appointment to the last time someone touches its record. Or look at the path of an inventory item through the system. (As an example of this, see the opening scenes of the movie Lord of War. I consider that a spoiler, but seemingly every critic saw fit to describe it.)

    • When telling the story of a user, you have the opportunity to pick a persona. Don't always use a normal one. Consider how Bugs Bunny (a trickster character, a rule breaker) would use the system. How about the Charlie Chaplin of the factory scenes in Modern Times: the completely overwhelmed worker who can't keep up? (I learned this trick from Elisabeth Hendrickson.)

    • You can also try Hans Buwalda's soap opera testing. In soap opera testing, you construct an example of use that bears the same relationship to normal use as a soap opera does to real life: dramatically compressed, full of cascading and overlapping implausibilities.

    • Be alert for synonyms. Suppose a clinician uses the words "release" and "discharge" in different contexts but cannot articulate the difference between them. It's natural to just pick one of them and use it henceforth. I'm more likely to want to make the system support both words (by one common routine) in the hopes that a distinction will eventually emerge.
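Keeping both synonyms can be as cheap as an alias. A minimal sketch, assuming a hypothetical `Clinic` class (the method names and the admit/discharge domain are invented for illustration):

```ruby
# Hypothetical record-keeping class. "release" and "discharge" are
# synonyms today; both route to one common routine, so that if a
# distinction eventually emerges, only one definition has to split.
class Clinic
  def initialize
    @in_residence = []
  end

  def admit(case_name)
    @in_residence << case_name
  end

  def discharge(case_name)
    @in_residence.delete(case_name)
  end

  # Support the clinician's other word for (currently) the same act.
  alias_method :release, :discharge

  def in_residence?(case_name)
    @in_residence.include?(case_name)
  end
end

clinic = Clinic.new
clinic.admit("Betsy")
clinic.release("Betsy")   # same routine as clinic.discharge("Betsy")
```

The day a clinician says "no, she was released, not discharged," the alias becomes the seam where the two meanings can be pulled apart.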

In all of this, attend to pacing. The programmers have to learn about the domain. It's easy to overwhelm them with exceptions and special cases while they're still trying to grapple with the basics. So start with basic examples and consider the elaborations once they've demonstrated (through working code) that they're ready for them.

Give the product expert hands-on fast feedback. Anything written down (like examples, tests, or a requirements document) puts the reader at one remove from the actual thing. Consider the difference between test-driving a car and reading about a test drive of a car. So quickly implement something for the product owner to look at. That will allow her to correct previous examples and also learn more about how to communicate with the team.

It's also important for everyone to work the product. You don't learn woodworking by looking at examples and listening to someone talking about woodworking. You learn by working wood. The programmers, testers, etc. on a team don't need to become experts in the business domain, but they do need to learn about it (again, so they can make unforced choices well). Having people use the product realistically, especially in pairs, especially with the business expert near, will help them. I recommend exploratory testing techniques. James Bach's site is the best place to learn about them.

I think of the team as building a trading language. This is a language that two cultural groups use to cooperate on a common goal. (See also boundary objects.) In a trading language, the programmers and business expert will both use words like "bond" or "case" -- indeed, it's best if those words are reified in code -- but they will inevitably mean different things by them. It's important to accept that, but also to attend to cases where the different meanings are causing problems. I happen to also think that the business expert should become conversant in the technology domain, just as programmers become conversant in the business domain. That doesn't mean to become a programmer, but it does mean to come to understand enough of the implementation to understand implementation difficulties and opportunities.

Finally, since understanding is built, not simply acquired, it's important to attend to learning through frequent mini-retrospectives. Is the development side of the team learning the domain? Is the business side learning about the implementation? Is the business side learning about the domain? -- I think any project where the business expert doesn't gain new insights into the business is one that's wasted an opportunity. Is everyone on the team learning about communication?

Posted at 07:39 in category /agile

About Brian Marick
I consult mainly on Agile software development, with a special focus on how testing fits in.

Contact me here: marick@exampler.com.
