Tue, 11 Oct 2005
I started my PNSQC talk by asking for three volunteers. I handed each a Snickers bar and told them to eat it. After they did, I asked whether they were confident their bodies would succeed at converting that food into glucose and replenished fat cells. Then I gave them part of the CSS specification. I asked them whether they thought they could be successful at converting that information into a conformant implementation. Unsurprisingly, they thought digestion would work and programming wouldn't. How odd, I said, that digestion and absorption work so much better than the simpler process of programming.
The idea here was to set the stage for an attack on the idea that (1) we can adequately represent the world with words or concepts, and (2) we can transmit understanding by encoding it into words, shooting it over a conduit to another person, and having them decode it into the same understanding.
Things did not go exactly as planned. After I gave them the Snickers bars, I was surprised when they balked and asked me all kinds of questions about eating it. I thought they were deliberately giving me a hard time, but one of them (Jonathan Bach) later told me that he was honestly confused. He said something like, "it would have been much clearer if you'd shown us what you wanted by eating one yourself."
... if I hadn't tried to transmit understanding down the conduit...
... if I'd explained ambiguous words with an example. In a talk about the importance of explaining with examples.
I'm glad Jonathan was clever enough to catch that, because the irony of it all would have forever escaped me.
P.S. It now occurs to me that another problem was that they didn't know why they were to do it. That's something I also covered in the talk: "justify the rules" from the list of tactics. I don't mind not telling them why, since telling them would have spoiled the effect, but not using an example just makes me slap my head.
In a few hours, I'll be giving a presentation at PNSQC. It's on communication between the business experts and the development team. After some audience participation involving Snickers® bars, trapezes, and Silly Putty® (actually, only Snickers bars) and some airy-fairy theorizing, I get down to discussion of 16 tactics. Here they are.
When it comes to teaching programmers and testers about a domain, examples matter more than requirements. It's fine to have statements like "always fill stalls up starting with the lowest number, except for the sand stalls and the bull stall". But when it comes time to be precise about what that means, use examples (aka tests). I think of requirements statements as commentaries on, or annotations of, examples.
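To make the "examples as tests" idea concrete, here's a minimal sketch of how the stall-filling rule might be annotated with executable examples. Everything here is invented for illustration: the stall numbering, which stalls are sand stalls, and the function name are all assumptions, not the real domain rule.

```python
# Hypothetical sketch: the stall-filling rule expressed as examples/tests.
# Stall numbers, sand stalls, and the bull stall are assumed for illustration.

BULL_STALL = 1          # assumed: stall 1 is the bull stall
SAND_STALLS = {3, 7}    # assumed: stalls 3 and 7 are sand stalls
ALL_STALLS = range(1, 11)  # assumed: stalls numbered 1 through 10

def next_stall(occupied):
    """Return the lowest-numbered free stall, skipping the exceptions."""
    for stall in ALL_STALLS:
        if stall == BULL_STALL or stall in SAND_STALLS:
            continue
        if stall not in occupied:
            return stall
    return None

# Each example is a commentary on the rule, pinned down as a test:
assert next_stall(set()) == 2            # skip stall 1 (bull stall)
assert next_stall({2}) == 4              # skip stall 3 (sand stall)
assert next_stall({2, 4, 5, 6}) == 8     # skip stall 7 (sand stall)
```

The requirement statement stays as prose; the examples are what the team and the business expert actually argue over and agree on.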
It's best to build examples with talk around a whiteboard. There, a business expert explains an example to a small audience of programmers, testers, technical writers, etc. The conversation includes these elements:
In all of this, attend to pacing. The programmers have to learn about the domain. It's easy to overwhelm them with exceptions and special cases while they're still trying to grapple with the basics. So start with basic examples and consider the elaborations once they've demonstrated (through working code) that they're ready for them.
Give the product expert fast, hands-on feedback. Anything written down (like examples, tests, or a requirements document) puts the reader at one remove from the actual thing. Consider the difference between test-driving a car and reading about a test drive of a car. So quickly implement something for the product expert to look at. That will allow her to correct previous examples and also learn more about how to communicate with the team.
It's also important for everyone to work the product. You don't learn woodworking by looking at examples and listening to someone talk about woodworking. You learn by working wood. The programmers, testers, etc. on a team don't need to become experts in the business domain, but they do need to learn about it (again, so they can make unforced choices well). Having people use the product realistically, especially in pairs, especially with the business expert nearby, will help them. I recommend exploratory testing techniques. James Bach's site is the best place to learn about them.
I think of the team as building a trading language. This is a language that two cultural groups use to cooperate on a common goal. (See also boundary objects.) In a trading language, the programmers and business expert will both use words like "bond" or "case" -- indeed, it's best if those words are reified in code -- but they will inevitably mean different things by them. It's important to accept that, but also to attend to cases where the different meanings are causing problems. I also happen to think that the business expert should become conversant in the technology domain, just as programmers become conversant in the business domain. That doesn't mean becoming a programmer, but it does mean learning enough about the implementation to appreciate its difficulties and opportunities.
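As a sketch of what "reified in code" might look like: the trading-language word becomes a named type, so that when the business expert says "close the case," there is one obvious place in the code where that happens. The word "case," its fields, and the method name are all assumed for illustration.

```python
# Hypothetical sketch: reifying the trading-language word "case" as a type.
# The fields and behavior are invented; the point is that the shared word
# has a single, named home in the code.

from dataclasses import dataclass

@dataclass
class Case:
    """A 'case' in the business expert's sense: one matter being worked."""
    case_id: str
    description: str
    is_open: bool = True

    def close(self):
        # "Closing a case" -- the business phrase -- maps to exactly one method.
        self.is_open = False

c = Case(case_id="2005-041", description="stall assignment dispute")
c.close()
assert not c.is_open
```

When the two groups turn out to mean different things by "case," the mismatch shows up as friction around this one type, which makes it easier to notice and talk about.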
Finally, since understanding is built, not simply acquired, it's important to attend to learning through frequent mini-retrospectives. Is the development side of the team learning the domain? Is the business side learning about the implementation? Is the business side learning about the domain? -- I think any project where the business expert doesn't gain new insights into the business is one that's wasted an opportunity. Is everyone on the team learning about communication?