Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Tue, 11 Mar 2003

I'm a Bayesian Filter

Which spam filter are you?

## Posted at 15:11 in category /junk [permalink] [top]

Requirements and the conduit metaphor

In an influential paper, Michael Reddy argues that English speakers (at least) have a folk theory of communication that he calls the "conduit metaphor". (Sorry, couldn't find any good links online, but you can try this and this.) The theory goes like this. Communication is about transferring thoughts (likened to objects) from one mind to another. The first person puts his ideas into words, sends those words to the second, who then extracts the ideas out of the words. This metaphor infects our speech:

  • You've got to get this idea across to her.
  • I gave you that idea.
  • It's hard to put this idea into words.
  • Something was lost in translation.
  • The thought's buried in that convoluted prose.
  • I remember a book with that idea in it.
  • That talk went right over my head.

Why bring it up? Well, I have difficulty putting across (ahem) the idea that tests can serve as well as requirements to drive development. Part of the reason may be that requirements fit the conduit metaphor, and tests do not.

What is a requirements document? It is a description of a problem and the constraints on its solution, boiled down to its essence, complete, stripped of ambiguity - made into the smallest, most easily transmissible package. Send that across the conduit and, by gum, you've sent the idea in its optimal form.

Let me contrast the conduit metaphor with something I've heard Dick Gabriel say of poetry: that a poem is a program that runs in the reader's mind. Let me extrapolate from that (perhaps offhand) comment. What happens when the program runs isn't predictable. It depends on its inputs, whatever's already there in the mind. Extrapolating further, an effective communication is a program that provokes desirable responses from the recipient. It need not contain the information; we could alternatively describe it as generating it.

An effective set of tests is one that provokes a programmer to write the right program.

To do that, the tests needn't necessarily describe the right program in any logical sense. That addresses the most common objection I hear to pure test-driven development (one where there's no additional up-front documentation), which is some variant of "No number of white swans proves the proposition 'all swans are white'." That is to say, what prevents the programmer from writing a program that passes exactly and only the tests given, but fails on all other inputs?
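
To make the worry concrete, here's a toy sketch in Ruby (the round_to_dollar function and its tests are hypothetical, invented just for this illustration): three example tests, a degenerate program that memorizes exactly those examples, and the generalization the tests are meant to provoke.

    require 'test/unit'

    # Three "white swan" examples for a hypothetical rounding rule.
    class RoundingTest < Test::Unit::TestCase
      def test_rounds_up_at_half_a_dollar
        assert_equal 3, round_to_dollar(2.50)
      end

      def test_rounds_down_below_half_a_dollar
        assert_equal 2, round_to_dollar(2.49)
      end

      def test_leaves_whole_dollars_alone
        assert_equal 7, round_to_dollar(7.00)
      end
    end

    # A program that passes exactly and only those tests:
    def round_to_dollar(amount)
      { 2.50 => 3, 2.49 => 2, 7.00 => 7 }[amount]
    end

    # The program the tests are meant to provoke:
    #   def round_to_dollar(amount)
    #     (amount + 0.5).floor
    #   end

Nothing in the examples themselves forbids the lookup-table version.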

The answer is that what prevents it is not the "content" of the communication, but rather the context in which the communication takes place:

  • Everyone has practice generalizing from examples. What generalizations will this programmer make?

  • What will she assume?

  • What conversations will she have that might correct her mistaken assumptions?

  • Is there writing that can supplement the tests? (I annoy people by referring to requirements documents as "annotations to the tests", with the implication that they should only explain what needs explaining, not attempt to be complete.)

  • When the wrong assumptions get turned into code, how big a deal is it? Is it readily noticed? Readily fixed?

  • ...

That seems unsatisfying: throw a logically incomplete set of test cases into the mix and hope that the programmer reacts correctly. Why not just send over all the information in a form so perfect that the only possible reaction is the correct one? Well, we've spent a zillion years trying to write unambiguous requirements, requirements that cause a programmer to make the same decisions the requirements writer would have made. It's Just Too Hard to be proposed as a universal practice. Pragmatically, I think many of us will do better to improve our skill at writing and annotating tests.

"The conduit metaphor: A case of frame conflict in our language about language", Michael J. Reddy, in Metaphor and Thought, Andrew Ortony (ed.)

## Posted at 15:11 in category /agile [permalink] [top]
