Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Sun, 25 Jun 2006

RubyConf proposals

David A. Black reminds me to remind you that RubyConf proposals are due June 30. Here's the proposal link: http://proposals.rubygarden.org/.

## Posted at 10:24 in category /ruby [permalink] [top]

Agile with mainframes

I have a client that has many, many mainframes. Every project I might coach involves mainframes to a much greater extent than I've experienced before. I'd like to help the mainframe people with their programming and, especially, testing. If anyone has experience reports for me to read or stories to tell me, please do. I've already ordered Agile Database Techniques and Refactoring Databases.

I will set something up on the topic at Agile2006, both in the Open Space sessions and in the Agile 2006 Fringe (to be explained later).

I'll summarize anything I find out. If you like to write, your experience might fit in either Better Software or Agile Times.

## Posted at 10:13 in category /agile [permalink] [top]

Balancing forces in business-facing tests

This week, I gave seven (!) presentations of a live demo of testing and design in an Agile project. I started with a product director's idea for a story; showed the business-facing tests used to nail down that idea for the programmers; demonstrated how a programmer can use testing to make every step a small, safe, checked one; and ended (in some versions) with a working feature to be demoed and then manually tested (in an exploratory style). The idea was to get across a gut feel for how development feels, plus show some key principles in action.

Here's something that really came into focus as I (at first) kept radically changing the presentation and (later) tweaked it:

  1. A business-facing test describes some facts that should be true of the new feature. (The feature uses this business logic, or appears something like this on the screen, or is used in this workflow.)

  2. The product owner must be able to read the test to check that the team has captured the most important parts of the conversation in which the feature was described. (But note that the document does not end the conversation.)

  3. At any given moment, the test document can be used to ask which of the facts it describes actually are true of the product. (That is, it's executable.) That property lets programmers use it to drive programming.
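Those three properties can be seen even in a tiny sketch. This is an invented example in plain Ruby (a hypothetical shipping rule, not any particular tool's notation): each line of the "document" states a fact a product owner can read, and each fact can be checked against the product at any moment.

```ruby
# Hypothetical business rule, invented for illustration:
# orders over $100 ship free; others pay a flat $10.
def shipping_cost(order_total)
  order_total > 100 ? 0 : 10
end

# The "test document": each entry is a readable fact, and each
# is executable against the product right now.
facts = {
  "an order of $150 ships free"         => shipping_cost(150) == 0,
  "an order of $99 pays flat shipping"  => shipping_cost(99) == 10,
}

facts.each do |fact, holds|
  puts "#{holds ? 'PASS' : 'FAIL'}: #{fact}"
end
```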

I expect product directors to read these documents collaboratively, sitting down with at least one programmer or tester. So the product director has to be semi-comfortable with the notation. (I also like it if that notation lends itself to looking at the feature in a different way. For example, a tabular notation for state machine designs encourages you to think through more cases than a node-and-arc notation does. That's also why Fit tests are good for business rules.)
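Here's a small invented sketch of why the table helps: pairing every state with every event turns a missing transition into a visible hole, where a node-and-arc drawing lets an absent arrow go unnoticed. (The door controller and all names are hypothetical.)

```ruby
# A hypothetical door controller, invented for illustration.
STATES = [:closed, :open]
EVENTS = [:push, :pull]

# The tabular form demands a row for every (state, event) pair...
TRANSITIONS = {
  [:closed, :push] => :closed,
  [:closed, :pull] => :open,
  [:open,   :push] => :closed,
  # [:open, :pull] is missing -- the table makes that easy to spot.
}

# ...so the uncovered cases can even be computed mechanically.
missing = STATES.product(EVENTS).reject { |pair| TRANSITIONS.key?(pair) }
puts "uncovered cases: #{missing.inspect}"
# => uncovered cases: [[:open, :pull]]
```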

So we want readability by a non-technical audience. However, the need for the documents to be executable pushes the notation in the direction of the product's implementation language.

It's balancing those two forces that's the trick.

There are two other sets of forces to balance:

Fragility vs. comprehensiveness

The more detail there is in the test, the more fragile it becomes. That means a change to a single fact about the program will break many tests, and the breaking of a particular test may tell you nothing new about the program. That's wasteful.

And yet, detail that is not tested may not be gotten right in the first place. If it is right, but then goes wrong, you may well not notice it.
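A tiny invented sketch of the tradeoff, in plain Ruby: the first check pins incidental markup, so any cosmetic change breaks it without telling us anything new; the second checks only the fact the test is about.

```ruby
# A hypothetical page renderer, invented for illustration.
def render_greeting(name)
  "<div class='header'><h1>Welcome, #{name}!</h1></div>"
end

page = render_greeting("Dawn")

# Fragile: pins every incidental detail. Renaming the CSS class or
# swapping the <div> for a <section> breaks it.
fragile_check = (page == "<div class='header'><h1>Welcome, Dawn!</h1></div>")

# Focused: checks only the fact under test.
focused_check = page.include?("Welcome, Dawn!")

puts fragile_check  # passes today, but breaks on any markup change
puts focused_check  # survives incidental markup changes
```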

Excess detail seems to cause the most problems in the user interface. Today, my solution is to have the tests describe intermediate results from user-experience design (as I have glancingly learned it, mainly from Jeff Patton). The two types so far are:

  • Wireframe diagrams (as shown on the right). The tests are a textual representation of the picture. I earlier showed an example, though I now believe it has too much technology-specific language and detail. (Note that there's an argument against wireframe diagrams. I interpret it as an argument against making them too early. But they make a good level of detail to hand to a programmer at some point.)

  • Tests that are akin to scenarios, workflows, or semi-detailed use cases. Here's a snippet from one of my latest examples:

        in_sidebar {
          # ...
        }

    Notice that the test is (almost) exclusively about how a user moves from one place to another and what information she uses in each place. It's deliberately vague about the form that information takes: what does the notification look like? How does Adam jump? (A link? a button?) Those details will come, in part, from the wireframe test, or they will not be written down in a test at all.

    The details not worth writing down are those that (1) are easily communicated when two people (one a programmer) look at a page, point at page elements, and talk about what they should look like; and (2) are highly unlikely to be changed by accident when working on some other part of the program. (See the latter parts of "When should a test be automated?" for a discussion of that last point.)
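As a rough illustration of the first type, a textual representation of a wireframe might look like the sketch below. The notation, the region names, and the stubbed-out "rendered page" are all invented for this post, not the notation I actually use: the point is that the test records what appears and roughly where, and nothing about fonts, colors, or exact markup.

```ruby
# An invented textual-wireframe notation, sketched as a Ruby hash.
WIREFRAME = {
  sidebar: ["recent changes", "jump-to-page control"],
  main:    ["item summary"],
}

# A stub standing in for the real rendered page.
def rendered_regions
  { sidebar: ["recent changes", "jump-to-page control"],
    main:    ["item summary"] }
end

# The executable check: every region must show what the wireframe promises.
WIREFRAME.each do |region, elements|
  actual = rendered_regions.fetch(region, [])
  elements.each do |e|
    puts "#{actual.include?(e) ? 'PASS' : 'FAIL'}: #{region} shows #{e}"
  end
end
```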

People vs. process

I sometimes refer to myself as a "recovering abstracter." I used to jump to abstractions way too fast. Now I believe in building them gradually by implementing examples.

Nevertheless, abstractions are important. In many programs, the real value comes from the business logic. Those are abstractions (of what's already worked for the business, I hope). All of my tests above abstract out detail. More importantly, the story of a project's ubiquitous language is one of developing shared abstractions.

But the majority of business people, it seems, are not practiced at thinking in abstractions (at least, our kind of abstractions). Notoriously, they want to see the user interface right away, they want it to be pretty (that is, detailed), and they want to talk in terms of what's on a screen rather than the concepts behind it. Their desire to do that conflicts with our desire to abstract away fragile and confusing detail.

We need to strike a balance. Over time, we need to show them that they can get what they want from us more easily if they tolerate our need to write things down in weird and hard-to-visualize notations. (It worries me that I don't see what we're giving up in exchange.)

## Posted at 10:01 in category /agile [permalink] [top]

About Brian Marick
I consult mainly on Agile software development, with a special focus on how testing fits in.

Contact me here: marick@exampler.com.




Agile Testing Directions
Tests and examples
Technology-facing programmer support
Business-facing team support
Business-facing product critiques
Technology-facing product critiques
Testers on agile projects



Working your way out of the automated GUI testing tarpit
  1. Three ways of writing the same test
  2. A test should deduce its setup path
  3. Convert the suite one failure at a time
  4. You should be able to get to any page in one step
  5. Extract fast tests about single pages
  6. Link checking without clicking on links
  7. Workflow tests remain GUI tests


Design-Driven Test-Driven Design
Creating a test
Making it (barely) run
Views and presenters appear
Hooking up the real GUI


Popular Articles
A roadmap for testing on an agile project: When consulting on testing in Agile projects, I like to call this plan "what I'm biased toward."

Tacit knowledge: Experts often have no theory of their work. They simply perform skillfully.

Process and personality: Every article on methodology implicitly begins "Let's talk about me."


Related Weblogs

Wayne Allen
James Bach
Laurent Bossavit
William Caputo
Mike Clark
Rachel Davies
Esther Derby
Michael Feathers
Developer Testing
Chad Fowler
Martin Fowler
Alan Francis
Elisabeth Hendrickson
Grig Gheorghiu
Andy Hunt
Ben Hyde
Ron Jeffries
Jonathan Kohl
Dave Liebreich
Jeff Patton
Bret Pettichord
Hiring Johanna Rothman
Managing Johanna Rothman
Kevin Rutherford
Christian Sepulveda
James Shore
Jeff Sutherland
Pragmatic Dave Thomas
Glenn Vanderburg
Greg Vaughn
Eugene Wallingford
Jim Weirich


Where to Find Me

Software Practice Advancement


All of 2006
All of 2005
All of 2004
All of 2003


