Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Thu, 11 Jan 2007

Test-driving presenter-first design

My notion of using lively wireframes as tests for a model/view/presenter style UI leads to finished tests like this:

The test is driven by an OmniGraffle Pro slide show annotated with test assertions. I used the test for test-driven development. For example, the last green box was red not long before I started this post. In order to make it green, I had to make the following changes, in roughly the order shown by the arrows:

That felt like straightforward, unexciting coding, which is what I want from TDD. The driving test is (arguably) business-facing. What of unit tests?

In model/view/presenter, you're not expected to write unit tests for the thin view. The application (model) objects are unit tested like any normal object. The presenters are typically tested by putting mocks on either side, replacing the view and application. I didn't do that. Here's why.

Consider the code that responds to the clicking of the Run button:

    @When (USER_WANTS_TO_CONVERT_A_DIRECTORY_RIGHT_NOW)
    public void runConversion(AnnouncingObject sender) {
        myView().clearResultsBox();
        myView().activateResultsBox();
        myApp().convert(myView().getInputFolder(), myView().getOutputFolder());
    }
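
For reference, the view and application interfaces that method implies look roughly like this. The method names and the String folder arguments come straight from the code above and the test below; the exact interface shapes are a reconstruction, not code from this project.

    // What the presenter needs from the (thin) view; return types guessed
    // from the "/tmp/..." strings the test below stubs in.
    public interface ConversionView {
        void clearResultsBox();
        void activateResultsBox();
        String getInputFolder();
        String getOutputFolder();
    }

    // What the presenter needs from the application (model) object.
    public interface ConversionApp {
        void convert(String inputFolder, String outputFolder);
    }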

Here's a typical presenter test, using mocks in place of the ConversionView and ConversionApp:

    public void testRunningConversion() {
        mockView.expects(once()).method("clearResultsBox")
                .withNoArguments();
        mockView.expects(once()).method("activateResultsBox")
                .withNoArguments();
        mockView.expects(once()).method("getInputFolder")
                .withNoArguments()
                .will(returnValue("/tmp/dropzone"));
        mockView.expects(once()).method("getOutputFolder")
                .withNoArguments()
                .will(returnValue("/tmp/upload"));
 
        mockApp.expects(once()).method("convert")
                .with(eq("/tmp/dropzone"),
                      eq("/tmp/upload"));
 
        announce(USER_WANTS_TO_CONVERT_A_DIRECTORY_RIGHT_NOW);
    }
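
For readers who haven't used jMock 1: that test sits inside a MockObjectTestCase fixture along these lines. The mock-creation plumbing is standard jMock; the presenter's class name, its constructor, and how announce() delivers the announcement aren't shown in this post, so those parts are only a guess.

    import org.jmock.Mock;
    import org.jmock.MockObjectTestCase;

    // Sketch of the surrounding fixture. ConversionPresenter and its
    // constructor are assumed names, not necessarily the real ones.
    public class ConversionPresenterTest extends MockObjectTestCase {
        private Mock mockView;
        private Mock mockApp;
        private ConversionPresenter presenter;

        protected void setUp() {
            mockView = mock(ConversionView.class);
            mockApp = mock(ConversionApp.class);
            // Hand the presenter proxies that satisfy the real interfaces,
            // so every call it makes is checked against the expectations.
            presenter = new ConversionPresenter((ConversionView) mockView.proxy(),
                                                (ConversionApp) mockApp.proxy());
        }

        // testRunningConversion() goes here, along with an announce() helper
        // that routes the named announcement to the @When-annotated method.
    }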

My first impression, looking at that test, is that it's too much work for the code it tests. That's probably overstating it, since the test is stylized and straightforward to write. In fact, it's too straightforward: the test and the code are mechanical transformations of one another. Moreover, the transformation happens in one step (one test ⇒ one complete straight-line method) because all the "iffyness" of the code gets factored out into a profusion of different methods and the declarations of which announcements each responds to. (This is like the way switch statements can be factored into objects of different classes.)
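
To make that switch-statement analogy concrete: without the @When declarations, the presenter would need a single dispatch method whose conditional decides which behavior runs, roughly like the sketch below. The Announcement type, its is() method, and the second announcement name are invented for illustration; they aren't part of the code in this post.

    // Hypothetical alternative: all the "iffyness" collected into one
    // dispatcher instead of declared with @When. Announcement, is(), and
    // USER_WANTS_TO_CHOOSE_AN_OUTPUT_FOLDER are invented names.
    public void announcementReceived(Announcement announcement, AnnouncingObject sender) {
        if (announcement.is(USER_WANTS_TO_CONVERT_A_DIRECTORY_RIGHT_NOW)) {
            runConversion(sender);
        } else if (announcement.is(USER_WANTS_TO_CHOOSE_AN_OUTPUT_FOLDER)) {
            chooseOutputFolder(sender);
        }
        // ... one branch per announcement this presenter cares about
    }

Each @When declaration peels one branch out of that conditional into its own straight-line method.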

Because of all this, the unit test, even if written first, seems to lack the idea-generation virtues a unit test ought to have. You're not interspersing the coding of a method's internals with thinking about what visible behavior it should have. The behavior that matters is dispersed, and the method's internals are its behavior (since all it's for is telling other objects what to do).

It's the wireframe test, not the unit test, that produces Aha! moments. It forces you to think about what counts: "when the user pokes at this button here, what should happen to all the bits of UI the user can see?" Before I thought of the idea of wireframe tests, I found it easy to overlook that a change in one window ought to produce changes in another. Nothing rubbed it in my face like the wireframes do.

However, these wireframe tests look an awful lot like traditional GUI tests, and they may have the same great weakness: many different tests share knowledge of the UI, so a product change that deliberately falsifies a bit of that knowledge will break many tests. I have some ideas about dealing with that problem in a way that GUI-driving tests cannot. Will they work out? Who knows?

My development preference is probably unchanged: put off the UI (and especially UI tweaking) in favor of getting the business logic right. In the case of this program, I did a lot of work on the conversions before there was anything more than the crudest possible command-line UI. However, I've noticed and heard something in the past couple of years: the trust of the business people is driven by how well the UI matches what they imagine of the finished product. Consider the novice product director—which is most of them, these days, at the start of projects. Thrust into a new situation, promised early and frequent delivery of business value, and largely unable to distinguish "the product" from "the UI", she demands—and gets—a UI first. I have faith that many product directors can, in time, come to see the product as being about business rules rather than about UI. But by that time, any damage due to working UI-first will have been done. Therefore, I think it prudent to find ways to make what the business wants (screen images) serve the team's need to have tests drive their code. That's why I'm hot on wireframes.

Posted at 16:44 in category /fit
