Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Mon, 12 Dec 2005

Working your way out of the automated GUI testing tarpit (part 3)

part 1, part 2

In the real world, you can't leap out of a tarpit in one bound. The same is true of a metaphorical tarpit. Here's a scenario to avoid:

  • You have 2500 tests. At any given moment, some 200 of them are failing. Most of the failures are because of irrelevant interface changes, not because the code has a bug. As a result, hardly anyone looks at the failing tests.

  • Someone invents a much more compact, much more maintainable way of writing tests.

  • Someone (likely that same person) is assigned the task of rewriting all the tests in the new form.

  • She gets through about 300 before something urgent comes up. Rewriting the tests becomes a background task. The work is so tedious that somehow it never makes it back to the foreground.

  • A year later, you have 2500 tests. 336 of them are rewritten (perhaps not the most important ones—no one knows which of the old suite are the important ones). At any given moment, those 336 are trustworthy, but 173 of the unconverted tests are failing for the same old reason. No one looks at those tests.

Even if the task is plowed through to the end, it has not changed the habits of the team, so there's no counterforce to whatever forces caused the problem in the first place. I'm with William James on the importance of habit:

only when habits of order are formed can we advance to really interesting fields of action [...] consequently accumulate grain on grain of willful choice like a very miser; never forgetting how one link dropped undoes an indefinite number.

Therefore, my bias is toward having everyone convert the test suite one failure at a time:

  • As part of every story, spend around 20 minutes fixing failing tests in the untrustworthy suite. You're probably better off just fixing the next failing one than trying to find which one is most worth fixing.

    If a test has found a legitimate bug, either fix that bug immediately (if the fix doesn't take long) or put it on the backlog to be scheduled as a story.

  • Fixed tests get moved over to a reliable suite. That suite is run as part of the continuous integration build. No story is done if any of those tests fail. (I would not include tests for backlog bugs in this suite.)

  • This process continues ad infinitum. You may never eliminate the untrustworthy suite. If some test there never fails, it will never get converted.
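The mechanics of the two suites can be sketched in a few lines of Ruby. The directory names and the `promote` helper below are my invention for illustration, not anything from the post:

```ruby
require "fileutils"

# Two suites, per the process above. Directory names are assumptions.
UNTRUSTWORTHY = "untrustworthy_tests"   # the old suite; failures tolerated
RELIABLE      = "reliable_tests"        # gates every story in the CI build

# After spending the ~20 minutes reviving a failing test, move it into
# the suite whose failures stop the line.
def promote(test_file)
  FileUtils.mkdir_p(RELIABLE)
  FileUtils.mv(File.join(UNTRUSTWORTHY, test_file),
               File.join(RELIABLE, test_file))
end

# "Just fix the next failing one": no prioritizing, take the first.
def next_failing(failing_tests)
  failing_tests.first
end
```

The continuous integration build would then run everything under the reliable directory and fail on any error; the untrustworthy suite can still be run on the side, but no story waits on it.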

Some fraction — perhaps a large fraction — of the old tests are likely to be worthless. (More precisely, they're worth less than the cost of reviving them.) It's hard to persuade people to throw away tests, but nonetheless I'd try. (There are unknown risks to throwing tests away. My bias would be to do it and let the reality of escaped bugs make the risks better known. Tests can always be un-thrown away by retrieving them from Subversion.)

A tempting alternative is simply to delete the old test suite and start over. Spend the 20 minutes writing a new test instead of reviving a failed one. That might well be time better spent. But it's a tough sell because of the sunk cost fallacy.

## Posted at 13:06 in category /testing

UI design links from Jared M. Spool

What makes a design intuitive? is a nice, readable short article about the two ways to make an interface that people will call intuitive.

Designing embraceable change is a follow-on that talks about how to introduce a new UI to an existing community. This has relevance to Agile projects that are continually tinkering with the UI.

The series ends with The quiet death of the major relaunch. Here's a trivial example of the approach:

At eBay, they learned the hard way that their users don't like dramatic change. One day, the folks at eBay decided they no longer liked the bright yellow background on many of their pages, so they just changed it to a white background. Instantly, they started receiving emails from customers, bemoaning the change. So many people complained, that they felt forced to change it back.

Not content with the initial defeat, the team tried a different strategy. Over the period of several months, they modified the background color one shade of yellow at a time, until, finally, all the yellow was gone, leaving only white. Predictably, hardly a single user noticed this time.

The key point in this last article is this:

Our findings show that consistency in the design plays second fiddle to completing the task. When users are complaining about the consistency of a site, we've found that it is often because they are having trouble completing their tasks.

## Posted at 10:22 in category /links

Agile consultants

In my role as the overcommitted and underskilled Agile Alliance webmaster, I add new corporate members to the site. I realized today that we really have quite an impressive variety there. You can find companies in out-of-the-way places (Topeka, Kansas, USA). It's less easy to find companies that have particular skills, since the blurbs don't generally focus on a company's specific competitive advantage. Nevertheless, I recommend it to you if you're looking for a consultancy.

P.S. Not me, though. Exampler Consulting isn't a corporate member because I've never gotten around to getting a logo.

P.P.S. Corporate membership was Rebecca Wirfs-Brock's idea.

## Posted at 10:22 in category /agile

About Brian Marick
I consult mainly on Agile software development, with a special focus on how testing fits in.

Contact me here: marick@exampler.com.
