Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Mon, 07 Mar 2005

That pernicious sense of completeness

Here's a mistake that seems easy to make.

  1. You have a large test suite. In any given run, a lot of the tests fail. Some of the tests fail because they're signaling a bug, but others fail for incidental reasons. The classic example: the test drives the GUI, something about the GUI changes, and the test fails before it ever reaches the thing it's trying to check.

  2. Because so many of the failures are spurious, people don't look at new failures: it's too likely to be a waste of time. So the test suite is worthless.

  3. Someone comes up with an idea for factoring out incidental information, leaving the tests stripped down to their essence. That way, when the GUI changes, the corresponding test change will have to be made in only one place. (See the sketch after this list.)

  4. Someone begins rewriting the test suite into the new format.
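
To make step 3 concrete, here's a minimal Ruby sketch of the kind of factoring I have in mind. The application, the browser driver, and the helper names are all invented for illustration; what matters is the shape of the change.

    # Before: every GUI step is spelled out, so any change to the
    # login or navigation flow breaks this test incidentally.
    def test_discount_applied_to_large_order
      browser.goto("http://example.com/login")
      browser.text_field("user").set("buyer")
      browser.text_field("password").set("secret")
      browser.button("Log in").click
      browser.link("Catalog").click
      browser.link("Widget").click
      browser.button("Add 100 to cart").click
      assert_equal("$900.00", browser.cell("total").text)
    end

    # After: the incidental navigation lives in one shared helper,
    # and the test keeps only its essence. A GUI change now means
    # one fix, in the helper, instead of a fix per test.
    def test_discount_applied_to_large_order
      cart = logged_in_buyer.cart_containing(100, "Widget")
      assert_equal("$900.00", cart.total)
    end
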

It's that last step that seems to me a mistake, in two ways.

  1. The majority of the tests don't fail. There's no present value in rewriting such a test into the new format. There's only value if that test would have someday failed because of some GUI change.

    Rather than a once-and-for-all rewrite, I prefer to let the test suite demand change. When a test fails for an incidental reason, I'll fix it. If it continues to run in the old format, I'll leave it alone. Over time, on demand, the test suite will get converted. And in the steady state, new failures are worth looking at: each one either signals a bug or identifies a test that's due for conversion.

  2. The "convert them all and get it over with" approach also falls prey to what James Bach has called the "wise oak tree" myth. There's an assumption that each test in the test suite is valuable just because someone once found it worth writing. But what's worth writing may not be worth rewriting.

    If you're examining tests on demand, it's easier to make a case-by-case judgment. For each test, you can decide to fix it or throw it away. Does this failing test's expected future value justify bringing it back to life?
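
In code, that case-by-case judgment might play out like this. Suppose a test fails today only because a link moved (again, all names are invented):

    # The old-format test that just failed for an incidental reason:
    def test_email_change
      browser.goto("http://example.com/profile")
      browser.link("Edit profile").click   # link moved; test dies here
      browser.text_field("email").set("new@example.com")
      browser.button("Save").click
      assert_equal("new@example.com", browser.cell("email").text)
    end

    # Option 1: the scenario still earns its keep, so convert it,
    # reusing the same helpers the other converted tests use.
    def test_email_change
      profile = logged_in_buyer.profile
      profile.change_email("new@example.com")
      assert_equal("new@example.com", profile.displayed_email)
    end

    # Option 2: the scenario is redundant, or guards nothing likely
    # to break again -- delete it rather than pay to revive it.

Either way, the decision is made test by test, at the moment a failure proves the test needs attention.
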

For more on this way of thinking, see my "When should a test be automated?" (pdf). Some of the assumptions are dated, but I'm still fond of the chain of reasoning. It can be applied to more modern assumptions.

Posted at 07:22 in category /testing
