Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Fri, 04 Jul 2003

Fighting the last war: test automation

Round about 1985, I wrote a one-page document titled "Everything Brian Marick knows about software development". It was a bunch of bullet points. One of them read something like this: "A test that's not automated, or at least exactly repeatable manually, might as well not exist."

In the next decade, I changed my mind. That was largely due to a long email conversation with Cem Kaner, back before I'd ever met him. In the late '90s, I became one of the "anti-automation" crowd. That's despite my putting a slide titled "I'm Not Against Automation!" in almost every talk about test automation I gave. Our crowd, roughly the same people as the context-driven crowd, made two main points:

  • The costs of automation are often underestimated.

  • It's a mistake to lump together unskilled following of manual checklists with skilled exploratory testing. In the former, we try to make people follow rote directions like a computer and still remain alert enough to notice bugs. People are bad at that. In exploratory testing, people don't follow a program-like script, but make use of intuition, judgment, experience, and chance discovery.

My contribution to this debate was a paper titled "When should a test be automated?" In it, I attempted to list the forces pushing toward automation and those pushing away. If you understand the forces, you can find a balance between automated tests and what I didn't yet think of as exploratory testing. You can balance cost against benefit.

Many of us in the "anti-automation" camp reacted to XP's glorification of the automated acceptance test with a sinking "oh no, not again" feeling and a general girding for battle. But I think that's a mistake, for two reasons.

First, in an XP project, more automation is appropriate. XP teams are committed to reducing the cost of automation. They also use tests for things beyond finding bugs: thinking concretely about what a program should do, supporting smooth change, etc. Those increase the benefit of testing. So the balance point is further in the direction of complete automation.
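To make that use of tests concrete, here is a minimal sketch in Ruby (the method name and the nickel-rounding rule are invented for illustration, not from any project mentioned in the post). The point is that the automated checks read as a statement of what the code should do, written before or alongside the code, so they pay for themselves even apart from catching bugs:

```ruby
# Hypothetical example: round a price in cents to the nearest nickel.
def round_to_nickel(cents)
  (cents / 5.0).round * 5
end

# These checks double as a concrete specification of "round to the
# nearest nickel": each case names a decision the team had to think
# through before writing the code.
raise "93 should round up to 95" unless round_to_nickel(93) == 95
raise "92 should round down to 90" unless round_to_nickel(92) == 90
raise "exact multiples stay put" unless round_to_nickel(100) == 100
puts "all examples pass"
```

Because the examples run automatically, they also support the "smooth change" benefit: rework the implementation however you like, and one command tells you whether the behavior the team agreed on still holds.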

That, I think, the anti-automation crowd accepts. What bugs them is that the XP crowd doesn't accept the need for exploratory testing.

Oh, but they do. I've had two chances to introduce XPers to exploratory testing. In both cases, they were enthused. Because XP and other agile methods are full of exploration, it felt right to them. I'm immensely confident in generalizing from those people to XP as a whole. As we show XPers exploratory testing, they'll embrace it. Now, they'll likely use it differently. Sure, they'll be happy it finds bugs. But more important to XP people, I bet, will be the way it increases their understanding of the code and its possibilities, and of the domain and its quirks, and of the users and their desires. Automated tests are a way to decide how to move forward in the short term (this task, this iteration) and a way to make it so that such movement is almost always truly forward. Exploratory tests are a way to map out the territory in the longer term (next iteration and beyond).

So I declare that we in the anti-automation testing crowd needn't fight that last war again. This is a different war. It's not even a war. It's closer to what I call "a heated agreement". Time to move on.

Posted at 14:06 in category /agile

About Brian Marick
I consult mainly on Agile software development, with a special focus on how testing fits in.

Contact me here: marick@exampler.com.



