Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Thu, 26 Jun 2003

Agile Development Conference - Day 1

I'm in Salt Lake City, at the Agile Development Conference. So far, so fun. I got to wear an odd smoking jacket when pressed into service as a nobleman in a reading of a scene from Shakespeare's The Tempest. And my trusty PowerBook saved the day when the hotel's DVD player couldn't handle the DVD Jerry Weinberg was using in his keynote.

On the technical side, I enjoyed a Technical Exchange about customer collaboration. It was interesting how rapidly people zeroed in on the need for a "bridge role" or "business coach" to mediate/translate between the business world and the program world. Alistair Cockburn pointed out that the common notion of "Customer" mushes together four potentially distinct roles: usage expert, domain expert, product owner ("goal donor"), and executive sponsor ("gold owner").

Alistair shares my interest in how personality affects methodology. He wondered what sort of personality a business coach needs. Here's a tentative and partial answer. Testers often fancy themselves in a bridge role, using knowledge of the business and users to find bugs. So Bret Pettichord's paper, Testers Think Differently, is relevant. It talks about personality differences between testers and programmers. Three of them, it seems to me, fit the bridge role. Here they are, somewhat distorted:

  • A happiness to be a dilettante, to be OK with shallow knowledge instead of deep expertise. This lets you flit quickly between people with relevant information, bringing back something useful.

  • Paying attention to a variety of people, especially those whose opinions, desires, and needs tend to be discounted (such as system administrators and technical support). People tend to get captured by a single interest group, to see the world and the product through only one set of eyes. Testers resist that.

  • What Bret calls "living with conflict", which I interpret here as being comfortable with ambiguity and lack of agreement. While the Bridge needs to keep the project moving forward by feeding the programmers stories/tests/features to implement, she shouldn't rush to resolve ambiguity, to force agreement. That agreement is all too likely to be a sham that will be a constant irritant to the project, whereas deliberately acknowledged uncertainty can sustain a creative tension that drives "Eureka!" moments.

I attended another technical exchange on extending Continuous Integration to the whole enterprise. We mainly looked at difficulties. Jeff McKenna said something that sparked the germ of an idea. He said that some architectures are simply wrong for continuous integration. That made me think of particular architectures, and the processes of integrating them, as being like the systems that Charles Perrow describes as subject to "normal accidents" (in his book of the same title). Perrow and his followers describe characteristics of systems that make them prone to accidents. Can those characteristics, or something like them, be used to describe architectures that can't be continuously integrated? Would knowing about them help us avoid those architectures?

(Here's a short writeup of Perrow's ideas.)

Sadly, merging Analogy Fest into Open Space didn't work. It sort of fizzled away. Only two of the papers are going to be discussed. My apologies to the authors who went to the effort of writing their analogies.

Posted at 06:39 in category /agile
