Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Wed, 15 Mar 2006

Two formative experiences

Why is it that I so stubbornly believe that code can get more and more malleable over time? — Two early experiences, I think, that left a deep imprint on my soul.

I've earlier told the story of Gould Common Lisp. The short version is that, over a period of one or two years, I wrote thousands of lines of debugging support code for the virtual machine. Most of it was to help me with an immediate task. For example, because we were not very skilled, a part of implementing each bytecode was to snapshot all of virtual memory,* run the bytecode in a unit test, snapshot all of virtual memory again, diff the snapshots, and check that the bytecode changed only what it should have.
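If you've never seen that style of check, here's a minimal sketch of the idea in Ruby rather than the original C and Lisp. The toy VM, the `run_push_constant` bytecode, and the memory layout are all invented for illustration; the real thing snapshotted actual virtual memory and was much hairier.

    require 'minitest/autorun'

    # A toy stand-in for the Lisp VM: "memory" is just an array of words.
    class ToyVM
      attr_reader :memory

      def initialize(size = 16)
        @memory = Array.new(size, 0)
        @stack_pointer = 0
      end

      # Invented bytecode: push a constant into the next stack slot.
      def run_push_constant(value)
        @memory[@stack_pointer] = value
        @stack_pointer += 1
      end
    end

    class PushConstantTest < Minitest::Test
      def test_push_changes_only_its_own_stack_slot
        vm = ToyVM.new
        before = vm.memory.dup           # snapshot "all of memory"

        vm.run_push_constant(42)         # run the bytecode under test

        after = vm.memory                # snapshot again
        changed = (0...before.size).select { |addr| before[addr] != after[addr] }

        # Diff the snapshots: only the one expected cell should have moved.
        assert_equal [0], changed,
          "bytecode touched memory it shouldn't have: #{changed.inspect}"
      end
    end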

The program ended up immensely chatty (to those who knew how to loosen its tongue). There are two questions any parent with more than one child has asked any number of times: "all right, who did it?" and "what on earth was going through your mind to make you think that was a good idea?" The Lisp VM was much better at answering those questions than any human child.

I was only three or so years out of college, still young and impressionable. Because that program had been a pleasure to work with, I came to think that of course that was the way things ought to be.

Later, I worked on a version of the Unix kernel, one that Dave Fields called the Winchester Mystery Kernel, so I became well aware that not all programs were that way. But at the time, I was also a maintainer of GNU Emacs for the Gould PowerNode. With each major release of Emacs, I made whatever changes were required to make it run on the PowerNode, then I fed those changes back to Stallman. Part of what I worked on was unexec.c, which is the code that dumped a version of Emacs into an executable file after a bunch of Lisp libraries had been loaded. As you might imagine, the code was highly machine-dependent. Since Emacs ran on a gazillion different machines, you'd expect that file to turn into a maze of twisty little #ifdefs, all different. But an amazing thing happened: over time, it got cleaner and cleaner and cleaner. Instead of surrendering to complexity, Stallman and company used patches for new machines to help them find the right abstractions to keep things tractable.

That experience also made an impression on me, and probably accounts for a tic of mine, which is to hope that each change will give me an excuse to learn something new about the way the program ought to be.

I've been lucky in the experiences I've had. A big part of my luck was being left alone. No one really cared that much about the Lisp project, so no one really noticed that I was writing a lot of code that satisfied no user requirement. GNU Emacs was something I did on my own time, not as part of my Professional Responsibilities, so no one really noticed that Stallman pushed harder for good code than those people who were paid to push hard for good code.

I'm not sure whether people on the Agile projects of today have it better or worse. On the one hand, ideas like the above are no longer so unusual, so it's easier to find yourself in situations where you're allowed to indulge them. On the other hand, people's actions are much more visible, and they tend to be much more dedicated to meeting deadlines—deadlines that are always looming. I'm wondering these days whether I'm disenchanted with one-week iterations. I believe that a really experienced team can envision a better structure and move toward it in small, safe steps that add not much time to most every story. I'm not good enough to do that. I need time floundering around. To get things right, I need to be unafraid of taking markedly more time on a story than it would take to get it done with well-tested code that's not all that far from what I wish it were (a shortcut that makes the effort to get there one story bigger, and so one story less likely ever to be spent). It's tough to be unafraid when you're never more than four days from a deadline.

So I think I see teams that are self-inhibiting. When I work with programmers (more so than with testers), I find it difficult to calibrate how much to push. My usual instinct is to come on all enthusiastic and say, "Hey, why don't we merge these six classes into one, or maybe two, because they're so closely related, then see what forces—if any—push new classes out?" But then I realize (a) I'm a pretty rusty programmer, (b) I know their system barely at all, (c) they'll have to clean up any mess we make, not me, and (d) there's an iteration deadline a few days away and a release deadline not too far beyond that. So I don't want to push too hard. But if I don't, someone's paying me an awful lot of money to join the team for a week as a rusty programmer who knows their system barely at all.

It ought to be easier to focus just on testing, but the same thing crops up. There, the usual pattern goes like this: I like to write business-facing tests that use fairly abstract language (nowadays usually implemented in the same language as the system under test). My usual motto is that I want to see few words in the test that aren't directly relevant to its purpose. Quite often, that makes the test a mismatch for the current structure of the system. It's a lot of work to write the utility routines (fixtures) that map from business-speak to implementation-speak. Now, it's an article of faith with me that one or both of two things will probably happen. Either we'll discover that the fixture code is usefully pushed down into the application, or a rejiggering of the application to make the fixtures more straightforward will make for a better design. But... (a) if I'm wrong, someone else will have to clean up the mess (or, worse, decide to keep those tests around even though they turned out to be a bad idea), (b) this is going to be a lot of work for a feature that could be done more easily, and (c) those deadlines are looming.
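For concreteness, here's a sketch of the shape I'm describing: a business-facing test written in domain words, with a thin fixture module doing the translation into implementation-speak. Every name in it (the `BillingSpeak` module, `subscriber_on`, `late_fee_for`, the grace-period rule) is invented for the example, not taken from any project I've worked on; in real life the fixture layer would call into the system under test instead of wrapping a stub.

    require 'minitest/autorun'

    # Fixture layer: maps business-speak to implementation-speak.
    # Here it wraps a stub so the example runs on its own.
    module BillingSpeak
      Subscriber = Struct.new(:plan, :days_overdue)

      def subscriber_on(plan, days_overdue:)
        Subscriber.new(plan, days_overdue)
      end

      def late_fee_for(subscriber)
        # Stand-in business rule: $1.50 per overdue day, waived for the first 5 days.
        overdue = [subscriber.days_overdue - 5, 0].max
        (overdue * 1.50).round(2)
      end
    end

    # The business-facing test: almost every word is about the rule being
    # checked, not about how the system happens to be structured.
    class LateFeeTest < Minitest::Test
      include BillingSpeak

      def test_grace_period_means_no_fee
        assert_equal 0.0, late_fee_for(subscriber_on("basic", days_overdue: 3))
      end

      def test_fee_accrues_after_the_grace_period
        assert_equal 7.50, late_fee_for(subscriber_on("basic", days_overdue: 10))
      end
    end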

I manage to muddle through, buoyed—as I think many Agile consultants are—by memories of those times when things just clicked.

* The PowerNode only had 32M of virtual (not physical) memory, so snapshotting it was not so big a deal.

## Posted at 18:40 in category /coding

At the end of every story

A question to ask every time you finish a story: "What's now easier to do?" Be ever so slightly disappointed unless each story contributes, in some small way, to making the system more malleable.

## Posted at 04:53 in category /agile
