Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Wed, 05 Feb 2003

Ideas for a course on programmer testing

I've been contacted by a big company that would like a lot of programmers taught programmer testing. My job would be to design the course and hand it off to an internal trainer. I think they had in mind conventional programmer testing (write a design and a test plan, write the code, then write the tests). I'm talking to them about test-first programming. Here are my talking points for a phone conference tomorrow.

There are two main issues:

  1. How should unit testing be done in a way that makes it "sticky"? Goal: unit testing is still being done, with thoroughness, in two years. Experience tells us that's not an easy goal.

  2. How can it be taught to a large group of programmers both efficiently and effectively? A large group argues for classroom teaching, but my experience has been that small groups and 1-on-1 teaching are far more effective. How should the two be balanced?

I write about the first by way of background, then get concrete when talking about the second.

== Sticky unit testing

There seem to be three main (non-exclusive) ways to make unit testing stick:

  1. Programmers must see it as helping them. Not only must it help them deliver fewer bugs, but they must think it speeds and smooths their programming. That means:

    • tests are written first, before the code. (I mean the actual test cases, not just test plans.) Tests written after the fact may mean fewer bugs, but programmers will think the tests slow them down. Under schedule pressure, the tests will fall by the wayside.

    • all tests must be automated, to give programmers a safety net that lets them make changes with confidence. Programmers will need to run these tests easily and often. (For example, I run all my tests multiple times an hour by hitting F10.) Ease and speed have implications for test tooling and program design.

  2. Testing must be tightly integrated with the act of programming. What works well is a rapid test-code cycle, where a programmer implements one or a few tests, writes the code to make those tests pass, then proceeds to the next test. (A sketch of one such cycle follows this list.)

  3. Peer pressure. Even with the above, testing requires discipline. It's easy to yield to temptation and skimp on testing. For example, under schedule pressure, programmers might do the minimum the rules require, rather than what the problem demands. When a test suite is weakened in this way, it becomes less useful. That means it's easier to yield to temptation next time - and you're in a downward spiral. To avoid that, yieldings need to be exposed to the light. XP's pair programming is one good example of that - if everyone programs in pairs, each person keeps the other on the straight and narrow.
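To make that test-code cycle concrete, here's a minimal sketch of one micro-iteration in Ruby. (The Account class, its methods, and the choice of minitest are mine, invented for illustration; the idea is the same in any language with an xUnit-style framework.)

    # One test-first micro-iteration: the tests are written first and
    # watched to fail; then comes just enough code to make them pass.
    require "minitest/autorun"

    # Step 2: just enough production code to satisfy the tests below.
    class Account
      attr_reader :balance

      def initialize
        @balance = 0
      end

      def deposit(amount)
        @balance += amount
      end
    end

    # Step 1: written before Account existed.
    class AccountTest < Minitest::Test
      def test_new_account_starts_empty
        assert_equal 0, Account.new.balance
      end

      def test_deposit_increases_balance
        account = Account.new
        account.deposit(30)
        assert_equal 30, account.balance
      end
    end

Running the whole file after every small change is what makes the run-the-tests-by-hitting-F10 habit cheap: the suite is fast, automated, and self-checking.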

Implications for teaching:

  • training should emphasize micro-iterative, test-first programming with fully automated tests.

  • training should be actively designed to encourage peer pressure.

== An efficient and effective course

My experience is that unit-testing training should emphasize practice over concepts. That is, learning to test is more like learning to ride a bike than learning an algorithm for finding square roots. You have to do it, not just hear about it.

Within the past year, I've shifted my programmer testing training toward practice. I give small lectures, but spend most of my time sitting with groups of two or three programmers, taking up to half a day to add code test-first to their application. I'd say about half the people "get it". That means they want to try it because they believe it would help, and they believe they could ride the bike - perhaps they'd be a little wobbly, but they're ready to proceed.

A 50% success rate seems low, but my goal is to have a core of opinion leaders doing test-first. As they become successful, the practice will spread. I am no doubt prejudiced, but the other 50% are generally not the kind of people who spread new practices anyway.

But at =Anonycorp= we want to train a lot of people, and we want to train them fast - no time for understanding to radiate out from a core group of opinion leaders.

Here's how I envision a class. It would be three days long. It would be no larger than 15 people. (10 would be better.) There would be two or three "alumni" returning from previous classes to help out in exercises. (The alumni would be different for each class.)

There would be ordinary lectures on two topics:

  • unit testing in general
  • those parts of =Anonycorp='s testing strategy that apply to programmers.

These would be followed by a test-first demo, where the instructor adds a feature test-first to some code.

But the bulk of the class would be exercises in which groups of 2-4 people do realistic work. That work would be of two kinds:

  1. adding a new feature to some code.

  2. cleaning up code ("refactoring") while simultaneously creating a test suite that makes adding features later both safer and faster. (A sketch of this appears just below.)

In both cases, the code would come from one of your products.
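For the second kind of exercise, the natural first step is to pin down what the code does now, before changing its structure. Here's a minimal sketch in the same style. (The legacy_price method and its magic numbers are invented; in the class, the starting point would be real =Anonycorp= code.)

    require "minitest/autorun"

    # Hypothetical legacy code: it works, but it's unclear. Before
    # cleaning it up, we write tests that record what it does today.
    def legacy_price(quantity)
      quantity > 10 ? quantity * 9 : quantity * 10
    end

    class LegacyPriceTest < Minitest::Test
      # These tests assert current behavior, not desired behavior.
      def test_small_orders_pay_full_price
        assert_equal 100, legacy_price(10)
      end

      def test_large_orders_get_the_bulk_rate
        assert_equal 99, legacy_price(11)
      end
    end

With tests like these in place, renaming, extracting methods, or replacing the conditional is safe: any behavior change fails the suite immediately, and the tests stay behind as the safety net that makes the next feature cheaper to add.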

Both types of exercises would be lengthy (hours) and repeated once. After each exercise, there would be a class discussion of what the groups did. Groups would be mixed up between exercises.

The class would end with a discussion of what people expect next. What hurdles will they have to overcome? How do they plan to work on them?

We'll be explicit that our goal is not just teaching techniques, but team building. When they have questions about testing after the class, they should feel free to talk to other students. (It's probably best if most of a class is from a single team or product area, so talking will come naturally.)

After about a month, there'll be a two-hour "class reunion", facilitated by the instructor, where people can take stock (and the instructor can learn how to improve the course).

Moreover, during the class the instructor will have kept an eye out for natural enthusiasts and opinion leaders. After the class, those people will be cultivated. The instructor will wander by and see how things are going - and not forget to ask them if they'll help out as alumni in later classes.

== Notes

  1. This is still weak on peer pressure. We should discuss what would work well within =Anonycorp=.

  2. Four multiple-hour exercises + discussion is a lot for a three-day class.

  3. ...

  4. All this post-class interaction doesn't mesh well with remote sites.

## Posted at 11:05 in category /agile
