Exploration Through Example

Example-driven development, Agile testing, context-driven testing, Agile programming, Ruby, and other things of interest to Brian Marick

Sat, 28 Jan 2006

Working your way out of the automated GUI testing tarpit (part 7)

part 1, part 2, part 3, part 4, part 5, part 6

Where do we stand?

  • I've prototyped a strategy for gradual transformation of slow, fragile, hard-to-read tests into fast tests that use no unnecessary words and are therefore less fragile.

  • Every page has tests of its layout. They may be comprehensive or not, depending on the needs of the project.

  • Sometimes, parts of pages are generated dynamically according to "presentation business rules". For example, a rule might govern whether a particular button appears. The page tests can (should) include tests of each of those rules.

  • The page tests are not mocked up. That is, the page renderer is not invoked with an artificially constructed app state. Instead, the app state is constructed through a series of actions on the app.

  • Nevertheless, the tests are declarative in the sense that they do not describe how a user of the app would navigate to the page in question. Instead, the test figures navigation out for itself.

  • As the app grows a presentation layer, the page tests can run at unit test speeds by calling directly into it.

  • There are no tests that check for dead links by following them. Instead, the output-generator and input-handler cooperate so that any attempt to generate a bad link will lead to an immediate failure. Therefore, dead links can be discovered by rendering all parts of all pages. The page tests do that, so they are also link-checking tests.

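The generate/handle cooperation in that last bullet can be sketched as a registry shared by both sides. All names below are hypothetical, not taken from the actual code; the point is only that a link can be generated solely for a registered handler, so a typo'd or stale link name fails at render time rather than waiting for someone to click it.

```ruby
class ActionRegistry
  def initialize
    @actions = {}
  end

  # The input-handling side registers every action it can respond to.
  def handles(name, &block)
    @actions[name] = block
  end

  # The output-generating side may only emit links to registered
  # actions; an attempt to generate a bad link fails immediately.
  def link_to(name)
    raise "no handler registered for #{name.inspect}" unless @actions.key?(name)
    "<a href=\"/#{name}\">#{name}</a>"
  end
end
```

Rendering every part of every page exercises every link_to call, which is why the page tests double as link-checking tests.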
I want to end this series by closing one important gap. We know that links go somewhere, but we don't know that they go to the right place, the place where the user can continue her task.

We could test that each link destination is as expected. But if following links is all about doing tasks, good link tests follow links along a path that demonstrates how a user would do her work. They are workflow tests or use-case tests. They are, in fact, the kind of design tests that Jeff Patton and I thought would be a communication tool between user experience designers and programmers. (At this point, you should wonder about hammers and nails.)

Here's a workflow test that shows a doctor entering a new case into the system.

 def test_normal_new_patient_workflow
    @dr_dawn.logs_in
                (@dr_dawn.should_be_on_the_main_page)
    @dr_dawn.adds_a_case
                (@dr_dawn.should_be_on_the_case_display_page)
    @dr_dawn.adds_a_visit
              (@dr_dawn.should_be_back_on_the_case_display_page)
 end

I've written that with unusual messages, formatted oddly. Why?

Unlike this test, I think my final declarative tests really are unit tests. According to my definition, unit tests are ones written in the language of the implementation rather than the language of the business. My declarative tests are about what appears on a page and when, not about cases and cows and audits. They're unit tests, so I don't mind that they look geeky.

Workflow tests, however, are quintessential business-facing tests: they're all about asserting that the app allows a doctor to perform a key business task. So I'm trying to write them such that, punctuational peculiarities aside, they're sentences someone calling the support desk might speak. I do that not so much because I expect a user to look at them as because I want my perspective while writing them to be outward-focused. That way, I'll stumble across more design omissions. Sending all messages to an object representing a persona (dr_dawn) also helps me look outward from the code.

Similarly, I'm using layout to emphasize what's most important. That's what the user can do and what, having done that, she can now do next. The actual checks that the action has landed her on the right page are less important—parenthetical—so I place them to the side. (Note also the nod to behavior-driven design.)

The methods that move around (like adds_a_case) talk to the browser in the same way that the earlier abstracted procedural tests do, and the parenthetical comments turn into assertions:

   def should_be_on_the_main_page
      assert_page_title_matches(/Cases Available to Dmorin/)
   end

As you can see, I don't check much about the page. I leave that to the declarative page tests.
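A movement method like adds_a_case might look like the sketch below. The Persona class and the browser calls (click_link, fill_in, press, page_title) are my assumptions, standing in for whatever library actually drives the browser; what matters is that both the movement and the thin parenthetical checks hang off the persona object.

```ruby
class Persona
  def initialize(browser)
    @browser = browser
  end

  # Movement methods drive the UI the same way the earlier
  # abstracted procedural tests did.
  def adds_a_case
    @browser.click_link("Add Case")
    @browser.fill_in("Case name", "Betsy")
    @browser.press("Create")
  end

  # Parenthetical checks are deliberately thin; detailed page
  # checking is left to the declarative page tests.
  def should_be_on_the_case_display_page
    raise "wrong page" unless @browser.page_title =~ /Case Display/
  end
end
```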

That's it. I believe I have a strategy for transforming a tarpit of UI tests into (1) a small number of workflow tests that still go through the UI and (2) a larger number of unit tests of everything else.

Thanks for reading this far (supposing anyone has).

What's missing?

The tests I was transforming didn't do any checking of pure business logic, but in real life they probably would. They could be rewritten in the same way, though I'd prefer to have at least some such tests go below the presentation layer.

There are no browser compatibility tests. If the compatibility testing strategy is to run all the UI tests against different browsers, the transformation I advocate might well weaken it.

There are no tests of the Back button. Should they be part of workflow tests? Specialized? I don't know enough about how a well-behaved program deals with Back to speculate just now. (Hat tip to Seaside here (PDF).)

Can you do all this?

The transformation into unit tests depends on there being one place that receives HTTP requests (WEBrick). Since WEBrick is initialized in a single place, it's easy to find all the places that need to be changed to add a test-support feature. The same was true on the outgoing side, since there was a single renderer to make XHTML. So this isn't the holy grail—a test improvement strategy that can work with any old product code. Legacy desktop applications that have GUI code scattered everywhere are still going to be a mess.
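That single choke point is what makes test-support features cheap. Here's a minimal sketch of the idea, with all names hypothetical: requests funnel through one dispatcher, so a test-support feature is a one-place change.

```ruby
class Dispatcher
  def initialize(app)
    @app = app
    @before_hooks = []
  end

  # Because every request passes through dispatch, a test-support
  # feature (request recording, a jump-straight-to-page shortcut,
  # etc.) registers itself here and nowhere else.
  def add_test_support(&hook)
    @before_hooks << hook
  end

  def dispatch(request)
    @before_hooks.each { |hook| hook.call(request) }
    @app.handle(request)
  end
end
```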

See the code for complete details.

Posted at 07:56 in category /testing

About Brian Marick
I consult mainly on Agile software development, with a special focus on how testing fits in.

Contact me here: marick@exampler.com.

 
