
Software Inspection

by Tom Gilb and Dorothy Graham
(Addison-Wesley, 1993, ISBN 0-201-63181-4)
reviewed by Brian Marick on October 31, 1996

From the preface:
"The word 'inspect' is an ordinary English verb meaning 'to look at or examine.' Software Inspection (with a capital 'I'), as described in this book, is an extra-ordinary technique which has been proved successful again and again - in so far as it is properly applied...

"There was (until now) no definitive book which described the Inspection process clearly and in its most advanced, complete and productive form.

"The authors have extensive experience in many and varied software engineering quality improvement techniques, and in particular in Inspections, and a particular feature of this book is the numerous small tricks, insights and practical observations gathered since we began to spread this method to our international clients in 1975."


Dave Gelperin said something at a conference that struck me as utterly profound. It went roughly like this: "People don't like Inspections. They never have. The fact that Inspections have survived in the face of universal dislike for over two decades must be proof of their value." Something so disliked, and yet so valued, is worth knowing about.

Inspections (with the capital 'I') are at one end of a continuum of formality and public accountability. At the other end is you, alone, double-checking your work. Slightly more formal is "buddy checking", where you and a friend check your work together. More formal is a group of people who meet to review work. Such reviews often produce formal reports in which the group takes collective responsibility for the work. Finally, there are Inspections, which have several variants. All share these characteristics:

  1. There is a group of "checkers" who check the work individually at their desks. Often, the checkers will have specific roles (check hardware interfaces, check for standards compliance, etc.). The "work", by the way, may be anything: code, user documentation, requirements, etc.

  2. The checkers gather, together with a moderator and a recorder, in a logging meeting. The checkers report potential problems ("issues"), which the recorder records on the board. The moderator keeps the meeting on track.

  3. The logging meeting is only to report issues. The moderator squelches any attempt to resolve them.

  4. During the logging meeting, checkers are expected to discover and report new issues. The synergy of the meeting encourages them; without it, they might as well email their issues to the recorder.

  5. The original author (or someone else) resolves the issues after the inspection. There is some degree of double-checking of that work, ranging from buddy checking by the inspection leader to a full-blown re-inspection.

Gilb and Graham augment this common practice with a strong and valuable emphasis on feeding inspection data into process improvement.
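
To fix the shape of the process in mind, here is a toy issue log in C. Every name in it is my own invention, not notation from the book; the point is only that the logging meeting records issues and defers everything else to the editor:

    /* A toy model of the issue log kept during the logging meeting.
       All names and fields here are my own, not the book's. */
    #include <stdio.h>

    #define MAX_ISSUES 200

    struct issue {
        const char *reporter;  /* which checker raised it */
        const char *text;      /* the potential problem, as reported */
        int addressed;         /* stays 0 until after the meeting */
    };

    static struct issue issue_log[MAX_ISSUES];
    static int logged = 0;

    /* The recorder's whole job during the meeting: write it down.
       Resolving an issue is deliberately not representable here. */
    static void log_issue(const char *reporter, const char *text)
    {
        if (logged < MAX_ISSUES) {
            issue_log[logged].reporter = reporter;
            issue_log[logged].text = text;
            issue_log[logged].addressed = 0;
            logged++;
        }
    }

    int main(void)
    {
        log_issue("interface checker", "header disagrees with design doc");
        log_issue("standards checker", "naming violates project rule 7");
        printf("%d issues logged for the editor to address.\n", logged);
        return 0;
    }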


This book has a rare quality. It is complete and consistent. It is "whole" - not sketches of a solution, not leaving the "obvious" steps to be supplied by the reader, not very dependent on the reader's judgement at all. In fact, withholding judgement may be the best way to read it. More than once, I found myself saying, "That can't possibly work", only to realize later that it does in fact work - partly because of something explained later, partly because I had to make sure I thought about software development in their terms. (The book is somewhat leadenly written, which makes key assumptions easier to miss.)

You will like this book (and stand a better chance of adopting their Inspection process) if you have these characteristics:

You must be obsessed with the efficiency of micro-tasks.

Take these quotes from page 192:

This emphasis is reminiscent of nothing so much as the time and motion studies of Taylorism. ("If we position the bricklayer at this height relative to the new row of bricks, and the brick supply and mortar at this height, and provide one runner per twelve bricklayers, we will optimize bricklaying efficiency.") They differ from Taylorism in that the optimizing procedures are only initially given by the experts from on high; thereafter, they're tailored through metrics kept by, and improvement suggestions made by, the workers themselves.

You must be comfortable with a generative, step-by-step software production process.

Their description of software development is reminiscent of transformational grammars in linguistics. You know the drill: you start by deciding what type of sentence you want, perhaps a noun phrase followed by a verb phrase. You can then decide what type of noun phrase you want, perhaps a noun followed by a prepositional phrase. You can then... (Apologies to linguists out there if I've botched the details.) According to Gilb and Graham, software development is similarly rule-driven:

  1. At every stage, you begin with a source document, then apply accepted procedures to create a derived document, making errors along the way.
  2. You check the derived document and source document to see if generation followed the rules. (You may also find issues with the source document, cases where it does not match the rules for its creation. Its inspection missed some issues, which is to be expected.)
  3. When the derived document is corrected, it is now suitable for use as a source document.
  4. Continue until you've completed all documents. Deliver the product.
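
Read as pseudocode, the four steps form a nested loop. Here is a toy rendering in C; the stage names and the pretended two rounds of editing are my own stand-ins, not anything from the book:

    /* A toy rendering of the generate/check/edit loop above. */
    #include <stdio.h>

    enum stage { REQUIREMENTS, DESIGN, CODE, DONE };

    /* Step 1: apply the accepted procedures to a source document,
       producing a derived document (and, inevitably, errors). */
    static enum stage derive(enum stage source)
    {
        return source + 1;
    }

    int main(void)
    {
        enum stage source = REQUIREMENTS;
        while (source != DONE) {
            enum stage derived = derive(source);
            /* Steps 2 and 3: check derived against source under the
               rules; the editor addresses every issue logged. Here we
               pretend each document needs two rounds of editing. */
            for (int round = 1; round <= 2; round++)
                printf("stage %d, editing round %d\n", derived, round);
            source = derived;  /* corrected; now usable as a source */
        }
        printf("All documents complete: deliver the product.\n");  /* step 4 */
        return 0;
    }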

You must believe that complete and sufficient rules and procedures can be captured and written down, and that maintaining consistency between documents is appropriate.

You must work in a stable organization with widespread respect, a decent minimum level of competence, and good communication.

If you do not work in such an organization, complete and sufficient rules can't be captured. Why do I say this? Page 426 gives rules for code, one of which is:

How could such rules be useful? Surely they are far too vague? They won't lead to pitched battles about relative complexities of commentary and code during the logging meeting (because the moderator won't allow it), but surely the battles will start just outside the door?

Not if you have widespread respect. An essential feature of their Inspection process is that the person editing the document after the Inspection must address all issues. So if anyone thinks a comment is too terse, it has to be dealt with. Perhaps it will be rewritten, perhaps a glossary will be added, perhaps a request will be made to change upstream design documentation, whatever. The key notion is that the checker is always right. As they say on p. 224:

"The one thing you [the editor] are not allowed to do is to ignore any logged issue. A logged issue is not necessarily an 'error' on your part. It is, however, proof that the real organization out there did have, and thus will probably continue to have, some sort of problem in the future unless you act to prevent it now."

This requires respect and minimum competence (you can't think the checker is a bozo - and the checker must not in fact be a bozo). It requires communication, because you have to seek out the checker afterwards to discuss the issue. If you have these things, there will emerge a consensus about what terms like "complete" and "relevant" mean. If the organization is stable (low turnover), this consensus can be preserved.
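
The rule is easy to render in code. In this sketch the disposition names are mine, not the book's; what matters is the value that is missing:

    /* Dispositions available to the editor for a logged issue.
       The names are my own illustration. Note what is deliberately
       absent: there is no IGNORED value. */
    enum disposition {
        REWRITTEN,                  /* e.g., a too-terse comment expanded */
        GLOSSARY_ADDED,             /* a confusing term defined */
        UPSTREAM_CHANGE_REQUESTED,  /* the design document was at fault */
        DISCUSSED_WITH_CHECKER      /* resolved by seeking the checker out */
    };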

If your organization doesn't have these properties, the process is likely to devolve until only nit-picking, bookkeeping, undebatable issues are raised, the sort shown on page 55:

(Either the authors picked a poor example - all these issues are found by mechanical consistency checks - or I've completely missed the point of the book.) Further, the rules themselves are likely to become exclusively bureaucratic, like this subset from page 425:


For all its completeness, I have two gripes with this book:

  1. There's a famous cartoon by Sidney Harris (I think). Two people in lab coats are standing at a blackboard covered with equations. One points to the phrase "then a miracle happens..." between the next-to-last and last equations and says, "I think you need to be more explicit in this step."

    The step that needs work in the Inspection process is finding the problems. The complaint I so often hear about inspection courses and books goes like this: "I understand what to do in meetings. I understand what everyone's role is. I understand how I'm to report issues. But when I'm sitting alone, checking a document, I don't know what to do."

    In code inspections, providing a rule (as Gilb and Graham do) that says "The document (code) must meet all relevant project requirements" is pretty much equivalent to saying "Then a miracle happens". How do you find the deviations?

    Gilb and Graham allow for checklists that expand on the rules. However, they have a fierce aversion to checklists more than a page long (as a reaction to creeping bureaucracy). I believe everyone uses checklists longer than a page when checking, but those checklists are kept in their heads: "is there an off-by-one error in this relational expression?", "this file is opened here - is it closed in all cases hereafter, including error cases?", "what happens if the file being opened doesn't exist?" and so on. Good checklists are key to good detection of issues, and their creation deserves more attention.
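
    To make those in-the-head checklist items concrete, here is a deliberately buggy C fragment of my own construction, one defect per checklist question above:

        #include <stdio.h>

        int sum_first_n(const int *a, int n)
        {
            int sum = 0;
            for (int i = 0; i <= n; i++)  /* off-by-one in the relational
                                             expression: should be i < n */
                sum += a[i];
            return sum;
        }

        int count_lines(const char *path)
        {
            FILE *f = fopen(path, "r");   /* what if the file doesn't
                                             exist? fopen can return NULL */
            int c, lines = 0;
            while ((c = fgetc(f)) != EOF)
                if (c == '\n')
                    lines++;
            if (lines > 1000)
                return -1;                /* the file opened above is not
                                             closed on this error path */
            fclose(f);
            return lines;
        }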
  2. The discussion of metrics and statistics is shallow, as is typical of software engineering books. The ghost of Tukey, author of Exploratory Data Analysis and other wise books about avoiding misuses of statistics, must cringe in its grave every time a software person starts to plug numbers into a spreadsheet. I'm sure the authors know how to use statistics wisely, but they're in the position of recommending a blowtorch to unfreeze pipes: many people will do just fine, but some will burn their house down.

This book will help you with the following tasks


Inspecting or Reviewing
A basic text, best for people with a rather well-controlled, step-by-step software development process.

This book's table of contents

  1. The Historical Background of Inspection and Comparison with Other Methods
  2. The Benefits and Costs of Inspection
  3. Overview of Software Inspection
  4. The Inspection Process (part 1) - Initiation and Documents
  5. The Inspection Process (part 2) - Checking
  6. The Inspection Process (part 3) - Completion
  7. The Inspection Process (part 4) - Process Improvement
  8. The Inspection Leader
  9. The Inspection Experience from Specialist Viewpoints
  10. Installation and Training
  11. Overcoming the Difficulties
  12. Software Inspections at Applicon
  13. One Person Getting Started (Cray Research)
  14. Implementing Document Inspection on an Unusually Wide Basis at an Electronics Manufacturer (Thorn EMI)
  15. Inspection at Racal Redac
  16. Inspection in the Sema Group (UK)
  17. Practical Aspects of the Defect Prevention Process (IBM)

