Peer Reviews in Software: A Practical Guide

Karl E. Wiegers  
October 2001

Table of Contents

Preface.

My Objectives.

Intended Audience.

Reading Suggestions.


1. The Quality Challenge.

Looking Over the Shoulder.

Quality Isn't Quite Free.

Justifying Peer Reviews.

Peer Reviews, Testing, and Quality Tools.

What Can Be Reviewed.

A Personal Commitment to Quality.

2. A Little Help from Your Friends.

Scratch Each Other's Back.

Reviews and Team Culture.

The Influence of Culture.

Reviews and Managers.

Why People Don't Do Reviews.

Overcoming Resistance to Reviews.

Peer Review Sophistication Scale.

Planning for Reviews.

Guiding Principles for Reviews.

3. Peer Review Formality Spectrum.

The Formality Spectrum.

Inspection.

Team Review.

Walkthrough.

Pair Programming.

Peer Deskcheck.

Passaround.

Ad Hoc Review.

Choosing a Review Approach.

4. The Inspection Process.

Inspector Roles.

The Author's Role.

To Read or Not To Read.

Inspection Team Size.

Inspection Process Stages.

Planning.

Overview.

Preparation.

Meeting.

Rework.

Follow-Up.

Causal Analysis.

Variations on the Inspection Theme.

Gilb/Graham Method.

High-Impact Inspection.

Phased Inspections.

5. Planning the Inspection.

When to Hold Inspections.

The Inspection Moderator.

Selecting the Material.

Inspection Entry Criteria.

Assembling the Cast.

Inspector Perspectives.

Managers and Observers.

The Inspection Package.

Inspection Rates.

Scheduling Inspection Events.

6. Examining the Work Product.

The Overview Stage.

The Preparation Stage.

Preparation Approaches.

Defect Checklists.

Other Analysis Techniques.

7. Putting Your Heads Together.

The Moderator's Role.

Launching the Meeting.

Conducting the Meeting.

Reading the Work Product.

Raising Defects and Issues.

Recording Defects and Issues.

Watching for Problems.

Product Appraisal.

Closing the Meeting.

Improving the Inspection Process.

8. Bringing Closure.

The Rework Stage.

The Follow-Up Stage.

The Causal Analysis Stage.

Inspection Exit Criteria.

9. Analyzing Inspection Data.

Why Collect Data?

Some Measurement Caveats.

Basic Data Items and Metrics.

The Inspection Database.

Data Analysis.

Measuring the Impact of Inspections.

Effectiveness.

Efficiency.

Return on Investment.

10. Installing a Peer Review Program.

The Peer Review Process Owner.

Preparing the Organization.

Process Assets.

The Peer Review Coordinator.

Peer Review Training.

Piloting the Review Process.

11. Making Peer Reviews Work for You.

Critical Success Factors.

Review Traps to Avoid.

Troubleshooting Review Problems.

12. Special Review Challenges.

Large Work Products.

Geographical or Time Separation.

Distributed Review Meeting.

Asynchronous Review.

Generated and Nonprocedural Code.

Too Many Participants.

No Qualified Reviewers Available.


Appendix A: Peer Reviews and Process Improvement Models.

Capability Maturity Model for Software.

Goals of the Peer Reviews Key Process Area.

Activities Performed.

Commitment to Perform.

Ability to Perform.

Measurement and Analysis.

Verifying Implementation.

Systems Engineering Capability Maturity Model.

CMMI-SE/SW.

Prepare for Peer Reviews.

Conduct Peer Reviews.

Analyze Peer Review Data.

ISO 9000-3.

Appendix B: Supplemental Materials.

Work Aids.

Other Peer Review Resources.



Back Cover

'I will tell my friends and other folks in quality assurance and process management roles to RUN (don't walk) and buy Peer Reviews in Software. In fact, my organization could use this book RIGHT NOW.' —Brad Appleton

'Karl's writing is nicely motivational, reasonably detailed, and covers the range of issues that are important.' —Mark Paulk

There is nothing wrong with making mistakes; it is part of what makes us human. Catching the errors early, however, before they become difficult to find and expensive to correct, is very important. A peer review program is a vital component of any quality software development effort, yet too few software professionals have had the experience or training necessary to implement peer reviews successfully.

Concise, readable, and pragmatic, Peer Reviews in Software walks you through the peer review process and gives you the specific methods and techniques you need to help ensure a quality software release. Comprehensively covering both formal and informal processes, the book describes various peer review methods and offers advice on their appropriate use under a variety of circumstances.

This book focuses on—but is not limited to—the technique of inspection, the most formal, rigorous, and effective type of peer review. The various stages of inspection—including planning, individual preparation, conducting inspection meetings, and follow-up—are discussed in detail. In addition, Peer Reviews in Software explores the cultural and social nuances involved in critiquing the work of others.

Specific topics include:

  • Overcoming resistance to reviews
  • Inspection teams and roles
  • Inspection process stages
  • Scheduling inspection events
  • Analyzing inspection data
  • Peer review training
  • Critical success factors and pitfalls
  • Relating peer reviews to process improvement models

Karl Wiegers closes with a look at special review challenges, including peer review of large work products and geographically dispersed development teams. He provides many practical resources to help you jump-start your review program, enhance communications on your projects, and ultimately ship high-quality software on schedule.



Karl E. Wiegers, Ph.D., is Principal Consultant with Process Impact, a software process consulting and education company. He previously spent eighteen years at Eastman Kodak Company, where he held positions as a software applications developer, software manager, and software process and quality improvement leader. Karl has been participating in and leading software peer reviews throughout his extensive career.