Structured Walkthroughs and Formal Technical Reviews
Dr. Jody Paul (jody@acm.org)

Table of Contents

  • References
  • Walkthrough Objectives
  • Scheduling Walkthroughs
  • Roles, Conducting Walkthroughs, Recordkeeping, and Results
  • Walkthroughs: Observations and Guidelines
  • Types of Walkthroughs
  • Checklists
  • Structured Walkthroughs (Summary)

References

  • Freedman and Weinberg. Handbook of Walkthroughs, Inspections, and Technical Reviews. Chicago: Scott, Foresman and Co., 1983.
  • Gause and Weinberg. Exploring Requirements. New York: Dorset House, 1989. (Chapter 20)
  • Weinberg. The Psychology of Computer Programming. New York: Van Nostrand, 1971.
  • Yourdon. Managing the Structured Techniques. Englewood Cliffs: Yourdon Press, 1989.
  • Pressman. Software Engineering: A Beginner's Guide. New York: McGraw-Hill, 1988. (Chapters 4, 5 & Appendix B)
  • Pressman. Software Engineering: A Practitioner's Approach. New York: McGraw-Hill, 1992.

Walkthrough Objectives

"The number of errors in production systems decreases by as much as 90% in organizations that use walkthroughs diligently."
  • Major objective: FIND ERRORS
    • Uncover errors in function, logic, or implementation for any representation of the software
  • Look for weaknesses or errors of style
  • Verify that the software meets its requirements
  • Ensure that the software has been represented according to predefined standards
  • Make projects more manageable
    • Achieve software that is developed in a uniform manner

Side Benefits

"For a junior programmer, working for one year as part of a team that practices walkthroughs is equivalent to two years of working alone."
  • Consciousness-raising
    • Senior people get new ideas and insights from junior people
  • Enables observation of different approaches to software analysis, design, and implementation
  • "Peer pressure" improves quality
  • Promotes backup and continuity
    • Reduces the risk of discontinuity and "useless code", since several people become familiar with parts of the software that they might not otherwise have seen

Scheduling Walkthroughs

  • Walkthroughs should be conducted frequently
    • Focuses on a specific and small piece of work
    • Increases the likelihood of uncovering errors
    • Before the author has too great an ego investment
  • Scheduled only when the author is ready
  • About 4 or 5 people
  • Advance preparation (no more than 2 hours) should be required of and performed by each reviewer

Roles

  • Coordinator (Review Leader)
  • Author (Producer)
  • Reviewers
  • Recorder

Conducting Walkthroughs

  • Coordinator chairs the meeting
  • Walkthrough structure
    • Author's overview?
      • Reviewers should be able to understand the product without assistance
      • Author's overview may "brainwash" reviewers into making the same logical errors as did the author
    • Author's detailed walkthrough
      • Based on logical arguments of what the design or code will do at various stages
    • Requested specific test cases
  • Coordinator resolves disagreements when the team cannot reach a consensus

Recordkeeping

  • Review issues list
    • Identify problem areas within the product
    • Action Item checklist for corrections to be made
  • Review summary report
    • What was reviewed?
    • Who reviewed it?
    • What were the findings and conclusions?
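A minimal sketch of how such records might be kept, assuming the team captures them programmatically rather than on paper; the Python representation and field names are illustrative only, not a prescribed format:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class ReviewIssue:
    """One entry on the review issues list (fields are illustrative)."""
    location: str      # where in the product the problem was found
    description: str   # what the reviewers observed, not how to fix it
    severity: str      # e.g. "major" or "minor"
    action_item: str   # correction to be made, tracked on the checklist

@dataclass
class ReviewSummaryReport:
    """Answers: what was reviewed, who reviewed it, what was found."""
    product: str
    review_date: date
    participants: List[str]
    issues: List[ReviewIssue] = field(default_factory=list)
    decision: str = "accept provisionally"   # accept / reject / accept provisionally
```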

Results

  • At the end of the review, all attendees must decide whether to
    • Accept the product without further modification
    • Reject the product due to severe errors
    • Accept the product provisionally
  • All attendees complete a sign-off, indicating...
    • Their participation in the review
    • Concurrence with the review team's findings

Walkthroughs: Observations

  • Keep walkthroughs short (less than 90 minutes)
  • Keep good notes during the walkthroughs
  • Emphasize the following:
    • Error detection, not error correction
    • Everyone is responsible for any bugs remaining after the walkthrough
    • The product, not the person, is being reviewed

Walkthroughs: Guidelines

  • Set an agenda and keep to it
  • Limit debate and rebuttal
  • Identify problem areas, but don't attempt to solve every problem
  • Limit the number of participants
  • Insist upon advance preparation
  • Train reviewers
  • Develop a checklist for each reviewable product

Types of Walkthroughs

  • Specification walkthroughs
    • System specification
    • Project planning
    • Requirements analysis
  • Design walkthroughs
    • Preliminary design
    • Design
  • Code walkthroughs
  • Test walkthroughs
    • Test plan
    • Test procedure
  • Maintenance reviews

Specification Walkthroughs

  • Objective - Check the system specification for:
    • Problems
    • Inaccuracies
    • Ambiguities
    • Omissions
  • Participants
    • User
    • Senior analyst
    • Project analysts
  • Objects
    • DFDs, Data Dictionary, ERDs, ...

Design Walkthroughs

  • Objective - Check the architecture of the design for:
    • Flaws
    • Weaknesses
    • Errors
    • Omissions
  • Participants
    • User
    • Analyst
    • Senior designer
    • Project designers
  • Objects
    • Structure charts, detailed design documents, ...

Code Walkthroughs

  • Objective - Check the code for:
    • Errors
    • Standards violations
    • Lack of clarity
    • Inconsistency
  • Participants
    • Author
    • Project programmers
    • Designer
    • Outside programmers
  • Objects
    • Code listing, compiler listings, ...

Test Walkthroughs

  • Objective - Check the testing documents for:
    • Inadequacy
    • Incompleteness
    • Lack of clarity
  • Participants
    • Project programmers
    • Tester
    • Analyst
    • Designer
  • Objects
    • Test plan, test procedures, sample test data, ...

Checklists

Checklist: System Specification

  • Are major functions defined in a bounded and unambiguous fashion?
  • Are interfaces between system elements defined?
  • Have performance bounds been established for the system as a whole and for each element?
  • Are design constraints established for each element?
  • Has the best alternative been selected?
  • Is the solution technologically feasible?
  • Has a mechanism for system validation and verification been established?
  • Is there consistency among all system elements?

Checklist: Requirements Analysis

  • Is information domain analysis complete, consistent, and accurate?
  • Is problem partitioning complete?
  • Are external and internal interfaces properly defined?
  • Does the data model properly reflect data objects, their attributes, and relationships?
  • Are all requirements traceable to system level?
  • Has prototyping been conducted for the user/customer?
  • Is performance achievable within the constraints imposed by other system elements?
  • Are requirements consistent with schedule, resources, and budget?
  • Are validation criteria complete?

Checklist: Preliminary Design

  • Are software requirements reflected in the software architecture?
  • Is effective modularity achieved? Are modules functionally independent?
  • Is the program architecture factored?
  • Are interfaces defined for modules and external system elements?
  • Is the data structure consistent with the information domain?
  • Is the data structure consistent with software requirements?
  • Has maintainability been considered?
  • Have quality factors been explicitly assessed?
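As a concrete illustration of the modularity and interface questions above, here is a minimal Python sketch; the function names and data shapes are invented for the example:

```python
# Two small modules with narrow, explicitly defined interfaces.
# Each depends only on its parameters (functional independence),
# so each can be reviewed, tested, and replaced on its own.

def parse_order(line: str) -> dict:
    """Interface: raw text in; {'item': str, 'qty': int} out; no shared state."""
    item, qty = line.split(",")
    return {"item": item.strip(), "qty": int(qty)}

def price_order(order: dict, unit_price: float) -> float:
    """Interface: a parsed order plus a unit price; returns the extended price."""
    return order["qty"] * unit_price

# The modules compose only through their declared interfaces.
print(price_order(parse_order("widget, 3"), 2.50))   # -> 7.5
```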

Checklist: Design

  • Does the algorithm accomplish the desired function?
  • Is the algorithm logically correct?
  • Is the interface consistent with the architectural design?
  • Is the logical complexity reasonable?
  • Have error handling and "antibugging" been specified?
  • Are local data structures properly defined?
  • Are structured programming constructs used throughout?
  • Is the design detail amenable to the implementation language?
  • Which operating system or language-dependent features are used?
  • Is compound or inverse logic used?
  • Has maintainability been considered?
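A small Python sketch of the error-handling and "antibugging" idea in the checklist above: defensive checks written into the design so that bad inputs are caught explicitly rather than allowed to propagate. The function and messages are illustrative only:

```python
def average(values):
    """Arithmetic mean with the defensive ("antibugging") checks the design calls for."""
    if not values:                            # guard the empty-input case explicitly
        raise ValueError("average() requires at least one value")
    if any(not isinstance(v, (int, float)) for v in values):
        raise TypeError("average() accepts numeric values only")
    return sum(values) / len(values)          # single, structured exit point

print(average([2, 4, 9]))   # -> 5.0
```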

Checklist: Code

  • Has the design properly been translated into code?
  • Are there misspellings and typos?
  • Does the code adhere to proper use of language conventions?
  • Is there compliance with coding standards for language style, comments, prologues, ...?
  • Are there incorrect or ambiguous comments?
  • Are data types and data declarations proper?
  • Are physical constants correct?
  • Have all the items on the design walkthrough checklist been reapplied as required?
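A contrived Python fragment showing the kinds of defects these questions are meant to surface; the constant, comment, and names are invented for illustration:

```python
# As presented at the walkthrough -- reviewers would flag two problems:
#   1. the physical constant is wrong (a day has 86400 seconds, not 86000);
#   2. the comment promises rounding that the code does not perform.
SECONDS_PER_DAY = 86000

def to_days(seconds):
    """Convert seconds to days, rounded to the nearest day."""  # comment/code mismatch
    return seconds // SECONDS_PER_DAY                           # truncates, never rounds
```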

Checklist: Test Plan

  • Have major test phases been properly identified and sequenced?
  • Has traceability to validation criteria and requirements been established?
  • Are major functions demonstrated early? (top-down)
  • Is the test plan consistent with the overall project plan?
  • Has a test schedule been explicitly defined?
  • Are test resources and tools identified and available?
  • Has a test record-keeping mechanism been established?
  • Have test stubs been identified and has work to develop them been scheduled?
  • Has stress testing for the software been specified?
  • Has a regression testing mechanism been established?
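To illustrate the test-stub item in the checklist above, a minimal Python stub that lets top-down testing proceed before the real module exists; the module name and return value are hypothetical:

```python
def fetch_exchange_rate(currency: str) -> float:
    """Test stub: stands in for the not-yet-written rate-lookup module.

    It returns a fixed, documented value so that callers can be exercised now;
    the real implementation replaces the stub when it becomes available.
    """
    return 1.0   # deliberately simple; noted as a stub in the test plan

# Top-down test of a caller that depends on the stubbed module.
def price_in_currency(amount: float, currency: str) -> float:
    return amount * fetch_exchange_rate(currency)

assert price_in_currency(25.0, "EUR") == 25.0
```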

Checklist: Test Procedure

  • Have both white-box and black-box tests been specified?
  • Have all independent logic paths been tested?
  • Have test cases been identified and listed with their expected results?
  • Is error handling being tested?
  • Are boundary values being tested?
  • Are timing and performance being tested?
  • Has an acceptable variation from the expected results been specified?
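A short black-box example in Python of test cases listed with expected results, a boundary value, error handling, and an explicit acceptable variation; statistics.mean merely stands in for whatever function is actually under review:

```python
import unittest
from statistics import mean   # placeholder for the function under review

class ProcedureExamples(unittest.TestCase):
    def test_boundary_value(self):
        self.assertEqual(mean([7]), 7)                 # smallest legal input size

    def test_error_handling(self):
        with self.assertRaises(Exception):             # empty input must be rejected
            mean([])

    def test_expected_result_with_tolerance(self):
        # Expected result stated up front, with an acceptable variation of 1e-9.
        self.assertAlmostEqual(mean([0.1, 0.2]), 0.15, delta=1e-9)

if __name__ == "__main__":
    unittest.main()
```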

Checklist: Maintenance

  • Have side effects associated with the change been considered?
  • Has the request for change been documented, evaluated, and approved?
  • Has the change, once made, been documented and reported to all interested parties?
  • Have appropriate walkthroughs been conducted?
  • Has a final acceptance review been conducted to ensure that all software has been properly updated, tested, and replaced?

Structured Walkthroughs

  • Objective: FIND ERRORS
  • Focus: the product, not the author
  • Improves software quality
  • Reduces risks of discontinuity
  • Provides training for junior personnel
  • Time-effective and cost-effective





©1995-1998,2006 Dr. Jody Paul