Further experiences with scenarios and checklists

J. Miller, M. Roper, M. Wood

Research output: Contribution to journal › Article

74 Citations (Scopus)

Abstract

Software inspection is one of the best methods of verifying software documents. Software inspection is a complex process, with many possible variations, most of which have received little or no evaluation. This paper reports on the evaluation of one component of the inspection process, detection aids, specifically using Scenario or Checklist approaches. The evaluation is by subject-based experimentation, and is currently one of three independent experiments on the same hypothesis. The paper describes the experimental process, the resulting analysis of the experimental data, and attempts to compare the results in this experiment with the other experiments. This replication is broadly supportive of the results from the original experiment, namely, that the Scenario approach is superior to the Checklist approach; and that the meeting component of a software inspection is not an effective defect detection mechanism. This experiment also tentatively proposes additional relationships between general academic performance and individual inspection performance; and between meeting loss and group inspection performance.
Language: English
Pages: 37-64
Number of pages: 27
Journal: Empirical Software Engineering
Volume: 3
Issue number: 1
DOIs: 10.1023/A:1009735805377
Publication status: Published - 1998

Keywords

  • software inspection
  • defect detection aids
  • partial replication
  • experiments

Cite this

@article{f015effbf2744a0696d909bfda6d9953,
title = "Further experiences with scenarios and checklists",
abstract = "Software inspection is one of the best methods of verifying software documents. Software inspection is a complex process, with many possible variations, most of which have received little or no evaluation. This paper reports on the evaluation of one component of the inspection process, detection aids, specifically using Scenario or Checklist approaches. The evaluation is by subject-based experimentation, and is currently one of three independent experiments on the same hypothesis. The paper describes the experimental process, the resulting analysis of the experimental data, and attempts to compare the results in this experiment with the other experiments. This replication is broadly supportive of the results from the original experiment, namely, that the Scenario approach is superior to the Checklist approach; and that the meeting component of a software inspection is not an effective defect detection mechanism. This experiment also tentatively proposes additional relationships between general academic performance and individual inspection performance; and between meeting loss and group inspection performance.",
keywords = "software inspection, defect detection aids, partial replication, experiments",
author = "J. Miller and M. Roper and M. Wood",
year = "1998",
doi = "10.1023/A:1009735805377",
language = "English",
volume = "3",
pages = "37--64",
journal = "Empirical Software Engineering",
issn = "1382-3256",
number = "1",
}

Further experiences with scenarios and checklists. / Miller, J.; Roper, M.; Wood, M.

In: Empirical Software Engineering, Vol. 3, No. 1, 1998, p. 37-64.

Research output: Contribution to journal › Article

TY - JOUR
T1 - Further experiences with scenarios and checklists
AU - Miller, J.
AU - Roper, M.
AU - Wood, M.
PY - 1998
Y1 - 1998
N2 - Software inspection is one of the best methods of verifying software documents. Software inspection is a complex process, with many possible variations, most of which have received little or no evaluation. This paper reports on the evaluation of one component of the inspection process, detection aids, specifically using Scenario or Checklist approaches. The evaluation is by subject-based experimentation, and is currently one of three independent experiments on the same hypothesis. The paper describes the experimental process, the resulting analysis of the experimental data, and attempts to compare the results in this experiment with the other experiments. This replication is broadly supportive of the results from the original experiment, namely, that the Scenario approach is superior to the Checklist approach; and that the meeting component of a software inspection is not an effective defect detection mechanism. This experiment also tentatively proposes additional relationships between general academic performance and individual inspection performance; and between meeting loss and group inspection performance.
AB - Software inspection is one of the best methods of verifying software documents. Software inspection is a complex process, with many possible variations, most of which have received little or no evaluation. This paper reports on the evaluation of one component of the inspection process, detection aids, specifically using Scenario or Checklist approaches. The evaluation is by subject-based experimentation, and is currently one of three independent experiments on the same hypothesis. The paper describes the experimental process, the resulting analysis of the experimental data, and attempts to compare the results in this experiment with the other experiments. This replication is broadly supportive of the results from the original experiment, namely, that the Scenario approach is superior to the Checklist approach; and that the meeting component of a software inspection is not an effective defect detection mechanism. This experiment also tentatively proposes additional relationships between general academic performance and individual inspection performance; and between meeting loss and group inspection performance.
KW - software inspection
KW - defect detection aids
KW - partial replication
KW - experiments
UR - http://portal.acm.org/citation.cfm?id=594421&dl=ACM&coll=GUIDE
UR - http://dx.doi.org/10.1023/A:1009735805377
U2 - 10.1023/A:1009735805377
DO - 10.1023/A:1009735805377
M3 - Article
VL - 3
SP - 37
EP - 64
JO - Empirical Software Engineering
T2 - Empirical Software Engineering
JF - Empirical Software Engineering
SN - 1382-3256
IS - 1
ER -