
A Replicated Experiment to Assess Requirements Inspection Techniques

Published in: Empirical Software Engineering

Abstract

This paper presents the independent replication of a controlled experiment which compared three defect detection techniques (Ad Hoc, Checklist, and Defect-based Scenario) for software requirements inspections, and evaluated the benefits of collection meetings after individual reviews. The results of our replication were partially different from those of the original experiment. Unlike the original experiment, we did not find any empirical evidence of better performance when using scenarios. To explain these negative findings we provide a list of hypotheses. On the other hand, the replication confirmed one result of the original experiment: the defect detection rate is not improved by collection meetings.

The independent replication was made possible by an experimental kit provided by the original investigators. We discuss the difficulties we encountered in applying the package in our environment, which arose from differences in culture and skills. Using our results, experience, and suggestions, other researchers will be able to improve the original experimental design before attempting further replications.




Cite this article

Fusaro, P., Lanubile, F. & Visaggio, G. A Replicated Experiment to Assess Requirements Inspection Techniques. Empirical Software Engineering 2, 39–57 (1997). https://doi.org/10.1023/A:1009742216007
