
Requirements Coverage as an Adequacy Measure for Conformance Testing

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5256)

Abstract

Conformance testing in model-based development refers to the testing activity that verifies whether the code generated (manually or automatically) from the model is behaviorally equivalent to the model. Presently, the adequacy of conformance testing is inferred by measuring the structural coverage achieved over the model. We hypothesize that adequacy metrics for conformance testing should consider structural coverage over the requirements, either in place of or in addition to structural coverage over the model. Measuring structural coverage over the requirements gives a notion of how well the conformance tests exercise the required behavior of the system.
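The distinction can be made concrete with a toy coverage measure (a hypothetical sketch; the requirements, predicates, and data below are illustrative and not the metrics or systems used in the paper): treat each formal requirement as an implication over system states, and count a requirement as covered only when some test step triggers its antecedent with the consequent holding, so requirements that are satisfied only vacuously do not count as exercised.

```python
# Hypothetical sketch: requirements coverage over execution traces.
# Each requirement is an implication (antecedent, consequent) over a
# system state; a test "covers" a requirement only if some step of its
# trace makes the antecedent true (non-vacuously) with the consequent holding.

# A trace is a list of state dicts produced by running one test case.
requirements = {
    "R1: high temp => alarm": (lambda s: s["temp"] > 100, lambda s: s["alarm"]),
    "R2: door open => stopped": (lambda s: s["door_open"], lambda s: s["speed"] == 0),
}

def requirements_coverage(traces):
    """Fraction of requirements exercised non-vacuously by the test suite."""
    covered = set()
    for trace in traces:
        for name, (ante, cons) in requirements.items():
            if any(ante(s) and cons(s) for s in trace):
                covered.add(name)
    return len(covered) / len(requirements)

# Two example test traces (hypothetical data): only R1 is ever triggered.
suite = [
    [{"temp": 120, "alarm": True, "door_open": False, "speed": 5}],
    [{"temp": 20, "alarm": False, "door_open": False, "speed": 5}],
]
print(requirements_coverage(suite))  # R1 covered, R2 never triggered -> 0.5
```

A suite could achieve high structural coverage of the model while leaving R2 untriggered; a requirements-coverage measure of this kind makes that gap visible.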

We conducted an experiment to investigate the hypothesis that structural coverage over formal requirements is more effective than structural coverage over the model as an adequacy measure for conformance testing. We found that the hypothesis was rejected at 5% statistical significance on three of the four case examples in our experiment. Nevertheless, the tests providing requirements coverage found several faults that remained undetected by tests providing model coverage. We thus formed a second hypothesis: that complementing model coverage with requirements coverage will prove more effective as an adequacy measure than model coverage alone. In our experiment, we found test suites providing both requirements coverage and model coverage to be more effective at finding faults than test suites providing model coverage alone, at 5% statistical significance. Based on our results, we believe existing adequacy measures for conformance testing that consider only model coverage can be strengthened by combining them with rigorous requirements coverage metrics.
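The comparisons above are made at the 5% significance level. As an illustrative sketch only (the paper's actual statistical procedure, case examples, and fault counts are not reproduced here), a nonparametric permutation test can compare the fault-detection counts of two families of test suites without assuming normality:

```python
import random

def permutation_test(a, b, trials=10_000, seed=0):
    """Two-sided permutation test on the difference in mean faults found.

    a, b: per-suite fault-detection counts for two test-suite families
    (e.g. model coverage alone vs. model + requirements coverage).
    Returns an approximate p-value; reject the null hypothesis at p < 0.05.
    """
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)  # random relabeling of the pooled counts
        pa, pb = pooled[:len(a)], pooled[len(a):]
        diff = abs(sum(pa) / len(pa) - sum(pb) / len(pb))
        if diff >= observed:
            extreme += 1
    return extreme / trials

# Hypothetical fault counts per generated suite (not the paper's data).
model_only = [12, 14, 11, 13, 12, 15, 13, 12]
model_plus_reqs = [16, 18, 17, 15, 19, 17, 16, 18]
p = permutation_test(model_only, model_plus_reqs)
print(p < 0.05)  # for this made-up data, the difference is significant
```

A permutation test is a natural fit here because the sample sizes (number of generated suites per family) are small and the distribution of fault counts is unknown.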

This work has been partially supported by NASA Ames Research Center Cooperative Agreement NNA06CB21A, NASA IV&V Facility Contract NNG-05CB16C, and the L-3 Titan Group.




Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rajan, A., Whalen, M., Staats, M., Heimdahl, M.P.E. (2008). Requirements Coverage as an Adequacy Measure for Conformance Testing. In: Liu, S., Maibaum, T., Araki, K. (eds) Formal Methods and Software Engineering. ICFEM 2008. Lecture Notes in Computer Science, vol 5256. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-88194-0_8

  • DOI: https://doi.org/10.1007/978-3-540-88194-0_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-88193-3

  • Online ISBN: 978-3-540-88194-0
