ABSTRACT
In black-box testing, one creates a suite of tests from requirements that adequately exercises the behavior of a software system without regard to the internal structure of the implementation. In current practice, the adequacy of black-box test suites is inferred by examining coverage on an executable artifact, either source code or a software model. In this paper, we define structural coverage metrics directly on high-level formal software requirements. These metrics provide objective, implementation-independent measures of how well a black-box test suite exercises a set of requirements. We focus on structural coverage criteria for requirements formalized as LTL properties and discuss how they can be adapted to measure finite test cases. These criteria can also be used to automatically generate a requirements-based test suite. Unlike model- or code-derived test cases, these tests are immediately traceable to high-level requirements. To assess the practicality of our approach, we apply it to a realistic example from the avionics domain.
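To make the idea concrete, the following sketch (not taken from the paper; the property, trace encoding, and function name are illustrative assumptions) evaluates a simple LTL response requirement, G(request → F grant), over a finite test case, using the view that every F obligation must be discharged before the trace ends:

```python
# Illustrative only: checking the LTL response property
# G(request -> F grant) on a finite trace, where each step of the
# trace is a dict mapping atomic propositions to truth values.

def holds_globally_response(trace, trigger, response):
    """Return True iff every step satisfying `trigger` is followed
    (at that step or later) by a step satisfying `response`."""
    for i, state in enumerate(trace):
        if state.get(trigger):
            # The pending F obligation must be met within the trace.
            if not any(s.get(response) for s in trace[i:]):
                return False
    return True

trace = [
    {"request": True,  "grant": False},
    {"request": False, "grant": True},
    {"request": True,  "grant": False},
]
# The second request is never granted before the trace ends,
# so the property fails on this finite test case.
print(holds_globally_response(trace, "request", "grant"))  # False
```

A requirements-based coverage criterion would then ask, for example, whether the suite contains traces that exercise both the triggering and non-triggering cases of such a property, rather than measuring coverage on the implementation.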