A framework and tool support for the systematic testing of model-based specifications

Published: 01 October 2003

Abstract

Formal specifications can precisely and unambiguously define the required behavior of a software system or component. However, formal specifications are complex artifacts that need to be verified to ensure that they are consistent and complete, and validated against the requirements. Specification testing or animation tools assist with this by allowing the specifier to interpret or execute the specification, but currently little is known about how to do this effectively.

This article presents a framework and tool support for the systematic testing of formal, model-based specifications. We first identify several important generic properties that model-based specifications should satisfy. Following the idea of mutation analysis, we then use variants, or mutants, of the specification to check that these properties hold. The framework also allows the specifier to test application-specific properties. All properties are tested over a range of states defined by the tester in the form of a testgraph: a directed graph that partially models the states and transitions of the specification under test. Tool support is provided for generating the mutants, for automatically traversing the testgraph and executing the test cases, and for reporting any errors. The framework is demonstrated on a small specification, and its application to three larger specifications is discussed. Experience indicates that the framework can be used effectively to test small to medium-sized specifications and that it can reveal a significant number of problems in them.
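The testgraph idea in the abstract can be sketched in a few lines: states are nodes, operations are arcs, and a traversal applies a property check at every reachable state. The sketch below is a hypothetical illustration only, assuming a bounded-counter specification with a simple data invariant; the `TestGraph` class and `check_invariant` function are invented for this example and are not taken from the authors' tool.

```python
# Minimal, hypothetical sketch of testgraph-driven specification testing.
# Nodes model specification states, arcs model operations; the traversal
# visits each reachable state and reports any property violations.
from collections import deque


class TestGraph:
    def __init__(self):
        self.arcs = {}  # state -> list of (operation, next_state)

    def add_arc(self, src, op, dst):
        self.arcs.setdefault(src, []).append((op, dst))

    def traverse(self, start, check):
        """Breadth-first traversal applying `check` at each reachable state."""
        errors = []
        seen = {start}
        queue = deque([start])
        while queue:
            state = queue.popleft()
            errors.extend(check(state))
            for _op, nxt in self.arcs.get(state, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return errors


def check_invariant(state):
    # Generic property: each state satisfies the data invariant 0 <= n <= 10.
    return [] if 0 <= state <= 10 else [f"invariant violated in state {state}"]


g = TestGraph()
g.add_arc(0, "inc", 1)
g.add_arc(1, "inc", 2)
g.add_arc(2, "dec", 1)
print(g.traverse(0, check_invariant))  # -> [] (no violations found)
```

In the framework described by the article, the checks would be evaluated against the specification itself via an animator, and mutation analysis would additionally run the same traversal against mutants of the specification, expecting violations to surface.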



• Published in

  ACM Transactions on Software Engineering and Methodology, Volume 12, Issue 4
  October 2003, 103 pages
  ISSN: 1049-331X
  EISSN: 1557-7392
  DOI: 10.1145/990010

          Copyright © 2003 ACM

          Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

          Publisher

          Association for Computing Machinery

          New York, NY, United States

