DOI: 10.1145/1276958.1277175

A multi-objective approach to search-based test data generation

Published: 7 July 2007

ABSTRACT

There has been a considerable body of work on search-based test data generation for branch coverage. However, hitherto, there has been no work on multi-objective branch coverage. In many scenarios a single-objective formulation is unrealistic; testers will want to find test sets that meet several objectives simultaneously in order to maximize the value obtained from the inherently expensive process of running the test cases and examining the output they produce. This paper introduces multi-objective branch coverage. It presents results from a case study of the twin objectives of branch coverage and dynamic memory consumption, for both real and synthetic programs, to which several multi-objective evolutionary algorithms are applied. The results show that multi-objective evolutionary algorithms are suitable for this problem, and illustrate the way in which a Pareto optimal search can yield insights into the trade-offs between the two simultaneous objectives.
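To make the Pareto-optimal framing concrete, the sketch below shows the dominance comparison that underlies multi-objective evolutionary algorithms such as NSGA-II, applied to the two objectives studied here. This is a minimal illustration, not the paper's implementation: the candidate scores, and the treatment of both objectives as values to maximize, are assumptions made for the example.

```python
# Minimal sketch (not the paper's implementation) of the Pareto-dominance
# comparison behind a multi-objective search over test data, with two
# objectives per candidate test input: branch coverage achieved and
# dynamic memory consumed. Both are treated here as "higher is better";
# a real tool would obtain these scores by instrumenting the program
# under test. All numeric values below are illustrative.

from typing import List, Tuple

# Each candidate is scored as (branch_coverage, memory_bytes).
Objectives = Tuple[float, float]

def dominates(a: Objectives, b: Objectives) -> bool:
    """True if a is at least as good as b on every objective
    and strictly better on at least one (Pareto dominance)."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_front(candidates: List[Objectives]) -> List[Objectives]:
    """Keep only the non-dominated candidates: the trade-off
    surface between coverage and memory consumption."""
    return [c for c in candidates
            if not any(dominates(other, c)
                       for other in candidates if other != c)]

if __name__ == "__main__":
    # Hypothetical scores for four test inputs.
    scores = [(0.90, 4096), (0.75, 8192), (0.90, 2048), (0.60, 1024)]
    print(pareto_front(scores))  # -> [(0.9, 4096), (0.75, 8192)]
```

A point on the Pareto front is one that no other candidate beats on both coverage and memory at once; the search returns this whole front rather than a single "best" test set, which is what exposes the trade-off between the two objectives.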


Published in

GECCO '07: Proceedings of the 9th annual conference on Genetic and evolutionary computation
July 2007
2313 pages
ISBN: 9781595936974
DOI: 10.1145/1276958

      Copyright © 2007 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States



Acceptance Rates

GECCO '07 paper acceptance rate: 266 of 577 submissions, 46%. Overall GECCO acceptance rate: 1,669 of 4,410 submissions, 38%.

