DOI: 10.1145/1146238.1146267 · ISSTA Conference Proceedings
Article

DSD-Crasher: a hybrid analysis tool for bug finding

Published: 21 July 2006

ABSTRACT

DSD-Crasher is a bug finding tool that follows a three-step approach to program analysis:

  • D. Capture the program's intended execution behavior with dynamic invariant detection. The derived invariants exclude many unwanted values from the program's input domain.

  • S. Statically analyze the program within the restricted input domain to explore many paths.

  • D. Automatically generate test cases that focus on confirming the results of the static analysis. Results confirmed in this way are never false positives, in contrast to the high false-positive rate inherent in conservative static analysis.

This three-step approach yields benefits over past two-step combinations in the literature. In our evaluation on third-party applications, we demonstrate higher precision than tools that lack a dynamic step and higher efficiency than tools that lack a static step.
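To make the three steps concrete, here is a minimal, hypothetical Java sketch; the class, method, and inferred invariant are our own illustration, not taken from the paper's evaluation. Step D infers a precondition (shown as a JML requires clause) from observed executions, step S has an ESC/Java-style checker analyze the method only within that restricted domain, and the final D step emits a JUnit test that tries to trigger a remaining warning.

    import junit.framework.TestCase;

    // Step D: from observed runs, a Daikon-style tool might infer the
    // precondition below, removing null inputs from the domain.
    class Accounts {
        //@ requires balances != null;  // dynamically inferred invariant
        static int minBalance(int[] balances) {
            int min = balances[0];      // Step S: ESC/Java can still warn that
            for (int b : balances) {    // balances.length == 0 causes an
                if (b < min) {          // ArrayIndexOutOfBoundsException here
                    min = b;
                }
            }
            return min;
        }
    }

    // Step D (again): a generated test that tries to confirm the remaining
    // warning with a concrete input. If it crashes, the report is a true bug.
    public class MinBalanceCrashTest extends TestCase {
        public void testEmptyArray() {
            Accounts.minBalance(new int[0]); // exercises the reported condition
        }
    }

Note the division of labor this sketch illustrates: the dynamic step suppresses warnings outside the intended input domain, while the final test-generation step discharges any warning that survives, so only demonstrated crashes reach the user.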

References

  1. Apache Software Foundation. Bytecode engineering library (BCEL). http://jakarta.apache.org/bcel/, Apr. 2003. Accessed May 2006.
  2. T. Ball. Abstraction-guided test generation: A case study. Technical Report MSR-TR-2003-86, Microsoft Research, Nov. 2003.
  3. K. Beck and E. Gamma. Test infected: Programmers love writing tests. Java Report, 3(7):37--50, July 1998.
  4. D. Beyer, A. J. Chlipala, T. A. Henzinger, R. Jhala, and R. Majumdar. Generating tests from counterexamples. In Proc. 26th International Conference on Software Engineering, pages 326--335, May 2004.
  5. C. Boyapati, S. Khurshid, and D. Marinov. Korat: Automated testing based on Java predicates. In Proc. 2002 ACM SIGSOFT International Symposium on Software Testing and Analysis, pages 123--133. ACM Press, July 2002.
  6. Y. Cheon and G. T. Leavens. A simple and practical approach to unit testing: The JML and JUnit way. In B. Magnusson, editor, ECOOP 2002 - Object-Oriented Programming: 16th European Conference, volume 2374, pages 231--255. Springer, June 2002.
  7. D. R. Cok and J. R. Kiniry. ESC/Java2: Uniting ESC/Java and JML: Progress and issues in building and using ESC/Java2. Technical Report NIII-R0413, Nijmegen Institute for Computing and Information Science, May 2004.
  8. C. Csallner and Y. Smaragdakis. JCrasher: An automatic robustness tester for Java. Software-Practice & Experience, 34(11):1025--1050, Sept. 2004.
  9. C. Csallner and Y. Smaragdakis. Check 'n' Crash: Combining static checking and testing. In Proc. 27th International Conference on Software Engineering, pages 422--431, May 2005.
  10. C. Csallner and Y. Smaragdakis. Dynamically discovering likely interface invariants. In Proc. International Conference on Software Engineering, Emerging Results Track, pages 861--864, May 2006.
  11. D. Detlefs, G. Nelson, and J. B. Saxe. Simplify: A theorem prover for program checking. Technical Report HPL-2003-148, Hewlett-Packard Systems Research Center, July 2003.
  12. S. H. Edwards. A framework for practical, automated black-box testing of component-based software. Software Testing, Verification & Reliability, 11(2):97--111, June 2001.
  13. M. D. Ernst. Static and dynamic analysis: Synergy and duality. In Proc. ICSE Workshop on Dynamic Analysis, pages 24--27, May 2003.
  14. M. D. Ernst, J. Cockrell, W. G. Griswold, and D. Notkin. Dynamically discovering likely program invariants to support program evolution. IEEE Transactions on Software Engineering, 27(2):99--123, Feb. 2001.
  15. C. Flanagan, K. R. M. Leino, M. Lillibridge, G. Nelson, J. B. Saxe, and R. Stata. Extended static checking for Java. In Proc. ACM SIGPLAN 2002 Conference on Programming Language Design and Implementation, pages 234--245, June 2002.
  16. S. Hallem, B. Chelf, Y. Xie, and D. Engler. A system and language for building system-specific, static analyses. In Proc. ACM SIGPLAN 2002 Conference on Programming Language Design and Implementation, pages 69--82. ACM Press, June 2002.
  17. S. Hangal and M. S. Lam. Tracking down software bugs using automatic anomaly detection. In Proc. 24th International Conference on Software Engineering, pages 291--301, May 2002.
  18. M. Hapner, R. Burridge, R. Sharma, and J. Fialli. Java message service: Version 1.1. Sun Microsystems, Inc., Apr. 2002.
  19. D. Hovemeyer and W. Pugh. Finding bugs is easy. In Companion to the 19th Annual ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications, pages 132--136. ACM Press, Oct. 2004.
  20. D. Jackson and M. Vaziri. Finding bugs with a constraint solver. In M. J. Harrold, editor, Proc. 2000 ACM SIGSOFT International Symposium on Software Testing and Analysis, pages 14--25. ACM Press, Aug. 2000.
  21. D. Kroening, A. Groce, and E. M. Clarke. Counterexample guided abstraction refinement via program execution. In J. Davies, W. Schulte, and M. Barnett, editors, Formal Methods and Software Engineering: 6th International Conference on Formal Engineering Methods, pages 224--238. Springer, Nov. 2004.
  22. G. T. Leavens, A. L. Baker, and C. Ruby. Preliminary design of JML: A behavioral interface specification language for Java. Technical Report TR98-06y, Department of Computer Science, Iowa State University, June 1998.
  23. K. R. M. Leino, G. Nelson, and J. B. Saxe. ESC/Java user's manual. Technical Report 2000-002, Compaq Computer Corporation Systems Research Center, Oct. 2000.
  24. B. Meyer. Object-Oriented Software Construction. Prentice Hall PTR, 2nd edition, 1997.
  25. J. W. Nimmer and M. D. Ernst. Automatic generation of program specifications. In Proc. 2002 ACM SIGSOFT International Symposium on Software Testing and Analysis, pages 229--239, July 2002.
  26. J. W. Nimmer and M. D. Ernst. Invariant inference for static checking: An empirical evaluation. In Proc. ACM SIGSOFT 10th International Symposium on the Foundations of Software Engineering (FSE 2002), pages 11--20, Nov. 2002.
  27. C. Pacheco and M. D. Ernst. Eclat: Automatic generation and classification of test inputs. In Proc. 19th European Conference on Object-Oriented Programming, pages 504--527, July 2005.
  28. Parasoft Inc. Jtest. http://www.parasoft.com/, Oct. 2002. Accessed May 2006.
  29. N. Rutar, C. B. Almazan, and J. S. Foster. A comparison of bug finding tools for Java. In Proc. 15th International Symposium on Software Reliability Engineering (ISSRE'04), pages 245--256. IEEE Computer Society Press, Nov. 2004.
  30. S. Sankar and R. Hayes. ADL--an interface definition language for specifying and testing software. In J. M. Wing and R. L. Wexelblat, editors, Proc. Workshop on Interface Definition Languages, volume 29, pages 13--21. ACM Press, Aug. 1994.
  31. H. Schlenker and G. Ringwelski. POOC: A platform for object-oriented constraint programming. In Proc. Joint ERCIM/CologNet International Workshop on Constraint Solving and Constraint Logic Programming, pages 159--170, June 2002.
  32. M. Vaziri and D. Jackson. Checking properties of heap-manipulating procedures with a constraint solver. In H. Garavel and J. Hatcliff, editors, Tools and Algorithms for the Construction and Analysis of Systems: 9th International Conference, pages 505--520. Springer, Apr. 2003.
  33. T. Xie and D. Notkin. Tool-assisted unit test selection based on operational violations. In Proc. 18th Annual International Conference on Automated Software Engineering (ASE 2003), pages 40--48, Oct. 2003.
  34. Y. Xie and D. Engler. Using redundancies to find errors. IEEE Transactions on Software Engineering, 29(10):915--928, Oct. 2003.
  35. M. Young. Symbiosis of static analysis and program testing. In Proc. 6th International Conference on Fundamental Approaches to Software Engineering, pages 1--5, Apr. 2003.
  36. S. H. Zweben, W. D. Heym, and J. Kimmich. Systematic testing of data abstractions based on software specifications. Software Testing, Verification & Reliability, 1(4):39--55, Jan. 1992.



Reviews

Josep Silva

Nimmer and Ernst [1] propose a bug-finding tool that follows a two-step approach to program analysis. The first step dynamically detects invariants to produce an input domain; for this purpose, the Daikon tool tracks a testee's variables at runtime and generalizes their behavior into invariants. The second step statically analyzes the program within the restricted input domain, using the ESC/Java tool, which can derive abstract conditions under which the execution of a method under test may terminate abnormally.

The authors extend this approach with a third step: the automatic generation of test cases that verify the results of the preceding static analysis. This step is implemented with the bug-finding tool CnC, which combines ESC/Java and the JCrasher random testing tool. In essence, CnC takes the error conditions inferred by ESC/Java and produces test cases that are executed by JCrasher.

The paper also describes how this new approach, called DSD-Crasher, has been implemented and evaluated. In particular, the authors note that the new scheme cannot be evaluated with the metrics used in Nimmer and Ernst's work [1]; consequently, they use another metric, end-to-end efficiency, to measure and compare their implementation. The comparative study suggests that DSD-Crasher significantly reduces the number of false positives when finding bugs, making this method an improvement over previous approaches.

The paper is clear and well written, and both the advantages and disadvantages of the method are described. However, for a nonexpert reader, the structure of the paper can be a bit confusing, because several debugging tools are described before they are needed to introduce DSD-Crasher. For this reason, I would suggest reading section 3.1 (the motivation) first, and then the rest of the paper.

Online Computing Reviews Service
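To illustrate the CnC step the review describes, here is a hedged Java sketch; the Ratio class, its method, and the test are our own hypothetical example, not code from CnC itself. An ESC/Java-style checker reports an abstract error condition for a method, and a CnC-style step picks concrete inputs satisfying that condition and wraps them in a JUnit test, which JCrasher-style execution then runs to see whether the crash is real.

    import junit.framework.TestCase;

    // Hypothetical target class (ours, for illustration). A static checker
    // in the ESC/Java style could report that divide() may throw
    // ArithmeticException under the abstract error condition "b == 0".
    class Ratio {
        static int divide(int a, int b) {
            return a / b;
        }
    }

    // A test of the kind a CnC-style step could emit: concrete inputs are
    // chosen to satisfy the reported error condition (here b == 0). Running
    // the test either reproduces the crash, confirming a true bug, or
    // refutes the warning as a false positive.
    public class RatioCrashTest extends TestCase {
        public void testDivideSatisfyingErrorCondition() {
            Ratio.divide(7, 0); // witness input: any a, with b == 0
        }
    }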
