DOI: 10.1145/1882291.1882304
research-article

Combining hardware and software instrumentation to classify program executions

Published: 7 November 2010

ABSTRACT

Several research efforts have studied ways to infer properties of software systems from program spectra gathered from the running systems, usually with software-level instrumentation. One specific application of this general approach, and the focus of this paper, is distinguishing failed executions from successful executions. While existing efforts appear to produce accurate classifications, a detailed understanding of their costs and potential cost-benefit tradeoffs is lacking. In this work, we present a hybrid instrumentation approach that uses hardware performance counters to gather program spectra at very low cost. This underlying data is further augmented with data captured by minimal amounts of software-level instrumentation. We also evaluate this hybrid approach by comparing it to other existing approaches. We conclude that these hybrid spectra can reliably distinguish failed executions from successful executions at a fraction of the runtime overhead of software-only spectra.
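To make the classification step concrete, here is a minimal sketch, not the paper's implementation (which relies on established machine-learning classifiers over collected spectra): each execution is summarized as a fixed-length feature vector that could, for example, hold hardware performance-counter readings (instruction counts, cache misses) plus a few software-level measurements, and an unlabeled execution is classified by comparing its vector against labeled training runs. The counter values and the 1-nearest-neighbor rule below are illustrative assumptions.

```python
# Illustrative sketch only: label an execution "pass" or "fail" from its
# spectrum -- a fixed-length vector of hypothetical per-run measurements
# (e.g. instruction count, cache misses, branch mispredictions).
import math

def classify(spectrum, training):
    """1-nearest-neighbor: return the label of the closest known spectrum."""
    def dist(a, b):
        # Euclidean distance between two equal-length feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training, key=lambda ex: dist(spectrum, ex[0]))[1]

# Hypothetical labeled runs: (counter-derived spectrum, outcome).
training = [
    ([1200, 35, 4], "pass"),
    ([1180, 40, 3], "pass"),
    ([5400, 310, 90], "fail"),  # anomalous counter profile
    ([5100, 280, 75], "fail"),
]

# Classify a new, unlabeled execution by its spectrum.
print(classify([5300, 300, 80], training))  # prints: fail
```

In practice the paper's approach trains richer classifiers over many executions; the point of the sketch is only that counter-derived spectra are ordinary feature vectors once collected, so any standard classifier can consume them.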


Published in

FSE '10: Proceedings of the eighteenth ACM SIGSOFT international symposium on Foundations of software engineering
November 2010, 302 pages
ISBN: 9781605587912
DOI: 10.1145/1882291

Copyright © 2010 ACM

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 17 of 128 submissions, 13%
