ABSTRACT
Several research efforts have studied ways to infer properties of software systems from program spectra gathered from the running systems, usually with software-level instrumentation. One specific application of this general approach, and the focus of this paper, is distinguishing failed executions from successful executions. While existing efforts appear to produce accurate classifications, a detailed understanding of their costs and potential cost-benefit tradeoffs is lacking. In this work, we present a hybrid instrumentation approach that uses hardware performance counters to gather program spectra at very low cost. This underlying data is further augmented with data captured by a minimal amount of software-level instrumentation. We also evaluate this hybrid approach by comparing it to other existing approaches. We conclude that these hybrid spectra can reliably distinguish failed executions from successful executions at a fraction of the runtime overhead of software-only spectra.
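To make the classification idea concrete, the following is a minimal sketch (not the paper's actual method or data) of how executions summarized as spectra might be labeled pass/fail. Each execution is represented as a fixed-length vector of hypothetical hardware-counter readings, and a simple nearest-centroid classifier assigns new executions to whichever class's mean spectrum is closer. All counter names and values here are illustrative assumptions.

```python
# Sketch: nearest-centroid classification of execution spectra.
# A "spectrum" is a vector of counts per execution; here the three
# components stand in for hypothetical hardware-counter readings
# (branch mispredictions, L1 cache misses, instructions retired).
import math

def centroid(spectra):
    """Component-wise mean of a list of equal-length spectra."""
    n = len(spectra)
    return [sum(v[i] for v in spectra) / n for i in range(len(spectra[0]))]

def distance(a, b):
    """Euclidean distance between two spectra."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(spectrum, pass_centroid, fail_centroid):
    """Label an execution by its nearer class centroid."""
    if distance(spectrum, pass_centroid) <= distance(spectrum, fail_centroid):
        return "pass"
    return "fail"

# Illustrative training spectra from labeled runs:
passing = [[120, 300, 10000], [130, 310, 10100], [125, 295, 9900]]
failing = [[400, 900, 4000], [380, 950, 4200], [420, 880, 3900]]

pass_c = centroid(passing)
fail_c = centroid(failing)

print(classify([128, 305, 10050], pass_c, fail_c))  # resembles passing runs
print(classify([390, 910, 4100], pass_c, fail_c))   # resembles failing runs
```

In practice the paper's approach feeds richer spectra (and, in the hybrid case, minimal software-level counts) into more capable learners; the point of the sketch is only the pipeline shape: per-execution counter vectors in, pass/fail labels out.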
Index Terms
- Combining hardware and software instrumentation to classify program executions