Automated detection of performance regressions using statistical process control techniques

Published: 22 April 2012

ABSTRACT

The goal of performance regression testing is to check whether a new version of a software system introduces performance regressions. Although it is an important phase of the software development process, it is very time consuming and usually allotted little time. A typical test run outputs thousands of performance counters, which testers usually have to inspect manually to identify performance regressions. In this paper, we propose an approach that analyzes performance counters across test runs using a statistical process control technique called control charts. We evaluate our approach on historical data from a large software team as well as an open-source software project. The results show that our approach can accurately identify performance regressions in both software systems. Feedback from practitioners is very promising, owing to the simplicity of the approach and the ease of explaining its results.
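
In this setting, a control chart summarizes the counter values of prior known-good runs with a center line and upper/lower control limits; a new run is suspected of a performance regression when an unusually large fraction of its counter samples (the violation ratio) falls outside those limits. The Python sketch below is a minimal illustration of that idea, not the authors' implementation; the counter values and the 0.2 alert threshold are made up for illustration.

    import statistics

    def control_limits(baseline_samples, sigmas=3.0):
        # Lower/upper control limits derived from baseline (known-good) runs:
        # center line +/- a multiple of the baseline standard deviation.
        center = statistics.mean(baseline_samples)
        spread = statistics.stdev(baseline_samples)
        return center - sigmas * spread, center + sigmas * spread

    def violation_ratio(new_samples, lcl, ucl):
        # Fraction of the new run's counter samples outside the control limits.
        outside = sum(1 for x in new_samples if x < lcl or x > ucl)
        return outside / len(new_samples)

    # Hypothetical data: one performance counter (e.g., response time in ms)
    # sampled periodically during a baseline run and a new test run.
    baseline = [101, 98, 103, 99, 100, 102, 97, 101, 100, 99]
    new_run = [100, 118, 121, 99, 119, 120, 117, 101, 122, 118]

    lcl, ucl = control_limits(baseline)
    ratio = violation_ratio(new_run, lcl, ucl)

    # The 0.2 alert threshold is illustrative; in practice it would be
    # tuned on historical test runs.
    print("violation ratio = %.2f -> %s"
          % (ratio, "regression" if ratio > 0.2 else "ok"))

Running the sketch flags the new run (violation ratio 0.70), since most of its samples lie above the baseline's upper control limit.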

Published in

ICPE '12: Proceedings of the 3rd ACM/SPEC International Conference on Performance Engineering
April 2012, 362 pages
ISBN: 9781450312028
DOI: 10.1145/2188286
Copyright © 2012 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

            Acceptance Rates

Overall acceptance rate: 252 of 851 submissions, 30%
