DOI: 10.1145/3315568.3329965

Research article

Know your analysis: how instrumentation aids understanding static analysis

Published: 22 June 2019

ABSTRACT

The development of a high-quality data-flow analysis---one that is precise and scalable---is a challenging task. A concrete client analysis requires not only data-flow information but also type-hierarchy, points-to, and call-graph information, all of which must be obtained by wisely chosen and correctly parameterized algorithms. Therefore, many static analysis frameworks have been developed that provide analysis writers with generic data-flow solvers as well as these additional pieces of information. Such frameworks ease the development of an analysis by requiring only a description of the data-flow problem to be solved and a set of framework parameters. Yet, analysis writers often struggle when an analysis does not behave as expected on real-world code. Due to the complex interplay of the several algorithms and the client analysis code within such frameworks, the cause of a failure is usually not apparent. In this work, we present some of the insights we gained by instrumenting the LLVM-based static analysis framework PhASAR for C/C++ code and show the broad range of applications in which flexible instrumentation supports analysis and framework developers. We present five cases in which instrumentation gave us valuable insights to debug and improve both the concrete analyses and the underlying PhASAR framework.
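To illustrate the division of labor the abstract describes---a client analysis is just a description of the data-flow problem, handed to a generic solver---here is a minimal sketch of a forward may-taint analysis over a tiny hand-built CFG. All names (`solve_forward`, the node labels, the flow-function table) are hypothetical illustrations, not PhASAR's actual API; PhASAR's real solvers additionally handle interprocedural flows, call graphs, and points-to information.

```python
# Hypothetical sketch: a generic worklist solver plus a client-supplied
# flow-function table. Not PhASAR's real API -- names are made up.

def solve_forward(cfg, flow, entry):
    """Forward may-analysis: merge is set union, iterate to a fixpoint."""
    preds = {n: [] for n in cfg}
    for n, succs in cfg.items():
        for s in succs:
            preds[s].append(n)
    out = {n: frozenset() for n in cfg}
    worklist = list(cfg)
    while worklist:
        n = worklist.pop(0)
        # Merge the out-sets of all predecessors (empty at the entry node).
        in_facts = frozenset().union(*(out[p] for p in preds[n])) if preds[n] else frozenset()
        new_out = flow[n](in_facts)
        if new_out != out[n]:          # out-set grew: re-examine successors
            out[n] = new_out
            worklist.extend(cfg[n])
    return out

# Example program, one statement per CFG node, with a branch:
#   n1: x = source()   n2: y = x   n3: y = 0 (sanitizer)   n4: sink(y)
cfg = {"n1": ["n2", "n3"], "n2": ["n4"], "n3": ["n4"], "n4": []}
flow = {
    "n1": lambda f: f | {"x"},                             # source taints x
    "n2": lambda f: f | {"y"} if "x" in f else f - {"y"},  # y copies x's taint
    "n3": lambda f: f - {"y"},                             # constant kills y's taint
    "n4": lambda f: f,                                     # sink: identity transfer
}
out = solve_forward(cfg, flow, "n1")
print(sorted(out["n4"]))  # → ['x', 'y']: y may reach the sink tainted
```

The solver never inspects what the facts mean; only the flow-function table does. Debugging a misbehaving analysis therefore requires visibility into both layers at once, which is exactly where the instrumentation discussed in the paper comes in.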


Published in

SOAP 2019: Proceedings of the 8th ACM SIGPLAN International Workshop on State Of the Art in Program Analysis
June 2019, 43 pages
ISBN: 9781450367202
DOI: 10.1145/3315568
Copyright © 2019 ACM

Publisher: Association for Computing Machinery, New York, NY, United States
