DOI: 10.1145/2384616.2384666
Research article, SPLASH conference proceedings

An empirical study of the influence of static type systems on the usability of undocumented software

Published: 19 October 2012

ABSTRACT

Although the study of static and dynamic type systems plays a major role in research, relatively little is known about the impact of type systems on software development. Perhaps one of the more common arguments for static type systems in languages such as Java or C++ is that they require developers to annotate their code with type names, which is claimed to improve the documentation of software. In contrast, one common argument against static type systems is that they decrease flexibility, which may make them harder to use. While these arguments appear in the literature, rigorous empirical evidence is lacking. We report on a controlled experiment in which 27 subjects performed programming tasks on an undocumented API using both a static type system (which requires type annotations) and a dynamic type system (which does not). Our results show that for some tasks programmers had faster completion times with the static type system, while for others the opposite held. We conducted an exploratory study to understand why.
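To illustrate the documentation argument the abstract describes, here is a minimal sketch (the class and method names are hypothetical, not taken from the study's actual API): with static typing, a method signature on an otherwise undocumented API already tells a caller what to pass in and what comes back.

```java
import java.util.List;

class AddressBook {
    // Statically typed (Java-style): the signature alone documents that
    // `entries` is a list of strings, `query` is a string, and the result
    // is an integer count -- no prose documentation needed to call it.
    static int countMatches(List<String> entries, String query) {
        int n = 0;
        for (String e : entries) {
            if (e.contains(query)) n++;
        }
        return n;
    }
    // In a dynamically typed language the same method could be declared
    // without any type names, e.g. Groovy's `def countMatches(entries, query)`,
    // leaving the caller to discover the expected types by experimenting
    // or by reading the method body.

    public static void main(String[] args) {
        System.out.println(countMatches(List.of("Ada", "Alan", "Grace"), "A"));
        // prints 2 ("Ada" and "Alan" contain an uppercase "A")
    }
}
```

The trade-off the paper studies is visible even here: the annotated version carries more information for readers, while the untyped version is more flexible about what it accepts.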


Published in

OOPSLA '12: Proceedings of the ACM international conference on Object oriented programming systems languages and applications
October 2012, 1052 pages
ISBN: 9781450315616
DOI: 10.1145/2384616

Also published in ACM SIGPLAN Notices, Volume 47, Issue 10 (OOPSLA '12)
October 2012, 1011 pages
ISSN: 0362-1340, EISSN: 1558-1160
DOI: 10.1145/2398857

      Copyright © 2012 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 268 of 1,244 submissions, 22%
