Research Article
DOI: 10.1145/3511430.3511439

Reproducibility Challenges and Their Impacts on Technical Q&A Websites: The Practitioners’ Perspectives

Authors Info & Claims
Published:24 February 2022Publication History

ABSTRACT

Software developers often submit questions to technical Q&A sites like Stack Overflow (SO) to resolve their code-level problems. They usually include example code segments with their questions to explain the programming issues. When SO users attempt to answer such questions, they prefer to reproduce the reported issues using the given code segments. However, these code segments cannot always reproduce the issues because of several unmet challenges (e.g., a too short code segment), which can prevent questions from receiving appropriate and prompt solutions. A previous study produced a catalog of potential challenges that hinder the reproducibility of issues reported in SO questions. However, it is unknown how practitioners (i.e., developers) perceive this challenge catalog. Understanding the developers’ perspective is essential before introducing interactive tool support that promotes reproducibility. We therefore surveyed 53 SO users to understand their perspective. In particular, we attempt to (1) capture developers’ views on their agreement with those challenges, (2) find the potential impact of those challenges, (3) see how developers address them, and (4) determine and prioritize the needs for tool support. Survey results show that about 90% of the participants agree with the previously identified challenges. However, they report several additional challenges (e.g., missing error logs) that might also prevent reproducibility. According to the participants, a too short code segment and the absence of a required class, interface, or method from the code segment most severely prevent reproducibility, followed by a missing important part of the code. To promote reproducibility, participants strongly recommend tool support that interacts with question submitters and suggests improvements when the given code segments fail to reproduce the reported issues.
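To make the most severe challenge concrete, consider a hypothetical Java question in the spirit of the Scanner-related SO post cited below [26] (the sketch is illustrative only; the class name, prompt text, and scenario are assumptions, not examples taken from the paper or the survey). A submitter who posts only the single line `String name = scanner.next();` gives answerers a code segment that is too short to compile or run, so the misbehaviour cannot be reproduced. A minimally reproducible version supplies the required import, the enclosing class, and the main method:

    import java.util.Scanner;

    // Hypothetical minimal reproducible example (all identifiers are illustrative).
    // With the full class and main method included, an answerer can compile and run
    // the snippet and directly observe that next() stops at the first space,
    // instead of guessing the missing context from a one-line fragment.
    public class ScannerQuestion {
        public static void main(String[] args) {
            Scanner scanner = new Scanner(System.in);
            System.out.print("Enter your full name: ");
            String name = scanner.next(); // reads a single token, i.e., only up to the first whitespace
            System.out.println("Hello, " + name);
            scanner.close();
        }
    }

An interactive tool of the kind the participants recommend could, for instance, detect that a posted fragment lacks an enclosing class or entry point and prompt the submitter to complete it before the question is published.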

References

  1. B. C. D. Anda, D. I. K. Sjøberg, and A. Mockus. 2008. Variability and reproducibility in software engineering: A study of four companies that developed the same system. TSE (2008).
  2. Anonymous. 2021. Replication Package. https://bit.ly/3nOS2KM
  3. T. Bi, X. Xia, D. Lo, J. Grundy, T. Zimmermann, and D. Ford. 2021. Accessibility in Software Practice: A Practitioner’s Perspective. arXiv preprint arXiv:2103.08778 (2021).
  4. R. P. L. Buse and W. R. Weimer. 2008. A metric for software readability. In Proc. ISSTA.
  5. R. P. L. Buse and W. R. Weimer. 2009. Learning a metric for code readability. TSE (2009).
  6. E. Daka, J. Campos, G. Fraser, J. Dorn, and W. Weimer. 2015. Modeling readability to improve unit tests. In Proc. FSE.
  7. F. Ebert, F. Castor, N. Novielli, and A. Serebrenik. 2019. Confusion in code reviews: Reasons, impacts, and coping strategies. In Proc. SANER.
  8. M. Erfani Joorabchi, M. Mirzaaghaei, and A. Mesbah. 2014. Works for me! Characterizing non-reproducible bug reports. In Proc. MSR.
  9. M. Fazzini, M. Prammer, M. d’Amorim, and A. Orso. 2018. Automatically translating bug reports into test cases for mobile apps. In Proc. ISSTA.
  10. D. Ford, K. Lustig, J. Banks, and C. Parnin. 2018. “We Don’t Do That Here”: How collaborative editing with mentors improves engagement in social Q&A communities. In Proc. CHI.
  11. Z. Gao, X. Xia, D. Lo, J. Grundy, and Y. F. Li. 2020. Code2Que: A tool for improving question titles from mined code snippets in Stack Overflow. arXiv preprint arXiv:2007.10851 (2020).
  12. R. M. Groves, F. J. Fowler Jr., M. P. Couper, J. M. Lepkowski, E. Singer, and R. Tourangeau. 2011. Survey Methodology.
  13. E. Horton and C. Parnin. 2018. Gistable: Evaluating the executability of Python code snippets on GitHub. In Proc. ICSME.
  14. E. Horton and C. Parnin. 2019. DockerizeMe: Automatic inference of environment dependencies for Python code snippets. In Proc. ICSE.
  15. A. Joshi, S. Kale, S. Chandel, and D. K. Pal. 2015. Likert scale: Explored and explained. CJAST (2015).
  16. B. A. Kitchenham and S. L. Pfleeger. 2008. Personal opinion surveys. In Guide to Advanced Empirical Software Engineering.
  17. J. C. Lin and K. C. Wu. 2008. Evaluation of software understandability based on fuzzy matrix. In Proc. FUZZ.
  18. S. Mondal, M. M. Rahman, and C. K. Roy. 2019. Can issues reported at Stack Overflow questions be reproduced? An exploratory study. In Proc. MSR.
  19. S. Mondal and B. Roy. 2021. Reproducibility challenges and their impacts on technical Q&A websites: The practitioners’ perspectives. arXiv preprint arXiv:2112.10056 (2021).
  20. K. Moran, M. Linares-Vásquez, C. Bernal-Cárdenas, C. Vendome, and D. Poshyvanyk. 2016. Automatically discovering, reporting and reproducing Android application crashes. In Proc. ICST.
  21. D. Mu, A. Cuevas, L. Yang, H. Hu, X. Xing, B. Mao, and G. Wang. 2018. Understanding the reproducibility of crowd-reported security vulnerabilities. In Proc. USENIX Security.
  22. Stack Overflow. 2009. Java: Resetting all values in the program. https://stackoverflow.com/questions/798184. Online; last accessed 10 January 2020.
  23. Stack Overflow. 2009. Use Sounds in java? https://stackoverflow.com/questions/1264770. Online; last accessed 10 January 2020.
  24. Stack Overflow. 2010. get resource is null. https://stackoverflow.com/questions/2012643. Online; last accessed 10 January 2020.
  25. Stack Overflow. 2010. Getting Error On Xml resultSet in java. https://stackoverflow.com/questions/2018247. Online; last accessed 10 January 2020.
  26. Stack Overflow. 2013. Scanner doesn’t see after space. https://stackoverflow.com/questions/19509647. Online; last accessed 10 January 2020.
  27. D. Posnett, A. Hindle, and P. Devanbu. 2011. A simpler model of software readability. In Proc. MSR.
  28. C. Ragkhitwetsagul, J. Krinke, M. Paixao, G. Bianco, and R. Oliveto. 2019. Toxic code snippets on Stack Overflow. TSE (2019).
  29. M. M. Rahman, F. Khomh, and M. Castelluccio. 2020. Why are some bugs non-reproducible? An empirical investigation using data fusion. In Proc. ICSME.
  30. S. Scalabrino, G. Bavota, C. Vendome, M. Linares-Vásquez, D. Poshyvanyk, and R. Oliveto. 2017. Automatically assessing code understandability: How far are we? In Proc. ASE.
  31. S. Scalabrino, M. Linares-Vásquez, D. Poshyvanyk, and R. Oliveto. 2016. Improving code readability models with textual features. In Proc. ICPC.
  32. J. Singer and N. G. Vinson. 2002. Ethical issues in empirical studies of software engineering. TSE (2002).
  33. M. Soltani, A. Panichella, and A. van Deursen. 2017. A guided genetic algorithm for automated crash reproduction. In Proc. ICSE.
  34. M. Tahaei, K. Vaniea, and N. Saphra. 2020. Understanding privacy-related questions on Stack Overflow. In Proc. CHI.
  35. V. Terragni, Y. Liu, and S. C. Cheung. 2016. CSNIPPEX: Automated synthesis of compilable code snippets from Q&A sites. In Proc. ISSTA.
  36. Y. Tian, D. Lo, and J. Lawall. 2014. Automated construction of a software-specific word similarity database. In Proc. CSMR-WCRE.
  37. C. Treude, O. Barzilay, and M. A. Storey. 2011. How do programmers ask and answer questions on the web? (NIER track). In Proc. ICSE.
  38. C. Treude and M. P. Robillard. 2017. Understanding Stack Overflow code fragments. In Proc. ICSME.
  39. A. Trockman, K. Cates, M. Mozina, T. Nguyen, C. Kästner, and B. Vasilescu. 2018. “Automatically assessing code understandability” reanalyzed: Combined metrics matter. In Proc. MSR.
  40. W. M. Vagias. 2006. Likert-type scale response anchors. Clemson International Institute for Tourism & Research Development, Department of Parks, Recreation and Tourism Management, Clemson University (2006).
  41. N. Vincent, I. Johnson, and B. Hecht. 2018. Examining Wikipedia with a broader lens: Quantifying the value of Wikipedia’s relationships with other large-scale online communities. In Proc. CHI.
  42. M. White, M. Linares-Vásquez, P. Johnson, C. Bernal-Cárdenas, and D. Poshyvanyk. 2015. Generating reproducible and replayable bug reports from Android application crashes. In Proc. ICPC.
  43. A. Yamashita and L. Moonen. 2013. Do developers care about code smells? An exploratory survey. In Proc. WCRE.
  44. D. Yang, A. Hussain, and C. V. Lopes. 2016. From query to usable code: An analysis of Stack Overflow code snippets. In Proc. MSR.
  45. H. Zhang, S. Wang, T.-H. P. Chen, Y. Zou, and A. E. Hassan. 2019. An empirical study of obsolete answers on Stack Overflow. TSE (2019).

Published in

ISEC '22: Proceedings of the 15th Innovations in Software Engineering Conference
February 2022, 235 pages
ISBN: 9781450396189
DOI: 10.1145/3511430
            Copyright © 2022 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

            Publisher

            Association for Computing Machinery

            New York, NY, United States

            Publication History

            • Published: 24 February 2022


            Qualifiers

            • research-article
            • Research
            • Refereed limited

            Acceptance Rates

Overall Acceptance Rate: 76 of 315 submissions, 24%
