A Systematic Literature Review on Empirical Analysis of the Relationship Between Code Smells and Software Quality Attributes

  • Original Paper
  • Published in: Archives of Computational Methods in Engineering

Abstract

Code smells indicate problems in design or code that make software hard to change and maintain; they have become a recognised symptom of software systems whose quality is difficult to sustain. Detecting the harmful code smells that deteriorate software quality has therefore attracted growing interest among researchers, and a significant body of research analysing the impact of code smells on software quality has been conducted over the last few years. This study reports a systematic literature review of the existing empirical studies that investigate the impact of code smells on software quality attributes. The results indicate that this impact is not uniform: different code smells have opposite effects on different software quality attributes. The findings of this review provide researchers and practitioners with awareness of how code smells affect software quality. Further studies would be more advantageous if they considered less explored code smells and rarely or never investigated quality attributes, involved industry researchers, and used large commercial software systems.
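
As a minimal illustration (not taken from any of the reviewed studies), the following Java sketch shows the Feature Envy smell from Fowler and Beck's catalogue: a method that operates almost entirely on another class's data, so changes to that class tend to ripple into it. The class and member names are hypothetical.

    // Illustrative sketch of the Feature Envy code smell (hypothetical classes).
    class Order {
        double unitPrice;
        int quantity;
        double discountRate;
    }

    class InvoicePrinter {
        // Smelly: the computation reads only Order's fields, so it conceptually
        // belongs on Order; a Move Method refactoring would remove the smell.
        double lineTotal(Order o) {
            return o.unitPrice * o.quantity * (1.0 - o.discountRate);
        }
    }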


Notes

  1. https://www.bugzilla.org/.

  2. https://www.atlassian.com/software/jira.

  3. http://trac.edgewall.org.

References

  1. Booch G, Maksimchuk RA, Engle MW, Young BJ, Conallen J, Houston KA (2006) Object-oriented analysis and design with applications, 3rd edn. Addison-Wesley, Upper Saddle River

  2. Tufano M, Palomba F, Bavota G, Oliveto R, Di Penta M, De Lucia A, Poshyvanyk D (2015, May) When and why your code starts to smell bad. In: 37th IEEE international conference on software engineering (ICSE). IEEE, vol 1, pp 403–414

  3. Fontana FA, Braione P, Zanoni M (2012) Automatic detection of bad smells in code: an experimental assessment. J Object Technol 11(2):5–13

  4. Abdelmoez W, Kosba E, Iesa AF (2014, January) Risk-based code smells detection tool. In: The international conference on computing technology and information management (ICCTIM). Society of Digital Information and Wireless Communication, pp 148–159

  5. Opdyke WF (1992) Refactoring object-oriented frameworks. Ph.D. Thesis. University of Illinois at Urbana-Champaign Illinois

  6. Mathur N (2011) Java smell detector, Master’s projects. San Jose State University 173:1–127

  7. Mantyla M, Vanhanen J, Lassenius C (2003, September) A taxonomy and an initial empirical study of bad smells in code. In: Proceedings of international conference on software maintenance (ICSM). IEEE, pp 381–384

  8. Fowler M, Beck K (1999) Refactoring: improving the design of existing code. Addison-Wesley, Upper Saddle River

  9. Mäntylä MV, Lassenius C (2006) Subjective evaluation of software evolvability using code smells: an empirical study. Empir Softw Eng 11(3):395–431

  10. Wake WC (2004) Refactoring workbook. Addison Wesley Longman Publishing Company, Boston

  11. Rasool G, Arshad Z (2015) A review of code smell mining techniques. J Softw Evol Process 27(11):867–895

  12. Zhang M, Hall T, Baddoo N (2011) Code bad smells: a review of current knowledge. J Softw Evol Process 23(3):179–202

  13. Khomh F, Vaucher S, Guéhéneuc Y, Sahraoui H (2011) BDTEX: a GQM-based Bayesian approach for the detection of antipatterns. J Syst Softw 84(4):559–572

  14. Kumar S, Chhabra JK (2014) Two level dynamic approach for Feature Envy detection. In: 2014 International conference on computer and communication technology (ICCCT). IEEE, pp 41–46

  15. Rapu D, Ducasse S, Gîrba T, Marinescu R (2004, March) Using history information to improve design flaws detection. In: Proceedings of eighth European conference on software maintenance and reengineering (CSMR). IEEE, pp 223–232

  16. Ligu E, Chatzigeorgiou A, Chaikalis T, Ygeionomakis N (2013, September) Identification of refused bequest code smells. In: 29th international conference on software maintenance (ICSM). IEEE, pp 392–395

  17. Wangberg R (2010) A literature review on code smells and refactoring. Master’s thesis

  18. Gupta A, Suri B, Misra S (2017) A systematic literature review: code bad smells in java source code. In: International conference on computational science and its applications. Springer, pp 665–682

  19. Sharma T, Spinellis D (2017) A survey on software smells. J Syst Softw 138:158–173

  20. Haque MS, Carver J, Atkison T (2018, March) Causes, impacts, and detection approaches of code smell: a survey. In: Proceedings of the ACMSE 2018 conference. ACM, p 25

  21. Kitchenham B, Charters S (2007) Guidelines for performing systematic literature reviews in software engineering. In: Technical report, Ver. 2.3 EBSE

  22. Cohen J (1960) A coefficient of agreement for nominal scales. Educational and Psychological Measurement 20:37–46

  23. Landis J, Koch G (1977) Measurement of observer agreement for categorical data. Biometrics 33:159–174

  24. http://www.gdeepak.com/project.php

  25. Morales R, McIntosh S, Khomh F (2015) Do code review practices impact design quality? a case study of the QT, VTK, and ITK projects. In: 22nd International conference on software analysis, evolution and reengineering (SANER). IEEE, pp 171–180

  26. Radjenović D, Heričko M, Torkar R, Živkovič A (2013) Software fault prediction metrics: a systematic literature review. Inf Softw Technol 55(8):1397–1418

  27. ISO/IEC-25010 (2010) Systems and software engineering–systems and software quality requirements and evaluation (SQuaRE)–system and software quality models. International Organization for Standardization

  28. Chidamber SR, Kemerer CF (1994) A metrics suite for object oriented design. IEEE Trans Softw Eng 20(6):476–493

  29. Krinke J (2007, October) A study of consistent and inconsistent changes to code clones. In: 14th Working conference on reverse engineering (WCRE). IEEE, pp 170–178

  30. Krinke J (2008, September) Is cloned code more stable than non-cloned code? In: Eighth IEEE international working conference on source code analysis and manipulation. IEEE, pp 57–66

  31. Krinke J (2011, May) Is cloned code older than non-cloned code? In: Proceedings of the 5th international workshop on software clones. ACM, pp 28–33

  32. Lozano A, Wermelinger M (2010, May) Tracking clones’ imprint. In: Proceedings of the 4th international workshop on software clones. ACM, pp 65–72

  33. Hotta K, Sano Y, Higo Y, Kusumoto S (2010, September) Is duplicate code more frequently modified than non-duplicate code in software evolution? an empirical study on open source software. In: Proceedings of the joint ERCIM workshop on software evolution (EVOL) and international workshop on principles of software evolution (IWPSE). ACM, pp 73–82

  34. Saha RK, Asaduzzaman M, Zibran MF, Roy CK, Schneider KA (2010, September) Evaluating code clone genealogies at release level: an empirical study. In: 10th IEEE working conference on source code analysis and manipulation (SCAM). IEEE, pp 87–96

  35. Abreu FB, Goulão M, Esteves R (1995, October) Toward the design quality evaluation of object-oriented software systems. In: Proceedings of the 5th international conference on software quality, Austin, Texas, USA, pp 44–57

  36. Abreu FB, Melo W (1996, March) Evaluating the impact of object-oriented design on software quality. In: Proceedings of the 3rd international software metrics symposium. IEEE, pp 90–99

  37. Lorenz M, Kidd J (1994) Object-oriented software metrics: a practical guide. Prentice-Hall, Upper Saddle River

  38. Buse RP, Weimer WR (2010) Learning a metric for code readability. IEEE Trans Softw Eng 36(4):546–558

  39. Jabangwe R, Börstler J, Šmite D, Wohlin C (2015) Empirical evidence on the link between object-oriented measures and external quality attributes: a systematic literature review. Empir Softw Eng 20(3):640–693

  40. Moha N, Gueheneuc YG, Duchien L, Le Meur AF (2010) DECOR: a method for the specification and detection of code and design smells. IEEE Trans Softw Eng 36(1):20–36

  41. Runeson P, Host M, Rainer A, Regnell B (2012) Case study research in software engineering: guidelines and examples. Wiley, New York

Primary Studies References

  • [S1] Deligiannis, I., Stamelos, I., Angelis, L., Roumeliotis, M., & Shepperd, M. (2004). A controlled experiment investigation of an object-oriented design heuristic for maintainability. Journal of Systems and Software, 72(2), 129–143

  • [S2] Bavota, G., & Russo, B. (2016, May). A large-scale empirical study on self-admitted technical debt. In Proceedings of the 13th International Conference on Mining Software Repositories, ACM, 315–326.

  • [S3] Wagey, B. C., Hendradjaya, B., & Mardiyanto, M. S. (2015, November). A proposal of software maintainability model using code smell measurement. In International Conference on Data and Software Engineering (ICoDSE), IEEE, 25–30

  • [S4] Miyake, Y., Amasaki, S., Aman, H., & Yokogawa, T. (2017). A Replicated Study on Relationship Between Code Quality and Method Comments. In Applied Computing and Information Technology, Springer, 17–30

  • [S5] Sabané, A., Di Penta, M., Antoniol, G., & Guéhéneuc, Y. G. (2013, March). A study on the relation between antipatterns and the cost of class unit testing. In 17th European Conference on Software Maintenance and Reengineering (CSMR), IEEE, 167–176.

  • [S6] Aman, H. (2012, October). An empirical analysis on fault-proneness of well-commented modules. In Fourth International Workshop on Empirical Software Engineering in Practice (IWESEP), IEEE, 3–9

  • [S7] Deligiannis, I., Shepperd, M., Roumeliotis, M., & Stamelos, I. (2003). An empirical investigation of an object-oriented design heuristic for maintainability. Journal of Systems and Software, 65(2), 127–139

  • [S8] Kim, M., Sazawal, V., Notkin, D., & Murphy, G. (2005, September). An empirical study of code clone genealogies. In ACM SIGSOFT Software Engineering Notes, ACM, 30(5), 187–196

  • [S9] Li, W., & Shatnawi, R. (2007). An empirical study of the bad smells and class error probability in the post-release object-oriented system evolution. Journal of Systems and Software, 80(7), 1120–1128

  • [S10] Abbes, M., Khomh, F., Gueheneuc, Y. G., & Antoniol, G. (2011, March). An empirical study of the impact of two antipatterns, blob and spaghetti code, on program comprehension. In 15th European conference on Software maintenance and reengineering (CSMR), IEEE, 181–190

  • [S11] Mondal, M., Rahman, M. S., Saha, R. K., Roy, C. K., Krinke, J., & Schneider, K. A. (2011, June). An empirical study of the impacts of clones in software maintenance. In 19th International Conference on Program Comprehension (ICPC), IEEE, 242–245

  • [S12] Bettenburg, N., Shang, W., Ibrahim, W., Adams, B., Zou, Y., & Hassan, A. E. (2009, October). An empirical study on inconsistent changes to code clones at release level. In 16th Working Conference on Reverse Engineering (WCRE’09), IEEE, 85–94

  • [S13] Thummalapenta, S., Cerulo, L., Aversano, L., & Di Penta, M. (2010). An empirical study on the maintenance of source code clones. Empirical Software Engineering, 15(1), 1–34

  • [S14] Khomh, F., Di Penta, M., Guéhéneuc, Y. G., & Antoniol, G. (2012). An exploratory study of the impact of antipatterns on class change- and fault-proneness. Empirical Software Engineering, 17(3), 243–275

  • [S15] Khomh, F., Di Penta, M., & Gueheneuc, Y. G. (2009, October). An exploratory study of the impact of code smells on software change-proneness. In 16th Working Conference on Reverse Engineering, (WCRE’09), IEEE, 75–84

  • [S16] Romano, D., Raila, P., Pinzger, M., & Khomh, F. (2012, October). Analyzing the impact of antipatterns on change-proneness using fine-grained source code changes. In 19th Working Conference on Reverse Engineering (WCRE), IEEE, 437–446

  • [S17] Olbrich, S. M., Cruzes, D. S., & Sjøberg, D. I. (2010, September). Are all code smells harmful? A study of God Classes and Brain Classes in the evolution of three open source systems. In International Conference on Software Maintenance (ICSM), IEEE, 1–10

  • [S18] Marinescu, R., & Marinescu, C. (2011, September). Are the clients of flawed classes (also) defect prone?, In 11th IEEE International Working Conference on Source Code Analysis and Manipulation (SCAM), IEEE, 65–74

  • [S19] Marinescu, R. (2012). Assessing technical debt by identifying design flaws in software systems. IBM Journal of Research and Development, 56(5), 9:1–9:13

  • [S20] Lague, B., Proulx, D., Mayrand, J., Merlo, E. M., & Hudepohl, J. (1997, October). Assessing the benefits of incorporating function clone detection in a development process. In Proceedings of International Conference on Software Maintenance (ICSM), IEEE, 314–321

  • [S21] Yamashita, A. (2014). Assessing the capability of code smells to explain maintenance problems: an empirical study combining quantitative and qualitative data. Empirical Software Engineering, 19(4), 1111–1143

  • [S22] Lozano, A., & Wermelinger, M. (2008, September). Assessing the effect of clones on changeability. In International Conference on Software Maintenance (ICSM), IEEE, 227–236

  • [S23] Rahman, F., Bird, C., & Devanbu, P. (2012). Clones: What is that smell? Empirical Software Engineering, 17(4–5), 503–530

  • [S24] Kapser, C. J., & Godfrey, M. W. (2008). “Cloning considered harmful” considered harmful: patterns of cloning in software. Empirical Software Engineering, 13(6), 645

  • [S25] Danphitsanuphan, P., & Suwantada, T. (2012, May). Code smell detecting tool and code smell-structure bug relationship. In Spring Congress on Engineering and Technology (S-CET), IEEE, 1–5

  • [S26] Yamashita, A., & Counsell, S. (2013). Code smells as system-level indicators of maintainability: An empirical study. Journal of Systems and Software, 86(10), 2639–2653

  • [S27] Juergens, E., Deissenboeck, F., Hummel, B., & Wagner, S. (2009, May). Do code clones matter?. In 31st International Conference on Software Engineering (ICSE), IEEE, 485–495

  • [S28] Soh, Z., Yamashita, A., Khomh, F., & Guéhéneuc, Y. G. (2016, March). Do Code Smells Impact the Effort of Different Maintenance Programming Activities?. In 23rd International Conference on Software Analysis, Evolution, and Reengineering (SANER), IEEE, 1, 393–402

  • [S29] Yamashita, A., & Moonen, L. (2012, September). Do code smells reflect important maintainability aspects?. In 2012 28th IEEE international conference on software maintenance (ICSM) (pp. 306–315). IEEE

  • [S30] Palomba, F., Bavota, G., Di Penta, M., Oliveto, R., & De Lucia, A. (2014, September). Do they really smell bad? a study on developers’ perception of bad code smells. In international conference on Software maintenance and evolution (ICSME), IEEE, 101–110

  • [S31] Linares-Vásquez, M., Klock, S., McMillan, C., Sabané, A., Poshyvanyk, D., & Guéhéneuc, Y. G. (2014, June). Domain matters: bringing further evidence of the relationships among anti-patterns, application domains, and quality-related metrics in Java mobile apps. In Proceedings of the 22nd International Conference on Program Comprehension, 232–243

  • [S32] Chatterji, D., Carver, J. C., Kraft, N. A., & Harder, J. (2013, October). Effects of cloned code on software maintainability: A replicated developer study. In 20th Working Conference on Reverse Engineering (WCRE), IEEE, 112–121

  • [S33] Aman, H., Amasaki, S., Sasaki, T., & Kawahara, M. (2015, October). Empirical analysis of change-proneness in methods having local variables with long names and comments. In International Symposium on Empirical Software Engineering and Measurement (ESEM), ACM/IEEE, 1–4

  • [S34] Aman, H., Amasaki, S., Sasaki, T., & Kawahara, M. (2014, December). Empirical analysis of fault-proneness in methods by focusing on their comment lines. In 21st Asia-Pacific Software Engineering Conference (APSEC), IEEE, 2, 51–56

  • [S35] Jaafar, F., Guéhéneuc, Y. G., Hamel, S., Khomh, F., & Zulkernine, M. (2016). Evaluating the impact of design pattern and anti-pattern dependencies on changes and faults. Empirical Software Engineering, 21(3), 896–931

  • [S36] Wehaibi, S., Shihab, E., & Guerrouj, L. (2016, March). Examining the impact of self-admitted technical debt on software quality. In 23rd International Conference on Software Analysis, Evolution, and Reengineering (SANER), IEEE, 1, 179–188

  • [S37] Yamashita, A., & Moonen, L. (2013, May). Exploring the impact of inter-smell relations on software maintainability: An empirical study. In 35th International Conference on Software Engineering (ICSE), IEEE, 682–691

  • [S38] Yamashita, A. (2013, September). How Good Are Code Smells for Evaluating Software Maintainability? Results from a Comparative Case Study. In International Conference on Software Maintenance (ICSM), IEEE, 566–571

  • [S39] Fontana, F. A., Ferme, V., & Spinelli, S. (2012, June). Investigating the impact of code smells debt on quality code evaluation. In Third International Workshop on Managing Technical Debt (MTD), IEEE, 15–22

  • [S40] Fontana, F. A., Ferme, V., Marino, A., Walter, B., & Martenka, P. (2013, September). Investigating the impact of code smells on system’s quality: An empirical study on systems of different application domains. In 29th International Conference on Software Maintenance (ICSM), IEEE, 260–269

  • [S41] Zazworka, N., Shaw, M. A., Shull, F., & Seaman, C. (2011, May). Investigating the impact of design debt on software quality. In Proceedings of the 2nd Workshop on Managing Technical Debt, ACM, 17–23

  • [S42] Guerrouj, L., Kermansaravi, Z., Arnaoudova, V., Fung, B. C., Khomh, F., Antoniol, G., & Guéhéneuc, Y. G. (2015). Investigating the relation between lexical smells and change- and fault-proneness: an empirical study. Software Quality Journal, 1–30

  • [S43] Aman, H., Amasaki, S., Sasaki, T., & Kawahara, M. (2015). Lines of comments as a noteworthy metric for analyzing fault-proneness in methods. IEICE Transactions on Information and Systems, 98(12), 2218–2228

  • [S44] Aman, H., Amasaki, S., Yokogawa, T., & Kawahara, M. Local Variables with Compound Names and Comments as Signs of Fault-Prone Java Methods. In 4th International Workshop on Quantitative Approaches to Software Quality, 5–11

  • [S45] Jaafar, F., Guéhéneuc, Y. G., Hamel, S., & Khomh, F. (2013, October). Mining the relationship between anti-patterns dependencies and fault-proneness. In 20th Working Conference on Reverse Engineering (WCRE), IEEE, 351–360

  • [S46] D’Ambros, M., Bacchelli, A., & Lanza, M. (2010, July). On the impact of design flaws on software defects. In 10th International Conference on Quality Software (QSIC), IEEE, 23–31

  • [S47] Sjøberg, D. I., Yamashita, A., Anda, B. C., Mockus, A., & Dybå, T. (2013). Quantifying the effect of code smells on maintenance effort. IEEE Transactions on Software Engineering, 39(8), 1144–1156

  • [S48] Bán, D., & Ferenc, R. (2014, June). Recognizing Antipatterns and Analyzing Their Effects on Software Maintainability. In International Conference on Computational Science and Its Applications, Springer International Publishing, 337–352

  • [S49] Monden, A., Nakae, D., Kamiya, T., Sato, S. I., & Matsumoto, K. I. (2002). Software quality analysis by code clones in industrial legacy software. In Proceedings of Eighth IEEE Symposium on Software Metrics, IEEE, 87–94

  • [S50] Hall, T., Zhang, M., Bowes, D., & Sun, Y. (2014). Some code smells have a significant but small effect on faults. ACM Transactions on Software Engineering and Methodology (TOSEM), 23(4), 33

  • [S51] Olbrich, S., Cruzes, D. S., Basili, V., & Zazworka, N. (2009, October). The evolution and impact of code smells: A case study of two open source systems. In Proceedings of the 2009 3rd international symposium on empirical software engineering and measurement, IEEE, 390–400

  • [S52] Yamashita, A., & Moonen, L. (2013). To what extent can maintenance problems be predicted by code smell detection?–An empirical study. Information and Software Technology, 55(12), 2223–2242

  • [S53] Marinescu, C., Stoenescu, Ş., & Fortiş, T. F. (2014, July). Towards the Impact of Design Flaws on the Resources Used by an Application. In International Workshop on Adaptive Resource Management and Scheduling for Cloud Computing, Springer, 180–192

  • [S54] Jiang, L., Su, Z., & Chiu, E. (2007, September). Context-based detection of clone-related bugs. In Proceedings of the the 6th joint meeting of the European software engineering conference and the ACM SIGSOFT symposium on The foundations of software engineering (pp. 55–64). ACM

  • [S55] Jaafar, F., Lozano, A., Guéhéneuc, Y. G., & Mens, K. (2017). Analyzing software evolution and quality by extracting Asynchrony change patterns. Journal of Systems and Software, 131, 311–322

  • [S56] Palomba, F., Bavota, G., Di Penta, M., Fasano, F., Oliveto, R., & De Lucia, A. (2018). On the diffuseness and the impact on maintainability of code smells: a large scale empirical investigation. Empirical Software Engineering, 23(3), 1188–1221

  • [S57] Husien, H. K., Harun, M. F., & Lichter, H. (2017). Towards a Severity and Activity based Assessment of Code Smells. Procedia Computer Science, 116, 460–467

  • [S58] Mondal, M., Rahman, M. S., Roy, C. K., & Schneider, K. A. (2018). Is cloned code really stable? Empirical Software Engineering, 23(2), 693–770

  • [S59] Elish, M. O. (2017, July). On the association between code cloning and fault-proneness: An empirical investigation. In Computing Conference, 2017 (pp. 928–935). IEEE

  • [S60] Zhang, X., Zhou, Y., & Zhu, C. (2017, November). An Empirical Study of the Impact of Bad Designs on Defect Proneness. In Software Analysis, Testing and Evolution (SATE), 2017 International Conference on (pp. 1–9). IEEE

  • [S61] Islam, J. F., Mondal, M., Roy, C. K., & Schneider, K. A. (2017). A comparative study of software bugs in clone and non-clone code. In Proc. SEKE (pp. 436–443)

  • [S62] Palomba, F., Zanoni, M., Fontana, F. A., De Lucia, A., & Oliveto, R. (2017). Toward a smell-aware bug prediction model. IEEE Transactions on Software Engineering

  • [S63] Jaafar, F., Lozano, A., Guéhéneuc, Y. G., & Mens, K. (2017, July). On the Analysis of Co-Occurrence of Anti-Patterns and Clones. In Software Quality, Reliability and Security (QRS), 2017 IEEE International Conference on (pp. 274–284). IEEE

  • [S64] Mondal, M., Roy, C. K., & Schneider, K. A. (2017, September). Bug propagation through code cloning: An empirical study. In Software Maintenance and Evolution (ICSME), 2017 IEEE International Conference on (pp. 227–237). IEEE

  • [S65] Rahman, M. S., & Roy, C. K. (2017, September). On the relationships between stability and bug-proneness of code clones: An empirical study. In Source Code Analysis and Manipulation (SCAM), 2017 IEEE 17th International Working Conference on (pp. 131–140). IEEE

  • [S66] Aman, H., Amasaki, S., Yokogawa, T., & Kawahara, M. (2017, August). Empirical Analysis of Words in Comments Written for Java Methods. In Software Engineering and Advanced Applications (SEAA), 2017 43rd Euromicro Conference on (pp. 375–379). IEEE

  • [S67] Bán, D. (2017). The connection between antipatterns and maintainability in Firefox. Acta Cybernetica, 23(2), 471–490

  • [S68] Chen, Z., Chen, L., Ma, W., Zhou, X., Zhou, Y., & Xu, B. (2018). Understanding metric-based detectable smells in Python software: A comparative study. Information and Software Technology, 94, 14–29

  • [S69] Selim, G. M., Barbour, L., Shang, W., Adams, B., Hassan, A. E., & Zou, Y. (2010, October). Studying the impact of clones on software defects. In Reverse Engineering (WCRE), 2010 17th Working Conference on (pp. 13–21). IEEE

  • [S70] Barbour, L., An, L., Khomh, F., Zou, Y., & Wang, S. (2017). An investigation of the fault-proneness of clone evolutionary patterns. Software Quality Journal, 1–36

  • [S71] Geiger, R., Fluri, B., Gall, H. C., & Pinzger, M. (2006, March). Relation of code clones and change couplings. In International Conference on Fundamental Approaches to Software Engineering (pp. 411–425). Springer, Berlin, Heidelberg

  • [S72] Kamei, Y., Sato, H., Monden, A., Kawaguchi, S., Uwano, H., Nagura, M., … & Ubayashi, N. (2011, November). An empirical study of fault prediction with code clone metrics. In Software Measurement, 2011 Joint Conference of the 21st Int’l Workshop on and 6th Int’l Conference on Software Process and Product Measurement (IWSM-MENSURA) (pp. 55–61). IEEE

  • [S73] Taba, S. E. S., Khomh, F., Zou, Y., Hassan, A. E., & Nagappan, M. (2013, September). Predicting bugs using antipatterns. In Software Maintenance (ICSM), 2013 29th IEEE International Conference on (pp. 270–279). IEEE

  • [S74] Saboury, A., Musavi, P., Khomh, F., & Antoniol, G. (2017, February). An empirical study of code smells in javascript projects. In 2017 IEEE 24th international conference on software analysis, evolution and reengineering (SANER) (pp. 294–305). IEEE

Acknowledgements

The author would like to acknowledge Kitchenham and Charters for providing the guidelines for conducting a systematic literature review. In addition, the author would like to thank Ms. Satnam Kaur and Dr. Gaurav Dhiman for their valuable time and support in collecting and analyzing the required data.

Author information

Corresponding author

Correspondence to Amandeep Kaur.

Ethics declarations

Conflict of interest

The author declares that she has no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Cite this article

Kaur, A. A Systematic Literature Review on Empirical Analysis of the Relationship Between Code Smells and Software Quality Attributes. Arch Computat Methods Eng 27, 1267–1296 (2020). https://doi.org/10.1007/s11831-019-09348-6
