
Can computers compare student code solutions as well as teachers?

Published: 05 March 2014

ABSTRACT

In introductory programming courses it is common to ask students to complete exercises based on the production of code. However, it is difficult for the teacher to give students fast feedback about the main solutions attempted, the main errors, and the drawbacks and advantages of particular solutions. If automatic code comparison algorithms could be used to build visualisation tools that support the teacher in analysing how each submitted solution is similar to or different from the others, such information could be obtained rapidly. But can computers compare students' code solutions as well as teachers? In this work we present an experiment in which we asked teachers to compare different code solutions to the same problem. We then evaluated the level of agreement between each teacher's comparison strategy and several algorithms commonly used for plagiarism detection and automatic grading. For one of the teachers, agreement with the algorithms was at most 77% (and at least 75%); for most of the teachers, however, the maximum agreement rate was over 90% for at least one of the automatic code comparison strategies. We also found that the level of agreement among the teachers themselves, regarding their personal strategies for comparing student solutions, ranged from 62% to 95%, which shows that there may be more agreement between a teacher and an algorithm than between a teacher and one of her colleagues. The results also suggest that comparing students' code has significant potential to be automated to help teachers in their work.
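The abstract does not specify which comparison algorithms were evaluated beyond noting that they come from plagiarism detection and automatic grading. Purely as an illustration of the kind of token-based comparison such tools often use, the following minimal Python sketch scores the lexical similarity of student solutions with cosine similarity over token counts; the sample solutions and all names are hypothetical, not taken from the study.

```python
import io
import tokenize
from collections import Counter
from itertools import combinations
from math import sqrt

def token_counts(source: str) -> Counter:
    # Crude lexical fingerprint: count every non-whitespace token string.
    toks = tokenize.generate_tokens(io.StringIO(source).readline)
    return Counter(t.string for t in toks if t.string.strip())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity of two token-count vectors:
    # 0 = no tokens in common, 1 = identical token distributions.
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical student solutions to the same exercise.
solutions = {
    "student_a": "def total(xs):\n    s = 0\n    for x in xs:\n        s += x\n    return s\n",
    "student_b": "def total(xs):\n    return sum(xs)\n",
    "student_c": "def total(values):\n    acc = 0\n    for v in values:\n        acc = acc + v\n    return acc\n",
}

# Pairwise similarity scores of the kind a visualisation tool could be built on.
for (n1, s1), (n2, s2) in combinations(solutions.items(), 2):
    print(f"{n1} vs {n2}: {cosine(token_counts(s1), token_counts(s2)):.2f}")
```

A production tool would likely normalise identifiers or compare abstract syntax trees rather than raw tokens, so that student_a and student_c, which differ only in variable names, would score as near-identical.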


    • Published in

      SIGCSE '14: Proceedings of the 45th ACM Technical Symposium on Computer Science Education
      March 2014
      800 pages
      ISBN: 9781450326056
      DOI: 10.1145/2538862

      Copyright © 2014 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Qualifiers

      • research-article

      Acceptance Rates

      SIGCSE '14 Paper Acceptance Rate: 108 of 274 submissions, 39%
      Overall Acceptance Rate: 1,595 of 4,542 submissions, 35%
