DOI: 10.1145/1953163.1953299
research-article

CodeWrite: supporting student-driven practice of Java

Published: 09 March 2011

ABSTRACT

Drill and practice exercises enable students to master the skills needed for more sophisticated programming. A barrier to providing such activities is the effort required to set up the programming environment. Testing is an important component of writing good software, but it is difficult to motivate students to write tests. In this paper we describe and evaluate CodeWrite, a web-based tool that provides drill and practice support for Java programming, and in which testing plays a central role. We describe how we have used CodeWrite in a CS1 course, and demonstrate its effectiveness in providing good coverage of the language features presented in the course.


Published in

SIGCSE '11: Proceedings of the 42nd ACM Technical Symposium on Computer Science Education
March 2011, 754 pages
ISBN: 9781450305006
DOI: 10.1145/1953163

Copyright © 2011 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

SIGCSE '11 paper acceptance rate: 107 of 315 submissions (34%)
Overall acceptance rate: 1,595 of 4,542 submissions (35%)
