Comparing technology-related teacher professional development designs: a multilevel study of teacher and student impacts

  • Research Article
  • Published in Educational Technology Research and Development

Abstract

This article presents a quasi-experimental study comparing the impact of two technology-related teacher professional development (TTPD) designs, both aimed at helping junior high school science and mathematics teachers design online activities using the rapidly growing set of learning resources available on the Internet. The first TTPD design (tech-only) focused exclusively on the technology knowledge and skills needed to find, select, and design classroom activities with online resources, while the second (tech + pbl) coupled this technology knowledge with learning to design problem-based learning (PBL) activities for students. Both designs showed large pre-post gains for teacher participants (N = 36) in self-reported knowledge, skills, and technology integration. Significant interaction effects showed that teachers in the tech + pbl group had larger gains in self-reported knowledge and in externally rated use of PBL. Three generalized estimating equation (GEE) models were fit to study the impact on students’ (N = 1,247) self-reported gains in behavior, knowledge, and attitudes. In the resulting models, students of tech + pbl teachers showed significant increases in gain scores for all three outcomes; by contrast, students of tech-only teachers showed improved gains only in attitudes.


Acknowledgments

This material is based upon work supported by the National Science Foundation under grant #0937630. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. We thank the district science coordinator, participating teachers, and students in our study.

Author information


Corresponding author

Correspondence to Andrew Walker.

Appendix A

| Criteria | Not present (0) | Emerging (1) | Present (2) |
|---|---|---|---|
| **Authentic problem** | | | |
| Cross-disciplinary | Content draws from a single discipline (e.g., statistics) | Content draws from two closely related disciplines (e.g., statistics and algebra) | Content draws from a diverse set of disciplines, reflecting the kind of complexity found in real-life settings (e.g., statistics and rhetoric) |
| Ill-structured | Learners are provided with clear directions | Learners are provided with parameters but need to make some decisions about how to proceed | Learners need to act within parameters and are faced with competing constraints, forcing a “satisficing” solution (e.g., students are asked to pick food that is cheap as well as healthy) |
| Real life | No ties to real-life practice | Attempted ties to real-life practice: something done by professionals, or authentic for students | Learning is clearly tied to real-life practice; for example, the problem is phrased in the first person for students, and they are given artifacts associated with the problem |
| Begins with a problem | No contextual problem is presented to learners | Learners are asked to solve a contextual problem (content first) | Learners are asked to solve a contextual problem (problem first, then content) |
| **Learning processes** | | | |
| Learning goals | Students play no role in deciding what to learn | Students have limited choice about what to learn | Students choose the majority of what they learn |
| Resource utilization | Learners are not prompted to locate or use any resources | Learners are asked to search for resources or utilize provided resources | Learners are asked to search for resources or utilize provided resources, and are additionally encouraged to pay attention to the quality of the resources they find or use |
| Reflection | Learners are not asked to reflect | Learners are asked to discuss what they have found or to judge the merits of their own actions or the actions of their peers | Learners are asked to discuss what they found and to judge the merits of their own actions or the actions of their peers |
| **Facilitator** | | | |
| Metacognition | Unclear exactly what facilitators do during the activity | As part of the activity, facilitators engage in some metacognitive prompts | As part of the activity, facilitators focus their efforts on providing metacognitive prompts (e.g., How helpful is your current line of reasoning? What do you need to do next? Can you summarize our discussion to this point?) |
| Information source | Facilitators are the primary source of information, which comes directly from the instructor or from a mandated set of materials | Information comes partly from facilitators and is partly found by learners | Information is found primarily by learners; sources include searching, or distilling relevant information from a larger set of provided materials |
| **Group work** | | | |
| Learners interact in groups | The learning experience is done individually | Parts of the learning are done individually and parts are done as a group | The majority of the learning is done in groups |
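As an illustration of how the Appendix A rubric could be aggregated (the article does not specify a scoring procedure, so the simple sum below is an assumption), each of the ten criteria is rated 0, 1, or 2, giving an activity a total PBL score between 0 and 20:

```python
# Hypothetical aggregation of the Appendix A rubric: ten criteria, each rated
# 0 (not present), 1 (emerging), or 2 (present). The criterion names and the
# summing scheme are illustrative, not taken from the article.
CRITERIA = [
    "cross_disciplinary", "ill_structured", "real_life", "begins_with_problem",
    "learning_goals", "resource_utilization", "reflection",
    "metacognition", "information_source", "group_work",
]

def pbl_score(ratings: dict) -> int:
    """Sum the 0-2 ratings across all ten rubric criteria (range 0-20)."""
    for name in CRITERIA:
        if ratings.get(name) not in (0, 1, 2):
            raise ValueError(f"{name!r} must be rated 0, 1, or 2")
    return sum(ratings[name] for name in CRITERIA)

# Example: an activity rated "emerging" on every criterion except real life.
example = {name: 1 for name in CRITERIA}
example["real_life"] = 2
print(pbl_score(example))  # 11
```

With two independent raters producing such scores, agreement could then be checked with a weighted kappa or intraclass correlation, as is conventional for ordinal rubrics.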


Cite this article

Walker, A., Recker, M., Ye, L. et al. Comparing technology-related teacher professional development designs: a multilevel study of teacher and student impacts. Education Tech Research Dev 60, 421–444 (2012). https://doi.org/10.1007/s11423-012-9243-8
