
A Model of Cognition: The Missing Cornerstone of Assessment

Essay · Educational Psychology Review

Abstract

When we rely on gains on some measure to support prescriptive claims, we are obligated to ensure that those measures are valid. Nearly 10 years after an influential National Research Council (2001) report on educational assessment identified an explicit model of cognition as one of three necessary components of a valid assessment system, we note that most measures still lack this fundamental cornerstone. In this paper, we draw attention to the construct modeling approach to assessment, which strives for coherence and consistency with a model of cognition in which student proficiency varies along a continuum of competence. We illustrate this approach in the context of an assessment of conceptual understanding of certain scientific phenomena given to undergraduates at a large public university.
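For readers unfamiliar with construct modeling, the "continuum of competence" view is typically operationalized with a Rasch-family item response model (e.g., Masters 1982; Wilson 2005, both cited below), in which each student's proficiency and each item's difficulty are located on a single latent scale. As an illustrative sketch only (the notation is ours, not the article's), the partial credit model gives the probability that student n earns score k on item i as

    P(X_{ni} = k \mid \theta_n) =
      \frac{\exp \sum_{j=1}^{k} (\theta_n - \delta_{ij})}
           {\sum_{m=0}^{M_i} \exp \sum_{j=1}^{m} (\theta_n - \delta_{ij})},
      \qquad k = 0, 1, \ldots, M_i,

where \theta_n is the student's proficiency, \delta_{ij} is the difficulty of the j-th step of item i, M_i is the item's maximum score, and the empty sum (k = 0) is defined as zero. Higher values of \theta_n shift probability mass toward higher score categories, which is exactly the continuum interpretation of proficiency described above.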

[Figures 1–9 appear in the full article; captions are not included in this preview.]

Notes

  1. By using the term internal validity, we mean to exclude for the moment discussion of evidence based on relations to other variables and evidence based on consequences of testing (American Educational Research Association et al. 1999).

References

  • Adams, R., & Wu, M. (2011). The construction and implementation of user-defined fit tests for use with marginal maximum likelihood estimation and generalised item response models. In N. J. S. Brown, B. Duckor, K. Draney, & M. Wilson (Eds.), Advances in Rasch measurement (Vol. 2). Maple Grove, MN: JAM Press.

  • Adams, R., Wilson, M., & Wang, W. (1997). The multidimensional random coefficients multinomial logit model. Applied Psychological Measurement, 21, 1–23.

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

  • Andrich, D. (1978a). Application of a psychometric rating model to ordered categories which are scored with successive integers. Applied Psychological Measurement, 2, 581–594.

  • Andrich, D. (1978b). A rating formulation for ordered response categories. Psychometrika, 43, 561–573.

  • Briggs, D. C., Alonzo, A. C., Schwab, C., & Wilson, M. (2006). Diagnostic assessment with ordered multiple-choice items. Educational Assessment, 11, 33–63.

  • Brown, N. J. S. (2005). The multidimensional measure of conceptual complexity. Berkeley, CA: University of California.

  • Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55–81.

  • Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121–152.

  • Cronbach, L. J. (1988). Five perspectives on validity argument. In H. Wainer & H. Braun (Eds.), Test validity (pp. 3–17). Hillsdale, NJ: Lawrence Erlbaum.

  • Crooks, T. J., Kane, M. T., & Cohen, A. S. (1996). Threats to the valid use of assessments. Assessment in Education: Principles, Policy & Practice, 3, 265–285.

  • De Boeck, P., & Wilson, M. (Eds.). (2004). Explanatory item response models: A generalized linear and nonlinear approach. New York: Springer-Verlag.

  • Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. Mahwah, NJ: Lawrence Erlbaum.

  • Fischer, G. (1983). Logistic latent trait models with linear constraints. Psychometrika, 48, 3–26.

  • Hickey, D. T., Zuiker, S. J., Taasoobshirazi, G., Schafer, N. J., & Michael, M. A. (2006). Balancing varied assessment functions to attain systemic validity: Three is the magic number. Studies in Educational Evaluation, 32, 180–201.

  • Jenkins, H. (2009). Confronting the challenges of participatory culture: Media education for the 21st century. Cambridge, MA: MIT Press.

  • Kane, M. T. (2001). Current concerns in validity theory. Journal of Educational Measurement, 38, 319–342.

  • Kane, M. T. (2006). Validation. In R. L. Brennan (Ed.), Educational measurement (4th ed., pp. 17–64). Lanham, MD: Rowman & Littlefield.

  • Linacre, J. (1990). Many-facet Rasch measurement. Chicago: MESA Press.

  • Marton, F. (1986). Phenomenography: A research approach to investigating different understandings of reality. Journal of Thought, 21, 29–49.

  • Masters, G. (1982). A Rasch model for partial credit scoring. Psychometrika, 47, 149–174.

  • Messick, S. (1989). Meaning and values in test validation: The science and ethics of assessment. Educational Researcher, 18, 5–11.

  • Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist, 50, 741–749.

  • Mislevy, R. J. (1996). Test theory reconceived. Journal of Educational Measurement, 33, 379–416.

  • Mislevy, R. J., & Riconscente, M. M. (2006). Evidence-centered assessment design. In S. M. Downing & T. M. Haladyna (Eds.), Handbook of test development (pp. 61–90). Mahwah, NJ: Lawrence Erlbaum.

  • Mislevy, R. J., Steinberg, L. S., Breyer, F. J., Almond, R. G., & Johnson, L. (2002a). Making sense of data from complex assessments. Applied Measurement in Education, 15, 363–389.

  • Mislevy, R. J., Wilson, M. R., Ercikan, K., & Chudowsky, N. (2002b). Psychometric principles in student assessment (CSE Technical Report). Los Angeles, CA: Center for the Study of Evaluation, University of California.

  • National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment, J. Pellegrino, N. Chudowsky, & R. Glaser (Eds.). Washington, DC: National Academies Press.

  • National Research Council. (2006). Systems for state science assessment. Committee on Test Design for K-12 Science Achievement, M. Wilson & M. Bertenthal (Eds.). Washington, DC: National Academies Press.

  • National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. Committee on Science Learning, Kindergarten Through Eighth Grade, R. A. Duschl, H. A. Schweingruber, & A. W. Shouse (Eds.). Washington, DC: National Academies Press.

  • Rasch, G. (1980). Probabilistic models for some intelligence and attainment tests. Chicago: University of Chicago Press. (Original work published 1960.)

  • Ruiz-Primo, M. A., Shavelson, R. J., Hamilton, L., & Klein, S. (2002). On the evaluation of systemic science education reform: Searching for instructional sensitivity. Journal of Research in Science Teaching, 39, 369–393.

  • Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15, 4–14.

  • Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57, 1–22.

  • Smith, C. L., Wiser, M., Anderson, C. W., & Krajcik, J. (2006). Implications of research on children's learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory. Measurement: Interdisciplinary Research and Perspectives, 4(1–2), 1–98.

  • Songer, N. B., Kelcey, B., & Gotwals, A. (2009). How and when does complex reasoning occur? Empirically driven development of a learning progression focused on complex reasoning about biodiversity. Journal of Research in Science Teaching, 46, 610–631.

  • Wilson, M. (1989). Saltus: A psychometric model of discontinuity in cognitive development. Psychological Bulletin, 105, 276–289.

  • Wilson, M. (1992). The ordered partition model: An extension of the partial credit model. Applied Psychological Measurement, 16, 309–325.

  • Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Lawrence Erlbaum.

  • Wilson, M. (2009). Measuring progressions: Assessment structures underlying a learning progression. Journal of Research in Science Teaching, 46, 716–730.

  • Wilson, M., & Adams, R. (1995). Rasch models for item bundles. Psychometrika, 60, 181–198.

  • Wright, B., & Masters, G. (1982). Rating scale analysis: Rasch measurement. Chicago: MESA Press.

  • Wu, M. (1997). The development and application of a fit test for use with marginal maximum likelihood estimation and generalized item response models. Master's thesis, University of Melbourne, Victoria, Australia.

  • Wu, M., Adams, R., Wilson, M., & Haldane, S. (2007). ACER ConQuest version 2.0: Generalised item response modelling software [Computer software and manual]. Camberwell, Australia: ACER Press.


Author information


Correspondence to Nathaniel J. S. Brown.


Cite this article

Brown, N.J.S., Wilson, M. A Model of Cognition: The Missing Cornerstone of Assessment. Educ Psychol Rev 23, 221–234 (2011). https://doi.org/10.1007/s10648-011-9161-z
