
Assessment Practices and Students' Knowledge Profiles in a Problem-based Curriculum

Learning Environments Research

Abstract

Since the mid-1980s, many new terms have enriched the assessment literature, such as performance assessment, authentic assessment, direct assessment and curriculum-embedded assessment. This boom results from changes in instructional as well as assessment approaches. Criteria for good instruction and good assessment practices are suggested, derived from research-based models in cognitive psychology and expert-novice studies. This article first reports on the translation of these criteria into a set of characteristics of the assessment system of a problem-based curriculum in the field of Economics and Business Administration. Secondly, the article reports a study on improving assessment practices. Is it important to map students' knowledge profiles when attempting to remediate problem-solving performance? The answer to this question depends on the extent to which a student's problem-solving performance is influenced by the quality of his/her knowledge profile. Students' knowledge profiles are measured by a Knowledge Test and a Sorting Task; their problem-solving skills are assessed by an OverAll Test. The results indicate that students with an organised knowledge base perform better in problem-solving situations than students whose conceptual models are loosely structured. The implications of these findings for instruction as well as for assessment are discussed.



Cite this article

Segers, M., Dochy, F. & De Corte, E. Assessment Practices and Students Knowledge Profiles in a Problem-based Curriculum. Learning Environments Research 2, 191–213 (1999). https://doi.org/10.1023/A:1009932125947
