
Using multidimensional Rasch analysis to validate the Chinese version of the Motivated Strategies for Learning Questionnaire (MSLQ-CV)

European Journal of Psychology of Education

Abstract

This article used the multidimensional random coefficients multinomial logit model to examine the construct validity of the Chinese version of the Motivated Strategies for Learning Questionnaire (MSLQ-CV) and to detect items with substantial differential item functioning (DIF). The MSLQ-CV was administered to 1,354 Hong Kong junior high school students. The partial credit model showed a better goodness of fit than the rating scale model. Five items with substantial gender or grade DIF were removed from the questionnaire. The correlations between the subscales indicated that cognitive strategy use and self-regulation were very highly correlated, suggesting that the two factors might be combined. The test reliability analysis showed that the test anxiety subscale had lower reliability than the other factors. Finally, the item difficulty and step parameters for the modified 39-item questionnaire were presented. The ordering of the step difficulty estimates for some items implied that adjacent response categories might need to be grouped where they overlap. Based on these findings, directions for future research are discussed.
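The abstract refers to item difficulty and step parameters under the partial credit model, and to step difficulty estimates whose ordering may call for grouping categories. The following is a minimal, self-contained Python/NumPy sketch of those two ideas only; it is not the multidimensional analysis reported in the article, and the step values shown are illustrative, not estimates from the study.

```python
import numpy as np

def pcm_category_probs(theta, deltas):
    """Partial credit model category probabilities for one item.

    theta  : person location on the latent trait (in logits)
    deltas : step difficulties delta_1..delta_m (m + 1 response categories)
    Returns an array with P(X = k | theta) for k = 0..m.
    """
    # Numerator exponents: cumulative sums of (theta - delta_j),
    # with the k = 0 term fixed at 0 by convention.
    cumulative = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    expo = np.exp(cumulative - cumulative.max())  # subtract max for numerical stability
    return expo / expo.sum()

def has_disordered_steps(deltas):
    """True if adjacent step difficulties are not strictly increasing,
    the pattern that may call for collapsing adjacent response categories."""
    return bool(np.any(np.diff(np.asarray(deltas)) <= 0))

# Illustrative (not estimated) step parameters for an item with seven
# response categories, i.e. six steps.
steps_ordered = [-2.1, -1.2, -0.4, 0.3, 1.1, 2.0]
steps_disordered = [-1.5, -0.2, -0.8, 0.5, 1.3, 2.2]

print(pcm_category_probs(0.5, steps_ordered).round(3))
print(has_disordered_steps(steps_ordered))     # False
print(has_disordered_steps(steps_disordered))  # True
```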




Acknowledgments

This research was supported by the Partnership for Improvement of Learning and Teaching (PILT) project sponsored by the Education Bureau, Hong Kong. The authors thank the Centre for University and School Partnership, The Chinese University of Hong Kong, for its help in conducting this research.

Author information


Corresponding author

Correspondence to Zhonghua Zhang.

Additional information

John Chi-Kin Lee. Professor, Department of Curriculum and Instruction, Faculty of Education, The Chinese University of Hong Kong, Sha Tin, NT, Hong Kong. Email: jcklee@cuhk.edu.hk.

Current themes of research:

Curriculum development and curriculum reform. Teaching and learning. Environmental education and geographical education.

Most relevant publications in the field of Psychology of Education:

Lau, D. K. L., & Lee, J. C. K. (2008). Validation of a Chinese achievement goal orientation questionnaire. British Journal of Educational Psychology, 78, 331–353.

Lau, D. K. L., & Lee, J. (2008). Examining Hong Kong students’ achievement goals and their relations with students’ perceived classroom environment and strategy use. Educational Psychology, 28, 357–372.

Lee, J. C. K., Lee, L. M. F., & Wong, H. W. (2003). Development of a classroom environment scale in Hong Kong. Educational Research and Evaluation, 9, 317–344.

Zhonghua Zhang. Department of Educational Psychology, Faculty of Education, The Chinese University of Hong Kong, Sha Tin, NT, Hong Kong. Email: chonghuachang@cuhk.edu.hk.

Current themes of research:

Test equating. Differential item functioning. Application of advanced measurement models and psychometric methods in educational and psychological research.

Hongbiao Yin. Department of Curriculum and Instruction, Faculty of Education, The Chinese University of Hong Kong, Sha Tin, NT, Hong Kong. Email: yinhb@cuhk.edu.hk.

Current themes of research:

Teacher emotion. Learning environments. Curriculum change and implementation.

Appendix

Motivated Strategies for Learning Questionnaire—Chinese Version (MSLQ-CV)

Self-efficacy

2. Compared to other students in this class I expect to do well.

6. I am certain that I can understand the ideas taught in my classes.

8. I expect to do very well in school.

9. Compared with others in this class, I think I am a good student.

11. I am sure I can do an excellent job on the class assignments and homework.

13. I think I will receive good grades in my exams.

16. My study skills are excellent compared with others in this class.

18. Compared with other students in this class I think I know a great deal about the subjects I am studying.

19. I know that I will be able to learn the materials for the tests and exams.

Intrinsic value

1. I prefer class work that is challenging so I can learn new things.

4. It is important for me to learn what is being taught in school.

5. I like what I am learning in school.

7. I think I will be able to use what I learn in one subject in another.

10. I often do more than is required of me for homework assignments.

14. Even when I do poorly on a test or exam I try to learn from my mistakes.

15. I think that what I am learning in school is useful for me to know.

17. I think that what we are learning in school is interesting.

21. Understanding the subject is important to me.

Test anxiety

3. I am so nervous during a test that I cannot remember facts that I have learned.

12. I have an uneasy, upset feeling when I take a test or exam.

20. I worry a great deal about tests and exams.

22. When I take a test I think about how poorly I am doing.

Cognitive strategy use

23. When I study for a test, I try to put together the information from class and from the textbook.

24. When I do homework, I try to remember what the teacher said in class so I can answer the question correctly.

28. When I study I put important ideas into my own words.

29. I always try to understand what the teacher is saying even if it doesn’t make sense.

30. When I study for a test I try to remember as many facts as I can.

31. When studying, I copy my notes over to help me remember materials.

34. When I study for a test I practice saying the important facts over and over to myself.

35. Before I begin studying I think about the things I will need to do to learn.

39. When I am studying a topic, I try to make everything fit together.

41. When I read material for my classes, I say the words over and over to myself to help me remember.

42. I outline the chapters in my book to help me study.

44. When I am studying I try to connect the things I am reading about with what I already know.

Self-regulation

25. I ask myself questions to make sure I know the material I have been studying.

26. It is not difficult for me to decide what the main ideas are when I study.

27. Although work is hard, I neither give up nor study the easy part.

32. I work on practice exercises and answer end of chapter questions even when I don’t have to.

33. Even when study materials are dull and uninteresting, I keep working until I finish.

36. I use what I have learned from old homework assignments and the textbook to do new assignments.

37. The materials I use for studying are not difficult to understand for me.

38. When the teacher is talking, I pay attention to what is being said rather than think of other things.

40. When I am studying I stop once in a while and go over what I have read.

43. I work hard to get a good grade even when I do not like a class.
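The subscale-to-item mapping above can be captured as a simple lookup table. The sketch below is a convenience illustration only: the item numbers mirror the appendix listing (all 44 original items, before the five DIF items were removed), and the raw mean scoring shown is not the Rasch scoring used in the article.

```python
# Subscale-to-item mapping taken from the appendix listing above (44 items).
MSLQ_CV_SUBSCALES = {
    "self_efficacy":          [2, 6, 8, 9, 11, 13, 16, 18, 19],
    "intrinsic_value":        [1, 4, 5, 7, 10, 14, 15, 17, 21],
    "test_anxiety":           [3, 12, 20, 22],
    "cognitive_strategy_use": [23, 24, 28, 29, 30, 31, 34, 35, 39, 41, 42, 44],
    "self_regulation":        [25, 26, 27, 32, 33, 36, 37, 38, 40, 43],
}

def raw_subscale_scores(responses):
    """Plain raw-score illustration (not the Rasch scoring used in the article).

    responses: dict mapping item number -> Likert response.
    Returns the mean response per subscale, skipping unanswered items.
    """
    scores = {}
    for name, items in MSLQ_CV_SUBSCALES.items():
        answered = [responses[i] for i in items if i in responses]
        scores[name] = sum(answered) / len(answered) if answered else None
    return scores
```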


About this article

Cite this article

Lee, J. C. K., Zhang, Z., & Yin, H. Using multidimensional Rasch analysis to validate the Chinese version of the Motivated Strategies for Learning Questionnaire (MSLQ-CV). Eur J Psychol Educ 25, 141–155 (2010). https://doi.org/10.1007/s10212-009-0009-6
