
Computers & Education

Volume 119, April 2018, Pages 44-58

Exploring communities of inquiry in Massive Open Online Courses

https://doi.org/10.1016/j.compedu.2017.11.010

Highlights

  • We evaluated the Community of Inquiry (CoI) instrument within the MOOC context.

  • Results indicated a good fit of the original three-factor structure of the CoI instrument.

  • Results also indicated that a six-factor model provided the optimal fit.

  • Additional factors captured the unique characteristics of the MOOC setting.

  • We provide an updated CoI model which emphasizes the specifics of the MOOC context.

Abstract

This study presents an evaluation of the Community of Inquiry (CoI) survey instrument developed by Arbaugh et al. (2008) within the context of Massive Open Online Courses (MOOCs). The study reports the results of a reliability analysis and exploratory factor analysis of the CoI survey instrument using data from 1487 students in five MOOCs. The findings confirmed the reliability and validity of the CoI survey instrument for the assessment of the key dimensions of the CoI model: teaching presence, social presence, and cognitive presence. Although the CoI survey instrument captured the same latent constructs within the MOOC context as in Garrison's original three-factor model (Garrison et al., 1999), the analyses suggested a six-factor model with three additional factors as a better fit to the data. These additional factors were 1) course organization and design (a sub-component of teaching presence), 2) group affectivity (a sub-component of social presence), and 3) resolution phase of inquiry learning (a sub-component of cognitive presence). The emergence of these additional factors revealed that the differences between the dynamics of traditional online courses and MOOCs affect students' perceptions of the three CoI presences. Based on the results of our analysis, we provide an updated CoI model that captures the distinctive characteristics of the MOOC setting. The results of the study and their implications are further discussed.

Introduction

The growing interest in MOOCs and online education more broadly has been fueled by various social, economic, and political factors that have converged to emphasize the growing societal need for accessible and sustainable higher education. These factors include concerns surrounding student debt (Matthews, 2013), increasing requirements for lifelong learning to sustain future employment opportunities (Fini, 2009), and an overall need to provide more accessible and democratized models of higher education (Siemens, 2013). While MOOCs have brought online learning to the center of public interest (Gašević et al., 2014, Kovanović et al., 2015b), their development has not been without challenges.

A particularly significant challenge associated with MOOC development relates to the present state of MOOC pedagogical designs and their disconnect from the current state of research in online and distance education. MOOCs were originally developed by researchers in online education as an experimentation platform for novel online pedagogical approaches based on connectivist learning theory (Siemens, 2005), which emphasized distributed course organization and self-directed student learning. As indicated by Rodriguez (2012), this form of MOOC is now commonly known as the connectivist MOOC or cMOOC. The prevalent group of current MOOCs, also known as xMOOCs (Rodriguez, 2012), has tended to adopt a learning design structured around pre-recorded video lectures, automated assignments, and quizzes, with limited direct teaching interaction undertaken by the instructor. This model of design and teaching is selected for its capacity to scale content and learning activities to a large number of students while diminishing the constraints associated with the need for instructors to engage with individual learners (Ng & Widom, 2014).

The present models of MOOC pedagogical design are essentially focused on the transmission of content. This approach represents a radical departure from contemporary distance education practice, which is grounded in social constructivist models of learning (Anderson & Dron, 2010). These models assume that students – rather than assimilating predefined knowledge – actively construct their knowledge through a series of interactions with learning content, instructors, and other students. This knowledge construction process also depends on their existing knowledge and experiences, meta-cognitive processes, and the particular learning context. In following the behaviorist notion of learning, the dominant MOOC design arguably represents a step back in the quality and richness of online instruction (Bali, 2014, Stacey, 2013). A plausible rationale for this disconnect lies in the multidisciplinary nature of MOOC and online learning research and the strong fragmentation of the MOOC research community into researchers from the field of education and researchers from the field of computer science (Gašević et al., 2014). With researchers from computer science and engineering fields often following a theory-agnostic philosophy of data analysis (Anderson, 2008), the departure from contemporary learning theories is not surprising. The disconnect with the previous line of research in online and distance education may also explain the enthusiasm of the early xMOOC proponents. Although dubbed a “revolution” (Friedman, 2012) and a “tsunami” (Hennessy, 2012) in the field of education, xMOOCs represent a logical “evolutionary” step in the development of online and distance learning (Bali, 2014, Daniel, 2014).

This paper presents the results of a study examining the use of contemporary social constructivist models of online and distance education within the MOOC context. The focus of the analysis is on the Community of Inquiry (CoI) model (Garrison et al., 1999), one of the most widely adopted models of distance education (Garrison & Arbaugh, 2007). The CoI model outlines critical dimensions which shape students’ online learning experience and also provides a survey instrument used for their assessment (Arbaugh et al., 2008). This paper examines whether the CoI survey instrument can be used to evaluate the interactions in MOOC courses. Given the many pedagogical differences between MOOCs and “traditional,” small-scale online courses, a re-validation of the existing CoI survey instrument and its factor structure was conducted using data from 1487 students in five MOOCs. By examining the CoI model of online learning within the MOOC context, we aim to bridge the gap between research in online learning and current MOOC pedagogical practices and to enable the model's use for assessing the quality of the MOOC learning experience. The results of our analyses and the broader theoretical and practical implications are further discussed.

Section snippets

Overview of the community of inquiry model

The Community of Inquiry (CoI) framework (Garrison, Anderson, & Archer, 1999) is a widely adopted pedagogical model that outlines the critical dimensions shaping students’ online learning experience. Rooted in Dewey’s (1933) constructivist notions of learning and the work of Lipman (1991), the CoI model (Fig. 1) focuses on the development of higher-order thinking through inquiry-based learning in a learning community. In this context, a learning community is defined as “a group of

Research questions

While there has been substantial work on the validation of the CoI instrument, the primary context has been traditional, formal education, with data coming from small-scale, for-credit online courses. However, to our knowledge, the use of the CoI model and the validation of its survey instrument have not been examined within the MOOC context. Given the rapidly emerging MOOC research, as well as the broad adoption of the CoI model within traditional online settings, the goal of the present study is to

Study data

The data for this study was collected from five different MOOCs offered by the Delft University of Technology in the Netherlands on the edX platform during the Fall 2014 term (Table 1). The courses included a range of learning activities such as recorded video materials, reading materials, short multiple-choice quizzes, homework assignments, and online forum discussions (see Hennis, Topolovec, Poquet, & de Vries, 2016). Students who successfully completed a course were issued a course completion

RQ1: reliability analysis results

To validate the CoI survey instrument in the MOOC context, we examined its reliability using Cronbach's alpha (Cronbach, 1951). All three subscales obtained overall reliability scores of 0.89 or above (Table 2), which indicates a reliable measurement instrument (Kline, 1999). We can also see that no item on any of the three subscales had an alpha-if-item-deleted value higher than the overall subscale alpha, indicating that none of the items negatively affects instrument
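
For readers who wish to reproduce this kind of reliability check, the following minimal Python sketch (not taken from the original study) computes Cronbach's alpha from a respondents-by-items matrix of Likert scores; the small responses array below is purely hypothetical illustrative data.

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]                          # number of items in the subscale
        item_vars = items.var(axis=0, ddof=1)       # variance of each individual item
        total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical example: 1-5 Likert answers of four respondents to three items of one subscale
    responses = [[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]]
    print(round(cronbach_alpha(responses), 2))

Applied separately to the teaching, social, and cognitive presence subscales, this computation would yield per-subscale reliabilities of the kind reported in Table 2.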

RQ1: reliability of the CoI instrument in the MOOC context

The results of the reliability analysis confirmed that the CoI survey instrument is internally consistent within the MOOC context. The obtained Cronbach's α values for the three subscales were just slightly lower than those reported in existing research (Swan et al., 2008) and still well above the 0.8 level often used in the literature (Kline, 1999). Similar to previous studies (Díaz et al., 2010, Garrison et al., 2010b, Shea and Bidjerano, 2009, Swan et al., 2008

Conclusions

In this paper, we evaluated the use of the CoI survey instrument within the MOOC context. Through an exploratory factor analysis of data (N = 1487) from five MOOCs, we examined whether the differences between traditional small-scale online courses, for which the CoI survey was initially designed, and MOOCs affect the reliability and validity of the CoI survey instrument. First of all, our results indicate that the Community of Inquiry survey instrument is a reliable and valid tool for measuring
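
To make the factor-analytic procedure concrete, the sketch below outlines one way such an exploratory factor analysis could be carried out in Python. It is an illustrative approximation rather than the authors' exact pipeline: the coi_items.csv file (one column per CoI survey item, one row per respondent) is hypothetical, and the third-party factor_analyzer package is used only as an example tool.

    import pandas as pd
    from factor_analyzer import FactorAnalyzer  # third-party package: pip install factor-analyzer

    # Hypothetical input: one row per respondent, one column per CoI survey item (1-5 Likert scores)
    items = pd.read_csv("coi_items.csv")

    # Extract six factors with an oblique (oblimin) rotation
    fa = FactorAnalyzer(n_factors=6, rotation="oblimin")
    fa.fit(items)

    # Item loadings show which survey items group into which factor
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    print(loadings.round(2))

    # Eigenvalues can be inspected with a scree test (Cattell, 1966) to check the number of retained factors
    eigenvalues, _ = fa.get_eigenvalues()
    print(eigenvalues.round(2))

An oblique rotation is shown because the three CoI presences are theoretically expected to correlate; whether six factors should be retained would, in practice, be judged from the scree plot and the interpretability of the resulting loadings.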

References (71)

  • P. Shea et al.

    Community of inquiry as a theoretical framework to foster “epistemic engagement” and “cognitive presence” in online education

    Computers & Education

    (2009)
  • P. Shea et al.

    A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses

    The Internet and Higher Education

    (2006)
  • K. Swan et al.

    The community of inquiry framework ten years later: Introduction to the special issue

    The Internet and Higher Education

    (2010)
  • Z. Akyol et al.

    The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence

    Journal of Asynchronous Learning Networks

    (2008)
  • Z. Akyol et al.

    The impact of course duration on the development of a community of inquiry

    Interactive Learning Environments

    (2011)
  • T. Anderson et al.

    Three generations of distance education pedagogy

    The International Review of Research in Open and Distance Learning

    (2010)
  • T. Anderson et al.

    Assessing teaching presence in a computer conferencing context

    Journal of Asynchronous Learning Networks

    (2001)
  • J.B. Arbaugh

    An empirical verification of the community of inquiry framework

    Journal of Asynchronous Learning Networks

    (2007)
  • M. Bali

    MOOC Pedagogy: Gleaning good practice from existing MOOCs

    Journal of Online Learning and Teaching

    (2014)
  • R.B. Cattell

    The scree test for the number of factors

    Multivariate Behavioral Research

    (1966)
  • P. Celentin

    Online training: Analysis of interaction and knowledge building patterns among foreign language teachers

    Journal of Distance Education

    (2007)
  • C. Anderson

    The end of theory: The data deluge makes the scientific method obsolete

    (2008)
  • A.L. Comrey

    A first course in factor analysis

    (1973)
  • L.J. Cronbach

    Coefficient alpha and the internal structure of tests

    Psychometrika

    (1951)
  • C.A.V. Damm

    Applying a community of inquiry instrument to measure student engagement in large online courses

    Current Issues in Emerging eLearning

    (2016)
  • J. Daniel

    Foreword to the special section on massive open online courses MOOCs – evolution or revolution?

    Journal of Online Learning and Teaching

    (2014)
  • J. Dewey

    How we think: A restatement of the relation of reflective thinking to the educative process. D. C. Heath

    (1933)
  • B. Everitt

    Cambridge dictionary of statistics

    (2002)
  • J.L. Fava et al.

    The effects of underextraction in factor and component analyses

    Educational and Psychological Measurement

    (1996)
  • A.P. Field et al.

    Discovering statistics using R

    (2012)
  • A. Fini

    The technological dimension of a massive open online course: The case of the CCK08 course tools

    The International Review of Research in Open and Distance Learning

    (2009)
  • T.L. Friedman

    Come the revolution

    (2012)
  • D.R. Garrison

    E-Learning in the 21st century: A framework for research and practice

    (2011)
  • D.R. Garrison et al.

    Critical thinking, cognitive presence, and computer conferencing in distance education

    American Journal of Distance Education

    (2001)
  • D. Gašević et al.

    Where is research on massive open online courses headed? A data analysis of the MOOC Research Initiative

    The International Review of Research in Open and Distributed Learning

    (2014)