Confirmatory Factor Analysis on the Sub-Constructs of Teaching Presence in the Community of Inquiry

Abstract

This study re-examines the reliability and validity of three sub-constructs used to measure the level of teaching presence, one of the essential elements in the Community of Inquiry model. The measurement consists of 13 items covering the online instructor's capability in design and organization, facilitation, and direct instruction. A total of 1938 respondents from a faculty at a public university in Malaysia were selected for data collection. A cross-sectional design was applied via an online survey, and the partial least squares technique was used to analyze the data. All items loaded at 0.746 or higher, and all constructs measuring teaching presence showed high composite reliability (0.876 or higher) and average variance extracted (0.640 or higher). Thus, the multivariate statistical analysis confirmed the validity and reliability of all items.

Share and Cite:

Nasir, M. , Surat, S. , Maat, S. , Karim, A. and Daud, M. (2018) Confirmatory Factor Analysis on the Sub-Construct of Teaching Presence’s in the Community of Inquiry. Creative Education, 9, 2245-2253. doi: 10.4236/ce.2018.914165.

1. Introduction

Teaching and learning online is not as easy as it may seem. It requires additional skills and effort to sustain continuous engagement in a virtual learning environment. The presence of an online instructor is necessary to ensure that learning takes place as it would in a traditional face-to-face approach. Malaysia Education Online, part of the Malaysian Government Transformation Plan, is expanding online learning nationwide; as a result, nearly all institutions of higher education are offering or converting their courses or programs online (Malaysian Ministry of Higher Education [MMOHE], 2015). The online platform can take any format, whether blended or fully online, licensed or open source, such as Massive Open Online Courses (MOOCs).

In line with the growth of Malaysia MOOCs, there are over 612 courses with approximately 400 thousand enrolled students around the globe (Open Learning Global, 2018). The massive growth of this phenomenon alerts researchers in the field to investigate the acceptance and effectiveness of online learning, given the issues of students' feelings of isolation, boredom, and withdrawal from courses (Baharudin, Nasir, Yusoff, & Surat, 2018; Bowers & Kumar, 2015; Khalid, 2014; Khalid & Quick, 2016; Rovai & Downey, 2010), as well as dissatisfaction (Khalid & Quick, 2016; Rovai & Downey, 2010; Sorden & Munene, 2013). Models such as the Technology Acceptance Model (TAM) and the Unified Theory of Acceptance and Use of Technology (UTAUT) are commonly used to measure the level of acceptance of, and perception toward, technology.

Nonetheless, the Community of Inquiry (CoI) model is rarely applied in Malaysia to investigate interaction and the ability of instructors to keep their e-learning courses alive and active (Baharudin et al., 2018; Khalid & Quick, 2016). Thus, the aim of this paper is to re-examine the reliability and validity of three sub-constructs in measuring an online instructor's level of teaching presence. As scholars in the distance education field know, teaching presence is one of the essential elements in the CoI framework that needs to be taken into account when offering an online program. However, hypothesis testing of any construct was beyond the scope of this study.

2. Related Research

2.1. Community of Inquiry

The Community of Inquiry (CoI) model comprises three essential, overlapping elements of a constructive virtual learning experience: teaching presence, social presence, and cognitive presence (Garrison, Anderson, & Archer, 1999; Garrison, Cleveland-Innes, & Fung, 2010). Principally, teaching presence concerns the online instructor, while social presence concerns learners' peer engagement in the online environment, including affective expression, open communication, and group cohesion (Garrison et al., 1999; Garrison, Cleveland-Innes, & Fung, 2010). The third element, cognitive presence, refers to any content posted in the virtual classroom; it triggers the learning event, exploration, integration, and resolution in learning a certain topic or issue (Garrison et al., 1999; Garrison, Cleveland-Innes, & Fung, 2010). In brief, the CoI instrument has been verified, establishing reliable and valid measurement of these three presences (Salloum, 2011; Yu & Richardson, 2015; Zimmerman & Nimon, 2017). Teaching presence is therefore the focus throughout the following discussion.

2.2. Teaching Presence

Teaching presence is the ability of, and effort spent by, the online instructor to design, organize, facilitate, and direct teaching virtually (Bowers & Kumar, 2015; Garrison et al., 1999; Garrison, Cleveland-Innes, & Fung, 2010). It is an interaction between instructor and student that involves providing guidance and motivation toward achieving worthwhile learning outcomes, as highlighted by Garrison et al. (1999) and Moore (1989). Issues such as a lack of immediate feedback and unsupportive or poor participation by the online instructor, as noted by Baharudin et al. (2018), Khalid (2014), and Khalid & Quick (2016), are crucial and need to be taken into account.

In their case study on teaching presence involving six instructors, Watson, Watson, Janakiraman, & Richardson (2017) reviewed the course syllabus, instructional activities, materials, announcements, and discussion posts, and found that teaching presence was repeatedly recorded. A review of empirical literature by Croxton (2014), through the lens of Bandura's social cognitive theory, Anderson's interaction equivalency theorem, and Tinto's social integration theory, likewise pointed to the importance of teaching presence and online pedagogical skills, in line with Spears (2012), who found that the instructor's presence is vital in sustaining students' engagement in learning.

Previous studies reached the same agreement on the concept of teaching presence (Battalio, 2007; Kanuka, Collett, & Caswell, 2002; Moore, 1989). Consistently, in the CoI framework, teaching presence is viewed and measured based on the three sub-constructs mentioned earlier, which are summarized in Table 1 below.

3. Methodology

A cross-sectional design was employed via an online survey, and the partial least squares technique was used to analyze the data. The data collected were purely quantitative, coming from students in a faculty at a public university in Malaysia that is rapidly implementing blended learning via its own platform. Apart from face-to-face sessions, all courses are strongly encouraged by the university to be conducted in blended form.

3.1. Respondents

A total of 1938 hybrid students, both undergraduate and postgraduate, who enrolled in 34 blended courses in a particular semester were selected for this study. A purposive sampling method was used, with respondents selected based on the blended learning report of the usage of the university learning portal. Courses were selected if they achieved the minimum blended requirement stated in the Dasar e-Pembelajaran Negara (DePAN), that is, if they had uploaded or posted at least: 1) seven types of course materials in the proforma (syllabus) and/or the course synopsis; 2) three activities or posts; and 3) two assignments.

Table 1. Sub-constructs and their meanings in teaching presence.

Note. Adapted from “Researching the Community of Inquiry Framework: Review, Issues, and Future Directions,” by D. R. Garrison & J. B. Arbaugh, 2007, The Internet and Higher Education, 10(3), p. 159.
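As a concrete illustration of the DePAN course-selection rule described above, the minimal sketch below filters a portal usage report against the stated minimums. The report structure and field names are hypothetical assumptions for illustration only, not the university's actual export format.

```python
# Minimal sketch of the DePAN minimum blended requirement used for course
# selection. Field names and the report structure are hypothetical.
MIN_MATERIALS, MIN_ACTIVITIES, MIN_ASSIGNMENTS = 7, 3, 2

def meets_blended_minimum(course: dict) -> bool:
    """True when a course's portal usage meets the stated minimum uploads/posts."""
    return (course.get("materials", 0) >= MIN_MATERIALS
            and course.get("activities", 0) >= MIN_ACTIVITIES
            and course.get("assignments", 0) >= MIN_ASSIGNMENTS)

report = [
    {"code": "COURSE-A", "materials": 9, "activities": 5, "assignments": 2},
    {"code": "COURSE-B", "materials": 4, "activities": 1, "assignments": 1},
]
selected = [c["code"] for c in report if meets_blended_minimum(c)]  # ['COURSE-A']
```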

3.2. Instrument

The structured CoI cross-sectional online survey questionnaire was distributed to respondents via Google Forms. The CoI instrument was adopted from Garrison et al. (1999) and Garrison, Anderson, & Archer (2010); its validity and reliability were established more than a decade ago, and it has been translated and tested in various countries (Garrison, Anderson, & Archer, 2010; Swan, 2001; Yu & Richardson, 2015; Zimmerman & Nimon, 2017). As the study was only interested in the teaching element, only the 14 teaching presence items were measured, using a 6-point Likert scale (almost never true = 1 and almost always true = 6). Students were invited to participate voluntarily and could access the survey link via the email addresses provided in the faculty's enrollment records.
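For clarity, the sketch below shows one way the 1-6 Likert responses could be aggregated into sub-construct scores prior to modelling. The item codes (DO1..DI4) and their grouping are hypothetical placeholders, not the actual CoI item wording or numbering.

```python
# Hypothetical item codes grouped by teaching-presence sub-construct;
# the actual CoI items follow Garrison et al.'s published instrument.
import pandas as pd

SUBCONSTRUCTS = {
    "design_organization": ["DO1", "DO2", "DO3", "DO4"],
    "facilitation":        ["FA1", "FA2", "FA3", "FA4", "FA5", "FA6"],
    "direct_instruction":  ["DI1", "DI2", "DI3", "DI4"],
}

def subconstruct_means(responses: pd.DataFrame) -> pd.DataFrame:
    """Average each respondent's 1-6 ratings within each sub-construct."""
    return pd.DataFrame({name: responses[items].mean(axis=1)
                         for name, items in SUBCONSTRUCTS.items()})
```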

3.3. Measurement Model

The measurement model conceptualized the sub-constructs of design and organization, facilitation, and direct instruction as first-order reflective constructs, with teaching presence as a formative second-order construct, as illustrated in Figure 1. Therefore, at this stage of the study, no hypotheses were tested; the analysis focuses solely on the convergent and discriminant validity of the measurement scales.
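In generic notation (a sketch only; the symbols are illustrative, not the study's estimates), a reflective first-order specification expresses each observed item as a function of its latent sub-construct, while the formative second-order construct is a weighted composite of the three first-order constructs:

$$x_{ij} = \lambda_{ij}\,\xi_j + \varepsilon_{ij}, \qquad j \in \{\mathrm{DO},\ \mathrm{FA},\ \mathrm{DI}\},$$
$$\mathrm{TP} = w_{\mathrm{DO}}\,\xi_{\mathrm{DO}} + w_{\mathrm{FA}}\,\xi_{\mathrm{FA}} + w_{\mathrm{DI}}\,\xi_{\mathrm{DI}} + \zeta,$$

where $\lambda_{ij}$ are the reflective outer loadings examined below, $w_j$ are formative weights, and $\varepsilon_{ij}$, $\zeta$ are error terms, for the design and organization (DO), facilitation (FA), and direct instruction (DI) sub-constructs.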

Figure 1. The measurement model.

4. Results

Smart PLS version 3.2.7 was used as the data analysis software in this study. It is a variance-based Structural Equation Modelling (SEM) technique; a multivariate approach that can reveal results from angles that univariate calculations cannot (Hair, Hult, Ringle, & Sarstedt, 2017). Of the 1938 distributed surveys, 686 responses were returned. After cleaning out non-responses, incomplete cases, and severely skewed responses, 218 usable and complete responses remained for calculation, representing an 11.25% response rate. This rate is still within the acceptable range for an online survey (Nulty, 2008) as compared with a conventional survey. Mardia's multivariate skewness and kurtosis were calculated and supported the decision to use Smart PLS (Hair et al., 2017).
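As a transparency check, the sketch below reproduces the two computations reported in this paragraph: the usable response rate (218 of 1938, which gives the stated 11.25%) and Mardia's multivariate skewness and kurtosis, whose departure from multivariate normality is the usual justification for a variance-based (PLS) approach. The data matrix is assumed to be an n x p array of item responses; this is a generic sketch, not the study's SmartPLS output.

```python
import numpy as np

# Usable response rate as reported: 218 complete responses of 1938 distributed.
usable, distributed = 218, 1938
response_rate = usable / distributed  # ~0.1125, i.e. the reported 11.25%

def mardia_coefficients(X):
    """Mardia's multivariate skewness (b1,p) and kurtosis (b2,p) for an n x p matrix."""
    X = np.asarray(X, dtype=float)
    n, _ = X.shape
    Xc = X - X.mean(axis=0)            # center each item
    S = (Xc.T @ Xc) / n                # (biased) sample covariance matrix
    D = Xc @ np.linalg.inv(S) @ Xc.T   # Mahalanobis cross-products between cases
    b1p = (D ** 3).sum() / n ** 2      # multivariate skewness
    b2p = (np.diag(D) ** 2).sum() / n  # multivariate kurtosis
    return b1p, b2p
```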

Validity and Reliability

The loadings, Average Variance Extracted (AVE), and Composite Reliability (CR) were all found to be higher than the minimum requirements set by Hair et al. (2017). This shows that the convergent validity of the measurement scales is not an issue in this study. Additionally, it provides evidence that all items measure the same concept by empirical standards (Hair et al., 2017), as shown in Table 2 and visualized in Figure 2.
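The two convergent-validity statistics reported here follow the standard formulas CR = (Σλ)² / [(Σλ)² + Σ(1 − λ²)] and AVE = mean(λ²), computed from standardized outer loadings. The sketch below uses illustrative loadings only, not the values in Table 2.

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / [(sum of loadings)^2 + sum of error variances]."""
    lam = np.asarray(loadings, dtype=float)
    squared_sum = lam.sum() ** 2
    return squared_sum / (squared_sum + (1 - lam ** 2).sum())

def average_variance_extracted(loadings):
    """AVE = mean of squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

# Illustrative loadings for one sub-construct (not the study's estimates):
loadings = [0.80, 0.78, 0.82, 0.75]
print(round(composite_reliability(loadings), 3))       # 0.867
print(round(average_variance_extracted(loadings), 3))  # 0.621
```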

In terms of discriminant validity, the diagonal elements in Table 3 represent the square root of the AVE, while the off-diagonal elements represent the correlations between constructs (Fornell & Larcker, 1981). All values fulfil the criterion; hence, discriminant validity has been established in this study.
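A minimal sketch of the Fornell-Larcker check just described: the square root of each construct's AVE must exceed that construct's correlations with every other construct. The AVE and correlation values below are illustrative, not those of Table 3.

```python
import numpy as np

def fornell_larcker_ok(ave, corr):
    """True when sqrt(AVE) of each construct exceeds its correlations with the others."""
    sqrt_ave = np.sqrt(np.asarray(ave, dtype=float))
    corr = np.asarray(corr, dtype=float)
    for i in range(len(sqrt_ave)):
        others = np.delete(np.abs(corr[i]), i)   # drop the diagonal entry
        if not (sqrt_ave[i] > others).all():
            return False
    return True

# Illustrative values for three sub-constructs (not the study's Table 3):
ave = [0.66, 0.64, 0.70]
corr = [[1.00, 0.62, 0.58],
        [0.62, 1.00, 0.65],
        [0.58, 0.65, 1.00]]
print(fornell_larcker_ok(ave, corr))  # True: sqrt(AVE) exceeds all inter-construct correlations
```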

Table 2. Loadings, average variance extracted, and composite reliability of the measurement scales (N = 218).

Figure 2. The algorithm measurement model.

Table 3. Discriminant validity.

5. Discussion

As noted in the analysis section, the loadings, average variance extracted, and composite reliability used to assess the convergent validity of the instrument were acceptable, and the instrument was found to be valid and reliable (Hair et al., 2017). This evidence shows that all items measure the same concept. Additionally, discriminant validity, the degree to which items differentiate among sub-constructs, was established. The statistical values in this study confirm that teaching presence is an essential and significant element of the CoI framework: the ability of the online instructor to design, organize, direct instruction, and facilitate learning, such that consistency of engagement and participation can be measured (Garrison, Anderson, & Archer, 2010; Zimmerman & Nimon, 2017).

Previous literature from several countries (Cho & Tobias, 2016; Feng, Xie, & Liu, 2017; Setiani & MacKinnon, 2015; Shin & Kang, 2015; Yu & Richardson, 2015; Zimmerman & Nimon, 2017) has verified the applicability of the model. In line with that work, this study showed that multivariate analysis also supports the validity and reliability of the items, which could enrich the literature on the CoI. Teaching presence would be more insightful if it were hypothesized across other settings, environments, and cultures, examining how well online instructors manage their presence online, specifically in the Malaysian learning environment.

However, this study was not meant to test any hypotheses; it was a confirmatory factor analysis to re-investigate the reliability and validity of the items that measure teaching presence. The findings would be more meaningful if the model were tested against other dependent-variable constructs (e.g., satisfaction, motivation, and readiness), treating teaching presence as a formative second-order construct, specifically in evaluating the effectiveness of online learning in Malaysia.

Acknowledgements

This research was conducted and supported by a national grand challenge grant (Cabaran Perdana) of Universiti Kebangsaan Malaysia (UKM) [code: DCP-2017-020/3, 2018].

Conflicts of Interest

The authors declare no conflicts of interest.

References

[1] Baharudin, H., Nasir, M. K. M., Yusoff, N. M. R. N., & Surat, S. (2018). Assessing Students’ Course Satisfaction with Online Arabic Language Hybrid Course. Advanced Science Letters, 24, 350-352.
https://doi.org/10.1166/asl.2018.12005
[2] Battalio, J. (2007). Interaction Online: A Reevaluation. Quarterly Review of Distance Education, 8, 341-352.
[3] Bowers, J., & Kumar, P. (2015). Students’ Perceptions of Teaching and Social Presence: A Comparative Analysis of Face-to-Face and Online Learning Environments. International Journal of Web-Based Learning and Teaching Technologies, 10, 27-44.
https://doi.org/10.4018/ijwltt.2015010103
[4] Cho, M.-H., & Tobias, S. (2016). Should Instructors Require Discussion in Online Courses? Effects of Online Discussion on Community of Inquiry, Learner Time, Satisfaction, and Achievement. The International Review of Research in Open and Distributed Learning, 17, 123-140.
[5] Croxton, R. A. (2014). The Role of Interactivity in Student Satisfaction and Persistence in Online Learning. Journal of Online Learning and Teaching, 10, 314.
[6] Feng, X., Xie, J., & Liu, Y. (2017). Using the Community of Inquiry Framework to Scaffold Online Tutoring. The International Review of Research in Open and Distributed Learning, 18, 165-187.
https://doi.org/10.19173/irrodl.v18i2.2362
[7] Fornell, C., & Larcker, D. F. (1981). Structural Equation Models with Unobservable Variables and Measurement Error: Algebra and Statistics. Journal of Marketing Research, 18, 382-388.
https://doi.org/10.2307/3150980
[8] Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical Inquiry in a Text-Based Environment: Computer Conferencing in Higher Education. The Internet and Higher Education, 2, 87-105.
https://doi.org/10.1016/S1096-7516(00)00016-6
[9] Garrison, D. R., Anderson, T., & Archer, W. (2010). The First Decade of the Community of Inquiry Framework: A Retrospective. The Internet and Higher Education, 13, 5-9.
https://doi.org/10.1016/j.iheduc.2009.10.003
[10] Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring Causal Relationships among Teaching, Cognitive and Social Presence: Student Perceptions of the Community of Inquiry Framework. The Internet and Higher Education, 13, 31-36.
https://doi.org/10.1016/j.iheduc.2009.10.002
[11] Hair, J. F., Hult, G. T. M., Ringle, C., & Sarstedt, M. (2017). A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM) (2nd ed.). Thousand Oakes, CA: Sage.
[12] Kanuka, H., Collett, D., & Caswell, C. (2002). University Instructor Perceptions of the Use of Asynchronous Text-Based Discussion in Distance Courses. The American Journal of Distance Education, 16, 151-167.
https://doi.org/10.1207/S15389286AJDE1603_3
[13] Khalid, N. M. (2014). Factors Affecting Course Satisfaction of Online Malaysian University Students. Doctoral Dissertation, Fort Collins, CO: Colorado State University.
[14] Khalid, N. M., & Quick, D. (2016). Teaching Presence Influencing Online Students’ Course Satisfaction at an Institution of Higher Education. International Education Studies, 9, 62.
https://doi.org/10.5539/ies.v9n3p62
[15] Malaysian Ministry of Higher Education [MMOHE]. (2015). Pelan Tindakan Pengajian Tinggi Negara Fasa 2 (2015-2025).
http://www.mohe.gov.my/en/download/awam/penerbitan/pppm-2015-2025-pt/5-malaysia-education-blueprint-2015-2025-higher-education/file
[16] Moore, M. (1989). Three Types of Interaction. The American Journal of Distance Education, 3, 1-6.
https://doi.org/10.1080/08923648909526659
[17] Nulty, D. D. (2008). The Adequacy of Response Rates to Online and Paper Surveys: What Can Be Done? Assessment & Evaluation in Higher Education, 33, 301-314.
https://doi.org/10.1080/02602930701293231
[18] Open Learning Global (2018). Experience Online Learning. The Social Way.
https://www.openlearning.com/malaysiamoocs
[19] Rovai, A. P., & Downey, J. R. (2010). Why Some Distance Education Programs Fail while Others Succeed in a Global Environment. The Internet and Higher Education, 13, 141-147.
https://doi.org/10.1016/j.iheduc.2009.07.001
[20] Salloum, S. R. (2011). Student Perceptions of Computer-Mediated Communication Tools in Online Learning: Helpfulness and Effects on Teaching, Social, and Cognitive Presence. Charlotte: The University of North Carolina.
[21] Setiani, M. Y., & MacKinnon, A. M. (2015). A Community of Inquiry-Based Framework for Civic Education at Universitas Terbuka, Indonesia. Distance Education, 36, 351-363.
https://doi.org/10.1080/01587919.2015.1081740
[22] Shin, W. S., & Kang, M. (2015). The Use of a Mobile Learning Management System at an Online University and Its Effect on Learning Satisfaction and Achievement. The International Review of Research in Open and Distributed Learning, 16, 110-129.
[23] Sorden, S. D., & Munene, I. I. (2013). Constructs Related to Community College Student Satisfaction in Blended Learning. Journal of Information Technology Education: Research, 12, 251-270.
https://doi.org/10.28945/1890
[24] Spears, L. R. (2012). Social Presence, Social Interaction, Collaborative Learning, and Satisfaction in Online and Face-to-Face Courses. Doctoral Dissertation, Ames, IA: Iowa State University.
https://utrgv-ir.tdl.org/utrgv-ir/bitstream/handle/2152.6/621/rodriguez_martin_dissertation_5-19-15.pdf?sequence=1&isAllowed=y
[25] Swan, K. (2001). Virtual Interaction: Design Factors Affecting Student Satisfaction and Perceived Learning in Asynchronous Online Courses. Distance Education, 22, 306-331.
https://doi.org/10.1080/0158791010220208
[26] Watson, S. L., Watson, W. R., Janakiraman, S., & Richardson, J. (2017). A Team of Instructors’ Use of Social Presence, Teaching Presence, and Attitudinal Dissonance Strategies: An Animal Behaviour and Welfare MOOC. The International Review of Research in Open and Distributed Learning, 18, 69-91.
https://doi.org/10.19173/irrodl.v18i2.2663
[27] Yu, T., & Richardson, J. C. (2015). Examining Reliability and Validity of a Korean Version of the Community of Inquiry Instrument Using Exploratory and Confirmatory Factor Analysis. The Internet and Higher Education, 25, 45-52.
https://doi.org/10.1016/j.iheduc.2014.12.004
[28] Zimmerman, T. D., & Nimon, K. (2017). The Online Student Connectedness Survey: Evidence of Initial Construct Validity. The International Review of Research in Open and Distributed Learning, 18, 25-46.
https://doi.org/10.19173/irrodl.v18i3.2484
