Evaluating the efficiency of social learning networks: Perspectives for harnessing learning analytics to improve discussions
Introduction
Social learning networks are ubiquitous in our contemporary hyper-connected reality. They are built into our online networks as forums, FAQs, mailing lists, news feeds, social feeds, and more. The internet has been celebrated as a means of leveraging the power of the crowd and cheap processing power to bring mass education online by facilitating the access and dissemination of information. The promise is to empower human ingenuity through collective creation. Massive open online courses (MOOCs) represent the course model with the largest reach worldwide. MOOCs typically follow a format that includes video lectures, worked problems and quizzes, and occasional case studies and project assignments. However, some MOOCs follow alternate designs and successfully incorporate discussion and collaborative learning.
MOOCs are regularly offered for credit, but the materials are often made freely available for self-paced study as well. Thus, they can be considered multi-modal social learning networks. There are two primary modes of consuming a course: actively, in interaction with others, or passively, on one's own time. Indeed, “passive” online learning has received less attention, though statistics demonstrate that most users consume the internet passively rather than actively contributing to content creation, a phenomenon known as participation inequality (Nielsen, 2006). Yet “passive” does not mean that learning does not occur. In fact, passive learning is too general a concept to be useful; it masks other distinctions concerning, for instance, the goal-directedness of the behavior. One form of online learning that can be considered “passive” is the use of knowledge-sharing forums and question-and-answer sites such as Stack Overflow in the programming community. Whether or not one actively contributes by posing questions and posting answers, a user can still extract a learning benefit from having their own personal inquiries satisfied by searching and reading posts. Thus, a social learning network affords multiple learning modes, as producer and as consumer of knowledge.
Perhaps more than in the transmission of knowledge, the potential of social learning networks lies in their powerful extension of vicarious learning (Bandura, 1986) to global dimensions. Through online learning, we can learn by pooling our experience and collectively building on earlier successes through the co-construction of common knowledge (Vygotsky, 1978). Discussion forums are a constructivist educational technology (Lemay & Doleck, 2020a): the technology forms and informs learning as the medium that creates the affordances for learning. Yet the use of live, seminar-style discussion forums remains limited (Dennen & Wieland, 2007), perhaps because they require a good deal of planning to orchestrate successfully and are generally short-lived and contextually bound (Sun & Chen, 2016). As a digital artifact, however, a discussion forum can continue to inform a social learning network.
Research to improve online discussion forums can contribute to the success of MOOCs and amplify social learning network effects. Measuring social learning network efficiency would provide feedback to optimize knowledge sharing and cultivate vicarious online learning. Researchers have begun to analyze social learning networks using tools developed in engineering fields for network and signal analysis. Brinton et al. (2018) have proposed an algorithm for evaluating and optimizing social learning networks. In the present study, we implement their algorithm and use it to evaluate the efficiency of a MOOC developed using a video lecture and worked problem instructional format.
We asked “Can Brinton et al.'s (2018) algorithm for measuring social learning efficiency be used to evaluate the efficiency of a social learning network manifest in a MOOC employing video lectures and worked problem assignments?”
We shall briefly review social learning theory and discuss related studies of discussion forums in online learning before presenting Brinton et al.'s (2018) social learning network optimization algorithm.
Social learning
Social learning theories (Bandura, 1977; Wenger, 1999) emphasize the role of collective behavior, especially with regard to learning in a social context. In social learning, knowledge is constructed and shared among members of a group. Indeed, one of the key requirements for a process to be considered social learning is that it must occur through social interactions (Reed et al., 2010). The construct of social learning reflects the importance of discussion as a means of building and supporting
Research design
The study is confirmatory in nature: it assesses the validity of a theoretical model of social learning for evaluating the efficiency of social learning networks, as manifested in a MOOC developed for the edX course platform.
Context
The data are taken from an edX course offered in the summer of 2015 entitled Big Data and Education. The course had 10,432 registered users but only 519 active forum users. Learners were distributed globally and had a range of backgrounds. Many were professionals
Weighted adjacency network
Of the 10,432 registered users, only 519 were actively engaged in the forum. These numbers suggest that this MOOC's discussion forum did not serve an important role in student learning. Thus, the social learning network was not expected to be very efficient in this particular iteration of the MOOC format.
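One common way to represent forum interactions as a weighted adjacency structure is to count, for each pair of users, how often one replied after the other in the same thread. The sketch below is illustrative only; it is not the authors' implementation, and the post log and reply heuristic are hypothetical.

```python
# Illustrative sketch (hypothetical data, not the course's actual logs):
# build a weighted adjacency matrix where entry (i, j) counts how often
# user i posted in a thread immediately after user j.
import numpy as np

# Hypothetical (thread_id, user_id) post log, in chronological order.
posts = [
    ("t1", 0), ("t1", 1), ("t1", 0),
    ("t2", 2), ("t2", 1),
]
n_users = 3

adjacency = np.zeros((n_users, n_users))
last_poster = {}  # thread_id -> most recent poster in that thread
for thread, user in posts:
    if thread in last_poster and last_poster[thread] != user:
        # Directed edge from the replier to the previous poster.
        adjacency[user, last_poster[thread]] += 1
    last_poster[thread] = user

print(adjacency)
```

A sparse matrix, rather than the dense array here, would be the natural choice at the scale of 519 active users, and richer weighting schemes (e.g., all prior posters in a thread, or text similarity) are possible.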
Topic inference
Topics were inferred using latent Dirichlet allocation (LDA; Blei et al., 2003) to estimate the probability of topic associations based on patterns of word occurrences. From the LDA analysis,
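The topic-inference step can be sketched with an off-the-shelf LDA implementation. The snippet below is a minimal illustration using scikit-learn, with hypothetical forum posts standing in for the course data; the preprocessing and hyperparameters are assumptions, not the study's actual pipeline.

```python
# Illustrative LDA sketch (Blei et al., 2003) via scikit-learn; the posts
# and the choice of two topics are hypothetical, for demonstration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "How do I submit the week two quiz on the platform?",
    "Introducing myself: I teach statistics and data mining.",
    "Question about the regression assignment and feature selection",
    "Where can I find the lecture videos for module three?",
    "Hello everyone, excited to learn about educational data mining",
]

# Bag-of-words counts, which LDA models as mixtures of latent topics.
vectorizer = CountVectorizer(stop_words="english")
word_counts = vectorizer.fit_transform(posts)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(word_counts)

# Each row is a probability distribution over the inferred topics.
print(doc_topics.shape)
```

In practice, the number of topics is tuned (e.g., by held-out perplexity or coherence), and each post's dominant topic can then be read off as the argmax of its row.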
Discussion
With an average learning benefit of 1.48, the social learning network was manifestly inefficient prior to optimization. Even following optimization, the potential learning benefit of 20.73 would still leave the network inefficient. How do we interpret these results? It is important to remember that this network comprised only about one twentieth of the overall course registration, and that the topics were heavily weighted toward administrative matters like personal introductions and course information
Declaration of competing interest
None.
Acknowledgments
We wish to thank Dr. Ryan Baker and his Learning Analytics Lab for this collaboration and providing the performance data from his Big Data and Education MOOC from the University of Pennsylvania. David John Lemay was awarded a postdoctoral fellowship (56-2019-0651) from the Social Sciences and Humanities Research Council, Canada.
References (70)
- et al. (2011). A history of graph entropy measures. Information Sciences.
- et al. (2014). Communication patterns in massively open online courses. The Internet and Higher Education.
- et al. (2005). Collaborative learning in asynchronous discussion groups: What about the impact on cognitive processing? Computers in Human Behavior.
- et al. (2008). What the discourse tells us: Talk and indicators of high-level comprehension. International Journal of Educational Research.
- et al. (2014). Understanding lurkers in online communities: A literature review. Computers in Human Behavior.
- et al. (2016). Who are the top contributors in a MOOC? Relating participants' performance and contributions. Journal of Computer Assisted Learning.
- et al. (2019). Systematic review of discussion forums in massive open online courses (MOOCs). IEEE Transactions on Learning Technologies.
- (2002). Curricular alignment: A re-examination. Theory Into Practice.
- et al. Engaging with massive online courses.
- (2009). Asynchronous discussion forums: Success factors, outcomes, assessments, and limitations. Educational Technology & Society.