The need for student-centered learning analytics (SCLA)

Learning analytics are emerging technologies that use data science methods to generate actionable insights about learning (Siemens 2012; Leth Jørnø and Gynther 2018). In contrast to traditional educational research, which first generates generalizable knowledge and then uses it to improve the experience of future learners, learning analytics generate new knowledge while simultaneously using it to inform current learning practices and learners (Clow 2012). This offers great potential for impact in the context of the COVID-19 pandemic, in which traditional practices have been upended and new understandings must be developed quickly while also attending to the pressing needs of students and teachers living through this digital shift. In particular, learning analytics tools can help fill a critical information gap as the sudden move to online learning eliminates many of the classroom-based cues that teachers have typically relied on. Learning analytics also offer powerful support for students inexperienced with online learning, who face challenges in self-regulation, including planning their learning pace, monitoring their comprehension, and making judgments about their learning process (Song and Hill 2007).

As interest in leveraging the large amounts of data students generate online intensifies, it is important that learning analytics be adopted in effective and ethical ways. Ifenthaler and Schumacher (2016) provide valuable guidance here in their investigation of 330 university students’ perceptions of the usefulness and privacy of learning analytics. They report two seemingly contradictory results: students were most interested in analytics that provided a variety of detailed support relevant at different points in time (e.g. recommendations of materials to read, predictions of their mastery of course material), yet they were most willing to share data with systems that provided only general information about their activity and performance (leaving reflection on and response to the data up to them). This conflict is resolved by a third finding: acceptance of learning analytics was related to student control over data. Together these results align with the findings of Prinsloo and Slade (2014) that students are content neither with the passive role they are typically given as recipients of information about themselves nor with the lack of transparency about the processes used to generate it. As Ifenthaler and Schumacher make clear, for learning analytics to be accepted and effective in any widespread way, students must be centered as agents of their own learning.

One limitation of Ifenthaler and Schumacher (2016), however, is that the study stops short of addressing the implications of a student-centered orientation for the creation and adoption of learning analytics tools. Drawing on a handful of recent explorations in the learning analytics field and related work in other design-based arenas, we identify three shifts needed to implement a vision of SCLA: (1) involve students in the creation of the analytic tools meant to serve them; (2) develop analytics that are contextualized, explainable and configurable; and (3) empower students’ use of analytic tools as part of the larger process of learning. Each of these shifts is described below, together with a characterization of its current level of maturity and adoption.

Three shifts that can support SCLA and progress to date

The first shift needed brings students into the conceptualization and design of the analytic tools created to serve them. Efforts in this direction are already underway as part of a field-wide move towards human-centered learning analytics (Buckingham Shum et al. 2019). Adopting and adapting participatory design methodologies from the field of human-computer interaction (Schuler and Namioka 1993), researchers are starting to include students not only in analyzing information needs, but also in ideating, revising and testing learning analytics concepts and prototypes (e.g. Prieto-Alvarez et al. 2020; Sarmiento et al. 2020). This involvement is necessary to create tools that students will find useful, usable and, most importantly, acceptable with respect to privacy. While early initiatives are promising, widespread adoption of participatory methods that involve students in learning analytics design has yet to occur.

The second shift needed moves away from analytic tools that “black box” their inner workings such that learners do not understand what they are shown, how that information was generated, or why it is useful (Kitto et al. 2017). Such understanding is necessary to generate student trust in and ownership of analytics, a challenge that has remained mostly unexplored in this field (Bodily and Verbert 2017) but has been the center of intense discussion and research in other areas of AI (Adadi and Berrada 2018; Arrieta et al. 2020). We refer to this new paradigm as Glassbox Analytics, characterized by tools that provide information that is contextualized, explainable and configurable. The drive towards contextualized analytics stems from a recognition that students’ needs vary over the course of their learning process, and different information becomes relevant over time to meet these changing needs. Creating contextualized analytics requires finding out what questions students have about their learning during different phases of the experience and identifying information that can help answer those questions at each point in time. Explainable analytics make transparent how different metrics, visualizations and predictions came to be. This requires identifying which key choices in data selection, pre-processing, modeling and reporting students need to understand, and how these choices can be communicated in a way that offers sufficient detail without being overwhelming. Finally, configurable analytics build on this understanding to offer students input into analytic decisions. Rather than simply receiving pre-determined information from a tool, students can work with it to make decisions about data selection, processing, modeling and reporting that are appropriate for them. Creating analytic tools that effectively meet each of these criteria requires a combination of state-of-the-art computational approaches and the participatory design methods described earlier. This sets a high bar for translating ideas around Glassbox Analytics into practical applications, which explains their current scarcity.
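As a minimal illustration of what a glassbox metric might look like in code, consider the following hypothetical sketch; the names (AnalyticConfig, GlassboxResult, engagement_score) and the toy engagement metric are illustrative assumptions rather than an existing system from the literature. The design idea it demonstrates is that the analytic result travels with a plain-language record of the data selection, pre-processing and reporting choices that produced it, and that those choices are exposed for students to adjust.

```python
# Hypothetical sketch of a "glassbox" analytic: the result carries an
# explanation of how it was produced, and the choices behind it are
# exposed as a student-adjustable configuration. All names are illustrative.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class AnalyticConfig:
    """Student-adjustable choices behind a toy engagement metric."""
    window_days: int = 7            # data selection: how far back to look
    drop_lowest_days: int = 0       # pre-processing: ignore atypical days
    reference: str = "own history"  # reporting: comparison point for the score

@dataclass
class GlassboxResult:
    value: float
    explanation: list = field(default_factory=list)  # how the value came to be

def engagement_score(minutes_per_day, config):
    """Average daily study minutes, computed per the student's configuration."""
    window = minutes_per_day[-config.window_days:]
    steps = [f"Selected your last {config.window_days} days of activity data."]
    if config.drop_lowest_days:
        window = sorted(window)[config.drop_lowest_days:]
        steps.append(f"Ignored your {config.drop_lowest_days} lowest-activity days.")
    score = mean(window)
    steps.append(f"Averaged the remaining {len(window)} days: {score:.1f} min/day.")
    steps.append(f"This is reported relative to: {config.reference}.")
    return GlassboxResult(value=score, explanation=steps)

if __name__ == "__main__":
    minutes = [30, 45, 0, 60, 20, 0, 50, 40, 35, 55]
    result = engagement_score(minutes, AnalyticConfig(window_days=7, drop_lowest_days=2))
    print(f"Engagement: {result.value:.1f} minutes/day")
    for step in result.explanation:
        print(" -", step)
```

Even in this toy form, the configuration object makes the data selection and processing choices visible and negotiable rather than fixed behind the interface, while the explanation list provides the transparency that explainable analytics call for.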

The final shift needed responds to a growing research base showing that effective use of learning analytics requires more than simply providing a set of tools (van Leeuwen et al. in press). Specifically, students need to understand how the information provided by the analytics relates to learning goals and desired patterns of activity, and they require valid reference points against which to compare themselves (Wise et al. 2016). Furthermore, they need support in moving from the recognition that change is required to forming a plan of action to produce that change, as well as ways to track their progress over time (Wise and Vytasek 2017). While pedagogical support for contextualizing, interpreting and acting on the information has not historically been included as part of analytics implementations, recent work has begun to explore this design space through the use of interpretation guides (Jivet et al. 2020), embedded metacognitive prompts (Yilmaz and Yilmaz 2020) and workshops that orient students to the analytics (Knight et al. 2020).

Conclusion

Analytics can help students better understand how they engage in learning activities and how this engagement leads to particular learning outcomes, and can help them leverage these insights to improve their learning experiences. However, Ifenthaler and Schumacher’s (2016) work, and the ensuing conversations in the field, have made clear that a student-centered paradigm is needed to support learners in accepting analytics into their learning ecosystem. The three shifts described above provide high-level requirements to guide novel learning analytics implementations that empower students to be active analysts and agents of their own learning process. Putting learners (rather than researchers, instructors or machines) in the driver’s seat with respect to the use of their own data gives students a powerful tool and source of information for managing the increased self-regulatory demands of the current digital shift.