Recent technological developments have allowed Learning Analytics (LA) researchers to capture the digital traces of students’ learning activities in Virtual Learning Environments (VLEs). These rich and fine-grained data about actual learner behaviours are claimed to offer educators potentially valuable insights into how students react to different learning designs and how ‘at-risk’ students could be supported to complete their studies. Learning Analytics was defined in 2011, for the first LAK conference, as: ‘the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs’. The JISC Learning Analytics in Higher Education report (2016) identified four areas where the use of LA could make a significant contribution in Higher Education institutions:
- A tool for quality assurance and quality improvement: teaching staff using data to improve their own practice.
- A tool for boosting retention rates: institutions using data to identify ‘at-risk’ students and designing interventions at an early stage.
- A tool for assessing and acting upon differential outcomes among the student population: data being used to identify and closely monitor individuals from underperforming groups to improve attainment (for example, using LA to ensure that decision-making on supporting BME (black and minority ethnic) students is evidence-based).
- An enabler for the development and introduction of adaptive learning: students are directed to learning materials on the basis of their previous interactions with, and understanding of, content and activities.
In my role as lecturer and course designer at the Open University (OU), UK, I have been involved with Learning Analytics for a number of years, primarily through my work to improve retention and completion rates (area 2). Research and practice in LA in higher education have raised more questions than they have answered about its impact on online and distance education. The current debate in DMS-DMK orchestrated by Daniel Peraya addresses a number of these questions. This paper – written from the perspective of a distance language educator using Learning Analytics, not that of a researcher in Learning Analytics as such – contributes some reflections on the validity of some current practices that use LA to detect distance learners at risk of not completing their studies.
The Open University, UK, has over 168,000 students studying part-time degrees, postgraduate and sub-degree qualifications, such as certificates or diplomas. Due to funding and regulatory changes in England, retention on a qualification has become a strategic issue for the institution. Retaining students is particularly demanding when they are geographically scattered, studying part-time and typically taking several years to obtain a degree. The institution sees the potential of Learning Analytics to address this complex issue and is investing heavily in a strategic Learning Analytics programme to enhance student success by embedding evidence-based decision-making at all levels. By developing its institutional capabilities in key areas, the Open University can strengthen the foundations for an effective deployment of LA. At the macro level, Learning Analytics are used to inform strategic priorities to continually enhance the student experience, retention and progression. At the micro level, Learning Analytics are used to drive short-, medium- and long-term interventions at the student, module and qualification levels. One of the University’s aims is to develop an ‘analytics mind-set’ throughout the university so that staff incorporate evidence-based decision-making into their day-to-day work. The following section presents two examples of the LA mind-set at the Open University.
At the curriculum level, faculty staff use analytics to inform updates to the learning design and assessment of modules. Academic staff in each faculty across the university are encouraged to use dashboards (Rienties et al., 2018) to closely monitor how students behave on their modules with regard to, for example, usage of content and activities on the VLE, participation in module-wide and tutorial group forums, engagement in induction activities and submission of tutor-marked assignments (TMAs). The latter is particularly significant as it constitutes the principal indicator of students’ progress. After each TMA submission date, academic teams receive data in the form of statistics indicating the level of student engagement with the module. Based on these statistics, a series of interventions is carried out by student support teams, tutors and academic teams, for example, sending reminders to students who have not submitted. TMA submission rate data are compiled over the course of a module and, if a TMA is identified as ‘problematic’ (because of a particularly low submission rate), academic teams may have to review the TMA task(s) with a view to improving submission rates. Such statistics are useful for identifying trends at the level of a whole module or qualification cohort, due to the volume of data they represent. However, they are problematic at the level of individuals: they do not help educators understand change or individual learners’ traits, which is why academics are often hesitant to adapt and re-design assessment on the basis of statistical analysis alone.
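To illustrate the kind of statistic described above, the following minimal sketch computes TMA submission rates across a cohort, flags an assignment whose rate falls below a threshold as ‘problematic’, and lists non-submitting students for a reminder intervention. The column names, the sample data and the 60% threshold are illustrative assumptions and do not describe the OU’s actual pipeline or dashboards.

```python
# Minimal sketch (not the OU's actual pipeline) of submission-rate statistics.
# Column names, data and the 60% threshold are illustrative assumptions.
import pandas as pd

submissions = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3],
    "tma":        ["TMA01", "TMA02", "TMA01", "TMA02", "TMA01", "TMA02"],
    "submitted":  [True, True, True, False, False, False],
})

# Submission rate for each TMA across the cohort.
rates = submissions.groupby("tma")["submitted"].mean()

# A TMA with an unusually low rate would be reviewed by the academic team;
# the 0.60 cut-off here is purely illustrative.
problematic = rates[rates < 0.60].index.tolist()

# Students who have not submitted a given TMA would receive a reminder.
non_submitters = submissions.loc[~submissions["submitted"], ["student_id", "tma"]]

print(rates, problematic, non_submitters, sep="\n")
```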
OUAnalyse (OUA) is a Predictive Learning Analytics (PLA) system designed at the Open University that uses a range of advanced statistical and machine learning algorithms to identify students at risk of not completing their studies so that ‘cost-effective interventions’ can be made. Students normally have to submit between four and six assignments per module. Alongside predictions about whether a student will submit their next TMA, the system also provides information about whether students are likely to complete a course: the ‘overall’ prediction about a student’s performance. OUA was designed as a tool that would inform tutors about their students’ behaviour and motivate them to take action when students are at risk of not submitting their next assignment. The broader objective was to increase students’ retention and completion of their studies. A number of OU studies (Herodotou et al., 2019) report that the more teachers make use of OUA data, and the more successful students were in previous courses, the more likely students are to complete and pass a course. However, the studies also find that many teachers seemed reluctant to engage with the system on a weekly basis. The authors believe that their findings are significant as they start to shed light on the rather limited understanding of how teachers’ usage of LA may relate to students’ performance. They also warn that any causal interpretation of the findings should be treated with caution, pointing out that additional variables not captured in the regression analysis may have influenced or explained student learning outcomes. Overall, in my faculty, the attitude towards OUA is comparable to the findings of Herodotou et al. (2019). Tutors on language modules hardly use the PLA system, partly because they have not been trained to do so but, most importantly, because they rely on their own systems for monitoring their students’ progress and do not see the advantage of using what they consider to be a complicated and time-consuming tool. In my faculty, we normally use OUA for new modules when we need to monitor the behaviour of students in relation to new assessment. We have not yet noticed an improvement in retention on those modules directly linked to the use of OUA. As several interventions operate on new modules, it is difficult to evaluate which intervention has the most impact on retention.
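As a rough illustration of the kind of prediction a PLA system makes, the sketch below trains a simple classifier to estimate whether a student will submit the next TMA from a few engagement features, and flags a low probability as ‘at risk’. The features, sample data and the choice of logistic regression are my own assumptions for illustration; OUAnalyse itself combines a range of statistical and machine learning algorithms over much richer data.

```python
# Illustrative sketch of next-TMA submission prediction; not OUAnalyse itself.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per student: VLE clicks in the last four weeks,
# forum posts, and the previous TMA score (0 if not submitted).
X_train = np.array([
    [120, 5, 78],
    [ 15, 0, 40],
    [ 90, 2, 65],
    [  5, 0,  0],
    [200, 8, 85],
    [ 30, 1, 52],
])
# 1 = submitted the next TMA, 0 = did not submit.
y_train = np.array([1, 0, 1, 0, 1, 1])

model = LogisticRegression().fit(X_train, y_train)

# Probability that a currently enrolled student will submit the next TMA;
# a low probability would flag the student as 'at risk' for the tutor.
current = np.array([[20, 0, 45]])
p_submit = model.predict_proba(current)[0, 1]
print(f"P(submit next TMA) = {p_submit:.2f}, at risk: {p_submit < 0.5}")
```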
The reflections below on my personal experience with analytics address some of the questions raised by Daniel Peraya (2019a and 2019b) and tie in with some of the challenges discussed in previous papers on this topic.
From an academic point of view, Learning Analytics are regarded as being more concerned with aspects of institutional business such as recruitment and financial stability, and less related to learning and pedagogy. Academic staff like myself often feel that higher education institutions in the UK are increasingly managed like businesses and that decisions are more and more frequently financially driven. Reports commissioned by various educational bodies conveniently indicate that Learning Analytics will help develop a more student-focused higher education provision and provide data and tools that institutions will be able to use for continuous improvement, for the benefit of students. Decision-makers in higher education hope that continuous investment in Learning Analytics will lead to better outcomes for students, universities and the wider society. In my experience, Learning Analytics is used to make decisions to alter learning design and to send interventions with a view to improving retention, thereby securing financial stability. However, several potential issues with regard to LA for retention can be identified and are highlighted in the following sub-sections.
The idea that measurement of students’ participation through VLE access, submission of assessments and other data can be used as a proxy for learning, and hence for likely academic success, is one of the key concepts of Predictive Learning Analytics (Herodotou et al., 2019). Overall, studies on Learning Analytics confirm the assumption that the students who participate more are likely to perform better. Similarly, there is evidence that confident and successful students tend to use the learning tools available to them to a greater extent than struggling students (Salmon, 2003). It seems therefore that academics are led to make decisions on learning design or assessment based on the behaviours of students who display some engagement with their studies in the first place. Yet, retention in distance learning is often a problem specific to students who do not participate in induction activities and do not engage with the VLE and other online course materials. Therefore, the massive amounts of data generated through Learning Analytics from students who engage with the course are unlikely to help with identifying the reasons why certain students disengage from their studies, and what institutions can do to support them. A low TMA submission rate does not necessarily mean that the TMA is badly designed; it indicates only that a high percentage of students have failed to attempt the TMA, and the statistical data generated by Learning Analytics do not provide the reasons why so many students failed to submit it. In this case, analytics provide data that are difficult to interpret because they are taken out of context, and that are therefore not scientifically valid.
I therefore concur with Luengo (Peraya and Luengo, 2019), who argues in favour of collecting thick data (les données épaisses), which contain evidence of activities related to learning and usage of materials in context. From an academic point of view, making pedagogical decisions and altering assessment, for example, on the basis that a high percentage of students have not submitted a particular assignment is not pedagogically sound. Qualitative data are needed to support and complement the quantitative data. I believe that there is a need for more inductive research to create meaningful educational interventions to improve retention. This viewpoint also ties in with the sociocultural approach to LA recommended by Ferguson and Buckingham Shum (2012).
Learning Analytics for distance learning can be a useful and powerful tool to compensate for the signs and clues that face-to-face educators have the opportunity to see but that we cannot. However, as discussed earlier, large datasets are unlikely to provide useful information for every type of student. The analysis required is complex and end users do not always have the digital skills or software knowledge to undertake it (Buckingham Shum et al., 2019). Generally, in my experience, data are compiled and presented by data wranglers who have already processed the raw data to make it more legible; nevertheless, it is often difficult to understand a ‘second-hand’ dataset that has already been partially interpreted. The researcher or the end user needs to have confidence in the data presented to them in order to make their own interpretations and decisions about interventions. There is often potential for misinterpretation, a lack of coherence in the sheer variety of data sources, overly complex systems and information overload. The interpretive dimension of LA analysis as a potential challenge is covered in Pierrot’s (2019) paper, which clearly stresses that interpretation transforms data into knowledge, where knowledge results from a co-construction of interpretations across stakeholders.
Most reviews of the uptake of learning analytics, and more specifically PLA (Ferguson and Clow, 2017; Rienties et al., 2016), claim that the actual uptake and integration of learning analytics in most institutions is rather limited. Researchers in LA and institutions raise the need to unpack how PLA is perceived by different stakeholders within HE, and to identify the factors that may encourage or prohibit wider adoption of PLA. Reviews and reports on the integration of learning analytics recommend that universities seek to understand the perceptions of the stakeholders involved, with a view to bringing to light issues that potentially prevent the wider adoption of PLA. They particularly point to ways in which organisational culture is resistant or unwilling to change. Herodotou et al. (2019), for example, adopted the ‘technology acceptance model’ and the ‘academic resistance model’ as key conceptual frameworks for their study of how teachers use, interpret and integrate OUA in their teaching practices. They found that teachers hold cognitive beliefs about the way they should support distance learners. For example, teachers would like to receive data that are more sensitive to which materials or activities students are accessing, and what they do with them, and to have a user-friendly system that would replace their own Excel sheets for monitoring students’ progress. The authors claim that teachers’ cognitive beliefs, as well as concerns raised in relation to OUA functionality, could be viewed as manifestations of teachers’ cognitive resistance to using the tool. The study also suggests that the lack of interest in engaging with and adopting OUA may relate to other factors, such as limited time available to engage with tasks that fall outside of teachers’ contracts, and perhaps teachers’ lack of ‘trust’ in OUA data as opposed to information received directly from students via emails, phone calls or during tutorials. Finally, they see it as also related to teachers’ lack of competency in interpreting OUA insights and taking appropriate actions. The study found a discrepancy between teachers’ potential interest in using OUA in their teaching practices and their limited and rather infrequent use of it. Because they found that the more teachers make use of OUA data, and the more successful students were in previous courses, the more likely students are to complete and pass a course, Herodotou et al. (2019) suggest that a clear and supportive management and professional development structure needs to be put in place to empower teachers to pro-actively help students flagged as ‘at risk’. Invariably, studies conclude that, despite a lack of consensus, greater OUA usage is found to predict better completion and pass rates, suggesting that systematic engagement with OUA should become a significant aspect of teaching practice as it can improve student performance. Herodotou et al. (2019) even recommend that OUA should be included in contractual agreements as a tool that can support students’ learning and inform teaching practices.
The ‘analytics mind-set’ is regarded as a top-down initiative and seems relatively removed from the OU tutors who, for the reasons evoked above, are not yet engaging with OUA. Some further thoughts on teachers’ engagement with LA follow in the conclusion.
In distance education, teachers always have a support role and a responsibility for students’ completion, as they provide the human face of the institution. In that sense, at the Open University, for example, tutors have over the years always monitored their students’ behaviours and followed their progress, designing their own systems for monitoring progress and tracking non-submission of TMAs. OU teachers have knowledge of the areas that students find problematic and can provide useful information to address retention. The OU would be wise to make use of the expert knowledge that tutors have about distance learners to complement the expertise of learning analysts and data wranglers in designing and implementing LA tools able to capture the most relevant data to support students in improving their performance.
Involving support staff and tutors in the design of the implementation is crucial. However, the impact on support staff and tutors of the increased workload arising from the additional use of an alert system needs to be considered, especially at busy times such as the induction period at the start of students’ studies and at the start of modules. Herodotou’s study revealed that many teachers seemed reluctant to engage with the system on a weekly basis.
The expertise of student-facing staff is important to ensure that interventions are appropriate and genuinely helpful to students. At the OU, student support staff use learning analytics to inform proactive interventions across the cohort based on agreed criteria, in addition to reactive support and other existing initiatives.
As Romero (2019) suggests, teachers must be involved in the process of LA, from the conception of methods for data collection to its analysis, if the data collected through LA are to provide meaningful and relevant indications for teachers to take the right course of action. Buckingham Shum et al. (2019) refer to human-centred Learning Analytics. This approach takes the form of a carefully designed strategy to engage educators across a university with LA, building their capacity through collegial training programmes that support action research. Their view, supported by faculties to my knowledge, is that it is more sensible to change the tools to suit the users than to change the users to suit the tools. Non-LA experts are unlikely to be aware of the implications of different design choices, the potential of different analytic techniques, and the constraints on implementation, and involving stakeholders may be perceived as difficult, time-consuming and expensive. Nevertheless, involving them throughout the design process can make the difference between an unsuccessful project and a system that is taken up successfully and embedded in teaching practices.