If you don’t already know me, I’m not a fan of learning analytics for several reasons (and I am a huge fan of Paul Prinsloo’s post on decolonizing learning analytics). I was part of the recent NMC Horizon report expert panel, and analytics technologies are among the short-term trends expected to be adopted in higher education (see the report). What I see most often is talk of using LMS learning analytics to identify learners at risk in order to enhance retention numbers. For me, this falls so short of what education should be about. I understand that having large numbers of online students makes it difficult for an instructor to give individual attention to each one (which, you know, is why it’s good practice to NOT ask one instructor to handle large numbers of online students, and give them a supporting instructor or several strong TAs… like, a human solution, but anyway!) but I have strong objections to the premise upon which learning analytics are built.
First of all, learning analytics focus on observable and quantifiable behaviors that are easy for the LMS to collect: when someone logged in, how long they stayed, which tool they used. If analytics tell me someone watched the same video 4 times, we don’t know whether that means they literally sat and watched it 4 times from beginning to end (though some systems do report how many minutes were watched), or whether they tried again because their internet connection was bad, or because they were distracted by something happening at home, or by something else they were doing on their computer at the same time… we just don’t know. We can know that someone has not submitted their last two assignments. But we don’t know why they didn’t submit them, what kind of barriers they were facing, or how to motivate them in the future. Providing some kind of reminder in order to catch a student before they fail is helpful and all, but someone probably needs to probe deeper into what is behind the behavior the learning analytics flag as problematic.

And some kind of agency and autonomy should be afforded to the students, so that they, for example, learn to build a time-management system that helps them determine when something is due and remind themselves to start working on it before the due date, not on the due date. For example, I benefit a lot from email reminders to submit peer reviews for articles I’m working on, but I also put reminders on my calendar (I’m not the best person at managing my time with all my required tasks, but I’m not the worst); the key thing is that I develop my own system for managing much of what I do, because relying on others to remind me is a crutch.
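To make the reductionism concrete, here is a minimal sketch of the kind of rule-based “at risk” flagging described above. This is hypothetical: the field names, thresholds, and the `flag_at_risk` rule are invented for illustration, not taken from any real LMS. What’s notable is how little the rule can actually see:

```python
# A hypothetical sketch of behavioral "at risk" flagging.
# All names and thresholds are invented; no real LMS is being described.

from dataclasses import dataclass
from typing import List

@dataclass
class StudentActivity:
    """Only what an LMS can log: events, not reasons."""
    student_id: str
    logins_last_week: int
    video_views: int          # counts plays, not attention or bad connections
    missed_assignments: int   # counts gaps, not the barriers behind them

def flag_at_risk(activity: StudentActivity) -> bool:
    # A typical threshold rule: purely behavioral, with no context.
    return activity.missed_assignments >= 2 or activity.logins_last_week == 0

students: List[StudentActivity] = [
    StudentActivity("a", logins_last_week=5, video_views=4, missed_assignments=0),
    StudentActivity("b", logins_last_week=0, video_views=0, missed_assignments=2),
]

flagged = [s.student_id for s in students if flag_at_risk(s)]
print(flagged)  # the flag says nothing about *why* a student disengaged
```

Everything upstream of the flag is an observable count; nothing in the data can distinguish a bad internet connection from a family crisis from boredom.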
Learning analytics give us data on student behaviors, but they do not provide explanations for those behaviors, so they do not tell the whole story of the human being: what motivates them, what barriers or obstacles stand in their way. They do not get to the root of a lack of engagement.
Also, if what happens once a student is “identified” as needing help is automated, that makes me extremely sad, because someone who is falling behind probably needs personal attention and care from the teacher, or at least a TA. I hope that most people who use learning analytics use them that way: offering two-way care rather than an email or message reminder or a threat. This JISC report mentions how analytics data are given to both tutors and students themselves, and that simply being aware of risks helps improve student outcomes. It even says students don’t usually need complicated dashboards, just an indication that something needs to be done. I find that a bit strange, but perhaps these students know where to go for help/support and therefore get the kind of support they need once they realize something is off. Perhaps.
A lot of the talk about learning analytics does not mention the reductionism of what data get collected: only visible behaviors can be captured. This would be like assuming that the only students present in class were the ones who spoke… (though we know that in a classroom, what is perceivable is a lot more, and speaking is not visible but auditory, so maybe it’s a bad analogy).
I know there is awareness of ethical issues related to learning analytics, their collection of student data, and the preservation of student privacy. But that framing just assumes learning analytics are “useful, but…”, and I’m wondering why there isn’t more probing of their limitations in the first place.