Tell Me, Learning Analytics…

Estimated reading time: 3 minutes, 52 seconds

If you don’t already know me, I’m not a fan of learning analytics, for several reasons (and I am a huge fan of Paul Prinsloo’s post on decolonizing learning analytics). I was part of the recent NMC Horizon report expert panel, and analytics technologies are among the short-term trends expected to be adopted in higher education (see the report). What I see most often is talk of using LMS learning analytics to identify learners at risk in order to improve retention numbers. For me, this falls far short of what education should be about. I understand that having large numbers of online students makes it difficult for an instructor to give individual attention to each one (which, you know, is why it’s good practice NOT to ask one instructor to handle large numbers of online students, and instead to give them a supporting instructor or several strong TAs… like, a human solution, but anyway!), but I have strong objections to the premise upon which learning analytics are built.

First of all, learning analytics focus on observable, quantifiable behaviors that are easy for the LMS to collect: when someone logged in, how long they stayed, which tool they used. If analytics tell me someone watched the same video 4 times, we don’t know whether they literally sat and watched it 4 times from beginning to end (though some systems do report how many minutes were watched), or whether they tried again because their internet connection was bad, or because they were distracted by something happening at home, or by something else they were doing on their computer at the same time… we just don’t know. We can know that someone has not submitted their last two assignments. But we don’t know why they didn’t submit them, what kind of barriers they were facing, or how to motivate them in the future. Providing some kind of reminder in order to catch a student before they fail is helpful and all, but someone probably needs to probe deeper into what is behind the behavior the learning analytics flag as problematic.

And students should be afforded some agency and autonomy, so that they, for example, learn to build a time-management system that helps them see when something is due and remind themselves to start working on it before the due date, not on the due date. For example, I benefit a lot from email reminders to submit peer reviews for articles I’m working on, but I also put reminders on my calendar (I’m not the best person at managing my time with all my required tasks, but I’m not the worst). The key thing is that I develop my own system for managing much of what I do, because relying on others to remind me is a crutch.
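
To make the “observable behaviors only” point concrete, here is a minimal sketch (in Python, with entirely made-up names and fields; real LMS data models differ) of what an “at-risk” flag often boils down to: counting events, with no field anywhere for the reasons behind them.

```python
# A hypothetical "at-risk" rule of the kind an LMS dashboard might apply.
# All names and fields here are made up for illustration.

def flag_at_risk(submitted, due, missed_threshold=2):
    """Return True if the student missed at least `missed_threshold` assignments.

    Note what this rule cannot see: WHY a submission is missing.
    Illness, a bad connection, caregiving responsibilities, or confusion
    about the task all look identical from here: a missing event.
    """
    missed = [a for a in due if a not in submitted]
    return len(missed) >= missed_threshold

# Two missing assignments trip the flag, but the data carries no
# information about the human being behind them.
due = ["hw1", "hw2", "hw3", "hw4"]
submitted = {"hw1", "hw2"}
print(flag_at_risk(submitted, due))  # True
```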

Learning analytics give us data on student behaviors, but they do not provide explanations for those behaviors, so they do not tell the whole story of the human being: what motivates them, what barriers or obstacles stand in their way. They do not get to the root of a lack of engagement.

Also, if what happens once a student is “identified” as needing help is automated, that makes me extremely sad, because someone who is falling behind probably needs personal attention and care from the teacher, or at least a TA. I hope that most people who use learning analytics use them that way: offering two-way care, rather than an email or message reminder, or a threat. This JISC report mentions how analytics data are given to both tutors and students themselves, and that simply being aware of risks helps improve student outcomes. It even says students don’t usually need complicated dashboards, just to know that something needs to be done. I find that a bit strange, but perhaps these students know where to go for help/support and therefore get the kind of support they need once they realize something is off. Perhaps.

A lot of the talk about learning analytics does not mention the reductionism of what data gets collected. Only visible behaviors can be captured by learning analytics. This would be like assuming that only the students who spoke in class were there… (though we know that in a classroom, what is perceivable is a lot more, and students speaking is not visible anyway, it is audible. Bad analogy, maybe).

I know there is awareness of the ethical issues related to learning analytics, the collection of student data, and the preservation of student privacy. But that framing just assumes learning analytics are “useful, but…”, and I’m wondering why there isn’t more probing of their limitations in the first place.

3 thoughts on “Tell Me, Learning Analytics…”

  1. Basically, there is a huge lack of theory, as well as of historical educational research knowledge, in the field. At the recent LAK conference in Arizona, Chris Rice, Tanya Joosten, and I were constantly on Twitter, subtweeting the conference away from the official hashtag about this problem. Very basic educational ideas were being presented as if they had just been discovered – things along the lines of “students who read more content scored better on the tests.” But there was no theoretical examination of what that meant, no connection to past research in that area, no critical examination of how limited that finding is in the first place.

    And this is the top learning analytics conference, boasting an incredibly low acceptance rate. Some UTA profs there were expressing their amazement at that: if these are the ones that got in… what got rejected?

    Not to mention that the proceedings from this conference now rank in the top 10 EdTech journals in Google Scholar’s rankings. But read the papers they publish in the proceedings, and a huge chunk of them would not even be considered by any of the other top 10 journals, because they are soooo lacking in theory, in critical examination in the discussion, etc.

    I hate to sound so negative about it… but I raised these same problems at the same conference 3 years ago and basically got “shooshed” over it by some people. Even those who somewhat agreed didn’t really get it. Now more and more people with higher visibility than me are starting to notice as well. If they don’t hit the brakes and deal with this now, it could blow up on them later. Can you imagine the exposé waiting to happen when a top-10-ranked publication is found to have multiple articles that wouldn’t even get a peer review at the other top 10?

    1. Wowww. I always wondered if there was good work on this and I was missing it. I did read stuff by Prinsloo/Slade and Gasevic/Siemens, but that’s older work about ethics. And it’s not merely a question of ethics, right? We need to question the value of all this in the first place!! Why do it at all? Thanks for letting me know I am not crazy here.

    2. Btw, this blog post was triggered by my peer-reviewing an article that relied heavily on learning analytics, in a non-edtech journal, and I raised many concerns. I’ll see how it goes!
