Innovating Pedagogy 2017 – Student-Led Analytics

I always find that whenever I start to research a topic for this website I suddenly feel like I am untangling a ball of wool. I see a tiny bit sticking out of the ball and think ‘that will be long enough for a waffle’, then start to pull it away from the ball, only to find that it gets longer and longer. Soon I encounter a knot which has to be unravelled, and I spend several hours following the piece of wool under, over and around numerous other threads, trying desperately not to lose the strand which was my initial topic. Staying with the Innovating Pedagogy Report 2017, I wanted to waffle about another topic – but this one took me deep within the ball of wool and, in fact, as I write this, I am still trying to unravel it. What am I talking about? Well, this week it’s all about Student-Led Analytics.

I recently tried out a toolkit which calculates your digital capability ratings. Although this was a pilot – I think – I was really disappointed with the results. I consider myself a ‘technology geek’ and yet, after completing the numerous questions, my final result was presented to me: 58%. Yes, you didn’t misread that – two areas of my digital self were rated at 58%. I was devastated! Percentages, numbers and analytics have a habit of doing this: they take all your experiences and aptitudes and condense them into a single number, which can then either inspire or disappoint. In preparation for this post, I was reading an article in the Times Higher Education about a student who failed to abide by the attendance policy of a university. The title alone worried me, especially when it mentioned the word ‘entrap’. The article tells how analytics were used to impact negatively on the student’s time at the university and, although this is only one side of the story, it did get me thinking about the power of analytics and whether there is a difference between analytics and student-led analytics.

In 2016, the Higher Education Commission produced a paper aptly called ‘From Bricks to Clicks’. Reading the executive summary (reading the whole report would have involved unravelling even more of the ball of wool), the ‘major motivations for introducing learning analytics’ were stated as ‘Increasing retention, Providing better feedback to the students, Capturing attendance data and Enhancing Teaching and Learning’. I must admit that I was impressed by these initial motivations. Many were focused, directly or indirectly, on the student and their learning, and that would definitely be a positive. Indeed, this is further supported by an article in the IEEE Transactions on Learning Technologies, 2017, titled Towards Actionable Learning Analytics Using Dispositions, in which the authors encourage us to:

Imagine a world in which learning activities of students can be effectively deployed to provide personalized intervention. Imagine a world in which educators both identify behavioral patterns of “at risk” students and uncover why those students are struggling in order to provide adaptive, individualized “actionable feedback”

From reading all this literature, despite my 58% in my capability report, I was becoming more and more encouraged by the world of analytics, whether student-led or learning analytics in general. However, as always, I started to consider the points and criticality raised its head. Engagement is something which I always struggle with – not my own engagement, although I guess that is true at times, but how to measure it accurately. I mentioned in my previous post about Spaced Learning how I use the emergence of mobile phones in my sessions as a cue that engagement has lapsed, usually because I have talked for too long. But does actually attending a session guarantee engagement? Even the quietest learner could be completely engaged, thinking rather than contributing, while other learners can be present yet not learning or thinking about the content at all.

The same could be said for clicks on links on the VLE. I was once accused in a training session of clicking my own website numerous times in order to get the ‘hits’ up. Although I was rather taken aback by this, it does actually support my point: clicking on a link does not demonstrate engagement, it demonstrates a click. If we are measuring ‘clicks’ and attendance, what are we actually measuring, and what is the impact of this data? I understand that lack of attendance or failure to engage with a click can be an indicator that something might be going amiss with a learner, and this data is important as an early indication of students’ activity, but we need to look a lot wider in order to gain a more valuable data set and hence better indicators.

The Innovating Pedagogy Report itself comments that ‘…analytics make use of the data generated during study activity in order to enhance learning and teaching’. For me, the important phrase here is ‘during study activity’. This requires more than just attendance and/or clicks to be recorded. We need to look at all aspects of learning in order to gain a complete picture of what is happening and, ultimately, how we can support the learner. In April 2016, Jisc produced a report called ‘Learning Analytics in Higher Education’. On page 19 of the report, I found something which I felt was a representation of the sort of data I would want collected to support learning. I’m not sure if I can reproduce it here, so do go and have a look. The data collection involved student information, VLE data and information from the library. The latter really interested me, because I would really like to know what the students are reading and engaging with. This would enable me to support their learning either by redirecting them or by recommending more up-to-date resources; it would also support library acquisitions. However, there was one other type of data collected which really put the ‘cherry on the cake’.
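Before I get to that final ingredient, let me try to make the ‘clicks’ problem concrete. Below is a minimal, entirely hypothetical sketch of a click-based ‘engagement’ score – every name and number is my own invention rather than anything taken from the reports above – which shows how two very different learners can look identical on paper.

```python
# A deliberately naive "engagement" metric built only from clicks and
# attendance counts. All names and numbers here are hypothetical,
# for illustration only -- none come from the reports mentioned above.

vle_logs = [
    {"student": "A", "clicks": 120, "sessions_attended": 10},
    {"student": "B", "clicks": 120, "sessions_attended": 10},
]

def naive_engagement(record, max_clicks=150, max_sessions=12):
    """Score a student purely on activity counts (0-100)."""
    click_part = min(record["clicks"] / max_clicks, 1.0)
    attendance_part = min(record["sessions_attended"] / max_sessions, 1.0)
    return round(100 * (click_part + attendance_part) / 2)

for record in vle_logs:
    print(record["student"], naive_engagement(record))

# Both students score an identical 82, yet one may have skimmed every
# link while the other worked carefully through each resource: the
# metric measures clicks, not engagement.
```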

If we are collecting data to support students’ learning then they must have an input into the system. The Jisc model has ‘self-declared data’ inputted into the system by the learners themselves. To me, this is perhaps one of the most important pieces of data. Yes, and I will say it before you all shout it at the screen, some learners might not be truly accurate with their data, or they might have to learn how to self-assess correctly, but data straight from the learners is something which would be truly valuable. All the data would then be available both on a staff dashboard – essential for academic tutor meetings, so points for discussion could be raised – and on a student app which the learners themselves can see and engage with. With all this in place it would be possible to get a better indication of the holistic learner and, perhaps more importantly, how to support them, whether that is recommending different reading or seeing why they are finding sessions difficult to attend.
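Purely as a sketch – the Jisc report doesn’t prescribe any particular implementation, and every field name below is my own assumption – a holistic learner record combining those four sources, surfacing talking points for a tutor meeting rather than a single percentage, might look something like this:

```python
from dataclasses import dataclass, field

# A hypothetical learner record combining the four data sources in the
# Jisc model: student information, VLE data, library data and
# self-declared data. All field names are illustrative assumptions.

@dataclass
class LearnerRecord:
    student_id: str
    programme: str                                     # student information
    vle_clicks: int                                    # VLE activity
    sessions_attended: int                             # attendance data
    library_loans: list = field(default_factory=list)  # library data
    self_declared: dict = field(default_factory=dict)  # from the learner

    def talking_points(self):
        """Surface prompts for a tutor meeting, not a single score."""
        points = []
        if self.self_declared.get("confidence", 5) < 3:
            points.append("Learner reports low confidence - worth asking why.")
        if not self.library_loans:
            points.append("No library activity - recommend some reading?")
        return points

record = LearnerRecord(
    student_id="s123", programme="BSc Mathematics",
    vle_clicks=42, sessions_attended=8,
    self_declared={"confidence": 2},
)
print(record.talking_points())
```

The design choice here mirrors the point of the post: the record deliberately produces conversation starters for a human tutor rather than reducing the learner to one number.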

Learning analytics always have to be viewed as a supportive tool. Data itself doesn’t provide the answers; it only identifies what is happening. The interpretation of the data is the important factor, and this is why there always needs to be a human within the system to engage with it, analyse what is happening and decide how to move things forward. Looking back on my 58% – yes, I need to mention it again – I could spend time criticising the survey or the tool, but really and truly I should be looking at the next steps to see how to improve. I think the 58% demonstrates my point about learning analytics very well. Hopefully, if you knew me, you would not be giving me 58% for technology, because you would have engaged with the human element. Being a mathematician I know that numbers are great, but when it comes to analytics, in my humble opinion, they are just numbers and should be treated as such. Behind them is a person/learner, one whom I hope to support more effectively the better I know them – and analytics just provides me with another layer of information which I need to interpret and engage with.

I am sure that many of my colleagues have their own views about learning analytics, or about my thoughts on it, and, if you do, please sign up to the site and add them in the comment box below. You might even be rewarded with a waffling open badge – although they are worth nothing at all! As always, all thoughts are welcome, and if you want to get an email when I next post then please remember you can subscribe to the site by adding your email in the box in the footer.

So until my next waffle, have fun and I’ll catch you later and, until then, consider yourself waffled! Oh, by the way – your attendance at this blog post has been duly recorded…