Week 11: Analogue Learning Analytics (Discussion) (Part 2)
As seen in Table 1, the approximate time I spent on the various activities over four days was:
- 1 hour on Moodle;
- 5 hours on MyEd;
- 0.75 hours on sites that were not specifically academic (e.g. Google, YouTube);
- 4 hours on Open Notes (PDF reader); and
- 0.75 hours on WordPress (blog).
The data provides a few interesting insights.
If a teacher were to use the analytics provided by Moodle, he or she would see only that I spent one hour on Moodle. This could lead him or her to infer that I was largely “inactive” in my studies. However, this was only about 8% of my study time over the four days. Despite being the LMS, and arguably the main site, Moodle was where I spent the least amount of time. Moreover, I used it mainly for administrative rather than learning purposes.
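The share quoted above can be checked with a few lines of Python. This is a minimal sketch, assuming the tracked times from the list (with the non-academic browsing and WordPress entries taken as 0.75 hours each); the platform names are just labels, not anything an analytics tool would report:

```python
# Rough share of tracked time per platform over the four days.
# Figures are taken from the list above; the two sub-hour entries
# are assumed to be 0.75 hours each.
hours = {
    "Moodle": 1.0,
    "MyEd": 5.0,
    "Non-academic sites": 0.75,
    "Open Notes": 4.0,
    "WordPress": 0.75,
}

total = sum(hours.values())  # total tracked hours

# Percentage of tracked time per platform, rounded to one decimal.
shares = {site: round(100 * h / total, 1) for site, h in hours.items()}

for site, pct in shares.items():
    print(f"{site}: {pct}%")
```

Under these assumptions Moodle accounts for under a tenth of the tracked time, which is what a Moodle-only dashboard would miss entirely.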
It may be difficult for learning analytics to provide accurate figures for different online learning activities because they can overlap. I spent roughly the same amount of time reading (4 hours) and writing (3.75 hours), but these activities were not done in isolation: there were times when I read while I was writing, and other times when I wrote as I read.
Similarly, it was difficult to track precisely how long I used each platform or application. At times I was multi-tasking between two platforms, and I also faced distractions when I was using YouTube.
Lastly, and cynically, my study style meant that my tutor could have dispensed with Moodle and simply sent me a PDF file with instructions and a reading list. Had he or she sent me the actual articles, or a link to a Dropbox folder, my time spent on MyEd and search engines would also have been greatly reduced.
Recommendations & limitations
Teachers could make LMSes a bigger part of a student’s learning and, in return, receive more accurate readings. Aside from using discussion forums, students could read texts on the LMS site itself, which could also help them better catalogue their readings. This would, however, make the learning less “personalized”, which Wilson, Watson, Thompson, Drew and Doyle (2017) note is one of the persuasive arguments for learning analytics (p. 993).
To collect more accurate data on learning activities, learning analytics may need to become more pervasive. This informal study highlights the limitations of self-reporting by students, but addressing them would mean using technology to collect the data instead. In hindsight, I could also have devised different ways to document or track each activity instead of relying on my phone’s timer for all of them. This limitation supports Wilson et al.’s (2017) argument against using “one generic approach” to suit all contexts.
I definitely had the luxury of personalizing my own schedule: I could take short breaks, by watching music videos on YouTube, when I felt tired. However, I was aware that I was tracking my own activities, so I did not want to report being distracted for too long. I can see how students might be wary of learning analytics because “Big Brother” is always watching. Online learning gives students the autonomy to decide their own learning approach, and taking breaks here and there might be helpful, but an institution that views learning through an industrialist lens might think otherwise; in that view, breaks would be seen as unproductive. Critics of learning analytics have called for teachers to be more involved, but we should also include the perspective of students when determining how to fruitfully and fairly utilize learning analytics for learning.
References
Wilson, A., Watson, C., Thompson, T. L., Drew, V., & Doyle, S. (2017). Learning analytics: Challenges and limitations. Teaching in Higher Education, 22(8), 991–1007.