
The Case for Staff Training: LAP2025 takeaways and reflection

In June, the Society for Learning Analytics Research (SoLAR) hosted its 24-hour, round-the-world, all-online Learning Analytics in Practice 2025 conference.

This is the second time they have run this event, and it’s one of my favourite learning analytics events because of its ambitious format and its practical and applied focus.  

The event runs for 24 hours straight and is hosted in successive locations around the world, starting in Australia and ending in North America. The schedule can be a bit much to get your head around, but recordings are made available, so the format offers great flexibility in when you attend, and you get some truly global perspectives in the presentations.

As one would expect, the ‘menu’ for a 24-hour conference offered a range of topics, with a strong emphasis this year on AI for learning analytics and various types of learning analytics dashboards. 

I had the privilege of co-leading, with Stuart Nicol, our Head of eLearning Services, a discussion exploring the challenges institutions face in developing learning analytics training for staff and students. There were also two other sessions specifically about learning analytics training: one for staff in the USA and one for students in Ireland.

It was interesting to see the students’ perspectives on their learning analytics from Dublin City University. A primary student concern was making sure staff understand the importance of fairness, context, transparency and the data’s limitations when using learning analytics data. That argues for ensuring those areas are addressed in staff training.

Something I didn’t expect from the training-focussed sessions related to dashboards, a topic that still managed to compete with AI for screen space at this event. (There were sessions about student dashboards, staff dashboards, subject area dashboards, course design dashboards and even some about using AI digital assistants with dashboard data to personalise learning.)

What came up in the training discussions is that some institutions are currently focused on developing dashboards so easy to use that no training is needed, which spares them from spending scarce resources on creating and running training sessions.

I can see the logic given the current sector-wide focus on efficiency and potential downsizing.  

But the whole idea still made me pause.  

 

To my mind, you still need training, even with the most amazing dashboard in the world.  And to completely skip learning analytics training because you have a snazzy dashboard is pretty risky. 

If humans are doing the actual work, then we’re all better off if the humans know what they’re doing.  

 

Here are a few things humans can do that learning analytics dashboards can’t. 

 

1. Look at the context of the data.

Often, the data is just a number counted by a machine. The machine can’t explore the data’s context, which is based, quite literally, in individual courses and individual students. But the context is essential to understanding what the data is (or is not) telling you. 

One of the more common student concerns is fairness — will staff just look at the numbers or will they take context into account? Even the best of dashboards would struggle to do this.

Practical training and guidance on considering context can help staff do this well.

 

2. Think critically about the data.   

How was it collected and measured? What are its limitations?  Does the label match what it’s actually measuring (a common issue with LMS data)? Does that metric actually answer the question being asked (relevance)? What other data points are needed to get a complete picture? Is there likely to be bias in that data? 

The dashboard is just a dashboard, and it can’t answer these critical questions. A human can…but they may need platform-specific information, training and guidance to do so. 

 

3. Consider pedagogical context and use.  

The data fed into the dashboard was generated within courses taking a certain pedagogical approach, whether flipped, collaborative, discussion- and experience-based, enquiry-based or something else.

Humans can look at the pedagogical context of the learning data, whereas a dashboard often can’t really take that into account unless that information has been included in its dataset.

With training and practice, a human can consider how the pedagogical approach might reflect in the data, whether the data is even valid for that approach, and what, if any, data-informed actions can be taken within that pedagogical approach. 

 

4. Consider the ethics.  

Ethically, a human needs to be involved in interpreting and applying learning analytics data, determining appropriate uses, and considering data protection, privacy and bias issues.  Dashboards can actually help with some of this, for example by aggregating and anonymising data where appropriate.  

But the ethics part of using learning analytics still falls squarely within human responsibilities. And most people need or want a little guidance and training to do that.  
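
For the technically curious, here is a minimal sketch of the kind of aggregation-with-suppression I have in mind. Everything in it, from the record layout to the threshold of five, is an assumption for illustration, not how any particular dashboard actually works.

```python
from collections import defaultdict

# Hypothetical illustration: roll per-student VLE login counts up to
# course level, suppressing any course with fewer than K students so
# that individuals cannot be picked out of a tiny group.
# K = 5 is an assumed threshold for illustration only.
K = 5

def aggregate_logins(records):
    """records: iterable of (course_id, student_id, login_count) tuples."""
    per_course = defaultdict(list)
    for course_id, student_id, logins in records:
        per_course[course_id].append(logins)

    summary = {}
    for course_id, counts in per_course.items():
        if len(counts) < K:
            # Too few students: report nothing rather than risk re-identification.
            summary[course_id] = "suppressed (small group)"
        else:
            summary[course_id] = round(sum(counts) / len(counts), 1)
    return summary

sample = [
    ("HIST101", "s1", 12), ("HIST101", "s2", 7), ("HIST101", "s3", 20),
    ("HIST101", "s4", 3), ("HIST101", "s5", 15),
    ("PHIL200", "s6", 9), ("PHIL200", "s7", 11),  # only two students
]
print(aggregate_logins(sample))
# {'HIST101': 11.4, 'PHIL200': 'suppressed (small group)'}
```

The suppression threshold exists because averages over very small groups can still identify individuals, and judging when a number is safe to show is exactly the kind of call a human needs training to make.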

 

To put it simply, a learning analytics dashboard is a tool.  A relatively new tool. Like other tools, it comes in different sizes, shapes, designs and quality levels. As with other tools, it’s often very possible to do the job without having the fanciest, most expensive tool.   

But the tool is not the craftsman. It is not the artisan. The dashboard is not the teacher or the student.    

I think having the tool does not remove the need for a human in the process. And if a human needs to use the tool, they need to at least be shown how and given the information to do the job well.  

 
