Tailored to Fit: Reflections on learning analytics sessions at Connect More 2025

Summary
Reflections on two practical learning analytics sessions at JISC's Connect More 2025: one from Charlotte Whitaker at Imperial College London on how they're co-creating a learning analytics dashboard with their staff and students, and another by Ethan Henry on how the implementation of a major learning analytics project focussed on student wellbeing has been going at City University (now City St. George's). My main takeaways from the sessions are that, to be effective and have impact, any dashboards and data used for learning analytics need to be co-created, layered and bespoke to the needs of a particular institution and its varied learning analytics users.
I recently had the luxury of attending a few learning analytics-related sessions at JISC's online conference, Connect More 2025.
The most useful sessions for me were grounded in real-world case studies: one from Charlotte Whitaker at Imperial College London on how they're co-creating a learning analytics dashboard with their staff and students, and another by Ethan Henry on how the implementation of a major learning analytics project focussed on student wellbeing has been going at City University (now City St. George's).
Learning analytics dashboards can be a tricky subject. Anyone who has noticed the various locations of our own learning analytics reports and data at Edinburgh has probably had the thought: why haven't we got a dashboard where it's all visible in one place?
Well, the fact is, to be useful, dashboards have to be pretty bespoke to meet the needs of the stakeholders. And to develop that kind of tailored dashboard, either in-house or working with an external provider, costs a lot in terms of money and resources.
And, as yet, we have no good evidence that dashboards are delivering what we hoped in terms of improving academic performance. For more on this, see the Learning Analytics & Knowledge Conference 2024 paper of the year: Have Learning Analytics Dashboards Lived Up to the Hype? (login required).
The mind boggles at the thought of trying to create a learning analytics dashboard that would suit the needs of around 60,000 Edinburgh students, teaching staff, administrative staff and student support staff across our diverse and devolved Colleges, Schools and programmes, both in-person and online, all of whom have their own needs, wants and well-thought-out approaches to learning and teaching their particular courses.
I think this need for a tailored approach is why Imperial started co-creating their dashboard with stakeholders involved right from the start of their learning analytics project. Asking what data people needed for their role and how they were using it allowed Imperial to focus on the most useful key data points from the start. Importantly, they also asked students how they wanted their learning analytics to be used by staff.
According to Whitaker, they then still faced the challenge of layering the dashboard in different ways for the different roles and their needs, not to mention deciding which data each role should and shouldn't have access to.
On top of that, they found that some stakeholders just wanted the simplest, quickest summary of the data (the quick scan persona), while others wanted all the details (the deep dive persona). Feedback also showed that different people wanted to see the dashboard and its visualisations in different ways. So, they designed with more layers.
What this means is that it’s a massive ongoing design project, with the dashboard available for some roles and still in development for others, with feedback and co-creation ongoing. And they’re just getting started with designing the student dashboard, which will kick off with a hackathon later this year.
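To make the layering idea concrete, here's a minimal sketch of how the same record might be rendered at different depths for different roles and personas. To be clear, this is my own illustration, not Imperial's implementation: the role names, field names and the dashboard_view function are all invented for this example.

```python
# Hypothetical sketch of persona-based layering: same underlying data,
# rendered at different depths. All names here are invented.
ROLE_FIELDS = {
    # Role-based access: which data each role may see at all.
    "personal_tutor": ["attendance_pct", "vle_logins", "submission_count"],
    "admin": ["attendance_pct"],
}

PERSONA_DEPTH = {
    "quick_scan": 1,    # just the headline number
    "deep_dive": None,  # everything the role is allowed to see
}

def dashboard_view(record, role, persona):
    """Return only the fields this role may see, trimmed to the persona's depth."""
    allowed = [f for f in ROLE_FIELDS[role] if f in record]
    depth = PERSONA_DEPTH[persona]
    fields = allowed if depth is None else allowed[:depth]
    return {f: record[f] for f in fields}

record = {"attendance_pct": 78, "vle_logins": 9, "submission_count": 4}
print(dashboard_view(record, "personal_tutor", "quick_scan"))  # {'attendance_pct': 78}
print(dashboard_view(record, "personal_tutor", "deep_dive"))   # all three fields
```

The point the sketch tries to capture is that role-based access (what you may see) and persona depth (how much you want to see) are separate layers, which is roughly the distinction Whitaker described.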
Imperial is also now starting to measure the impact. How will they know it's working?
Measuring the impact of any learning analytics programme or dashboard is challenging, particularly in large institutions. You’re essentially looking for what happened (interventions and positive impacts on students) and what didn’t happen (how many students didn’t fail or withdraw because of these interventions).
That's where the City St. George's talk by Ethan Henry came in, with some interesting information about the impact of their LEAP learning analytics project.
So far, their project has prioritised student wellbeing and started with the task of getting essential learning analytics data (attendance and engagement) to people like their student wellbeing officers on a weekly basis using a combination of a dashboard and Excel spreadsheets. Those people then decide, based on a set of bespoke, locally-determined thresholds, when and how to offer support or non-punitive interventions.
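As an illustration of how that kind of weekly threshold check might work, here's a minimal sketch in Python. The column names, thresholds and the flag_students function are hypothetical, invented for this example; the LEAP project's actual pipeline wasn't described in this level of detail.

```python
import pandas as pd

# Hypothetical weekly export of attendance/engagement data.
# Column names and values are invented for illustration only.
weekly = pd.DataFrame({
    "student_id": ["s001", "s002", "s003"],
    "course": ["MATH101", "MATH101", "HIST200"],
    "attendance_pct": [92, 48, 71],
    "vle_logins": [14, 2, 6],
})

# Locally-determined, per-course thresholds: each course team decides
# what counts as a worrying level of attendance or engagement.
thresholds = {
    "MATH101": {"attendance_pct": 60, "vle_logins": 5},
    "HIST200": {"attendance_pct": 75, "vle_logins": 4},
}

def flag_students(df, thresholds):
    """Return rows falling below their course's local thresholds."""
    def below(row):
        t = thresholds[row["course"]]
        return (row["attendance_pct"] < t["attendance_pct"]
                or row["vle_logins"] < t["vle_logins"])
    return df[df.apply(below, axis=1)]

# Wellbeing staff review the flagged list and decide whether and how
# to reach out: the system surfaces candidates, humans intervene.
print(flag_students(weekly, thresholds))
```

Crucially, the output is a candidate list for a human to review, not an automated intervention, which matches the non-punitive, staff-led approach Henry described.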
The interesting thing is, it seems to be working.
Even with just a small segment of staff using the learning analytics data in a structured way, they've been able to gather data showing a strong correlation between data-informed, targeted interventions and support and a significant reduction in non-engagement and poor attendance. They've also correlated increased engagement and attendance with a higher chance of passing a course and a lower likelihood of withdrawing from the university.
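For readers wondering what "correlating attendance with passing" might look like in practice, here's a toy sketch using a point-biserial correlation, a standard measure of association between a continuous variable and a binary outcome. The numbers are invented, and this isn't City St. George's actual analysis, just one plausible way to run that kind of check.

```python
import numpy as np
from scipy.stats import pointbiserialr

# Toy data, invented for illustration: per-student attendance (%) and
# whether they passed the course (1) or not (0).
attendance = np.array([95, 88, 40, 72, 30, 85, 60, 97, 55, 20])
passed     = np.array([ 1,  1,  0,  1,  0,  1,  1,  1,  0,  0])

# Point-biserial correlation between the binary outcome and the
# continuous measure; a positive r means higher attendance tracks
# with a higher chance of passing.
r, p_value = pointbiserialr(passed, attendance)
print(f"r = {r:.2f}, p = {p_value:.3f}")
```

A positive correlation like this is consistent with the pattern they reported, though, as ever, correlation is not causation.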
Basically, there seem to be more effective interventions being made, and it’s having a positive impact on their students.
A learning analytics dashboard has been part of the approach, but their success is, to my mind, down to the human element, a bespoke approach on a course-by-course basis, and their use of other tools alongside the dashboard.
The takeaways for me are that a dashboard on its own isn't any kind of solution, and that if we ever did consider creating an Edinburgh learning analytics dashboard, the complexity of our University would require the same deep co-creation with stakeholders, the same layered and bespoke approach, and a focus on just the data that we know has the most impact.
(Person teaching university students by Yan Krukau and Graphic with Trendlink by champc, both free via Canva, CC0)