
Educational Design and Engagement


Enriching the student learning experience & supporting development of on campus and online courses.

Learning Analytics Summer Institute, 7th July 2013

Learning Analytics has rapidly become a “hot topic” in Higher Education. Our teaching and learning activities online now generate “big data”, and there is a need to manage and understand that data so we can make the best use of it at national, local, institutional and school level. A great deal of work is being done on developing this understanding, ensuring data quality, and developing models for further research. This year the Society for Learning Analytics Research held a number of local one-day events alongside its main conference at Stanford, and Edinburgh hosted the UK-based strand of the Learning Analytics Summer Institute.

This is a very personal “take” on the event, from someone with no expertise in this area – I’m one of those people who prefer to skip the tables of figures when reading a paper, so I approached this day of presentations and discussion with some trepidation.

I need not have worried – the self-proclaimed statistics geeks at this event provided overviews, current research, and top tips on visualisation tools in a transparent and entertaining way. I was especially grateful to Naomi Jeffrey for an incisive whistle-stop tour of current research.

I enjoyed Clare Llewellyn’s overview of her research on using an argumentation framework to analyse data from discussion threads in conventional forums and on Twitter. This, and Duygu Simsek’s work using the Xerox Incremental Parser to apply algorithms identifying key points in academic papers, showed how the technology is maturing to a point where automated analysis can contribute something helpful to research on behaviours online. Nevertheless, a healthy scepticism prevailed, especially in discussion sessions where issues of privacy, and “truth” in data analysis, came to the fore.

These points came up again in the morning discussion session and in the afternoon when Giles Carden illustrated some ways in which senior management use data visualisation to inform policy and strategic developments, while Chris Ballard from Tribal outlined the approach his company takes to predictive learning analytics.


Martin Hawksey and Em Bailey in the spreadsheet play-offs.

In one of the most surprisingly exciting presentations of the day, Em Bailey made Excel spreadsheets attractive and powerful – although I missed the play-off between Excel and Google Docs in one data-crunching session; see the #lasiuk Twitter feed for a blow-by-blow account. Amongst other uses, Em takes the data made available to universities through HESA returns and the NSS, and uses spreadsheets to provide powerful but comprehensible visualisations of comparisons across UK institutions and within her own institution.

There was ample time during the day for small and large group discussion and general networking. It’s clear that while exploring the insights that access to this unprecedented amount of data might give us, we have to ensure that we understand the nature of the data and the nature of the analysis taking place. Participants voiced concerns not only about the ethical issues of privacy and data ownership, but about the interpretation of data and in particular its predictive powers.

Like many other institutions, Edinburgh is exploring the possibility of a “student dashboard”, available to students and their personal tutors, which would provide information about progress within a cohort and alert them to indicators of problems. If we are to expose staff and students to visualisations of data in this way, we need to be very sure that we have equipped them with an appropriate understanding. Especially when dealing with the predictive aspects of analytics, we need to unpick the difference between a “representative student” and a real one. The “typical” student used in any visualisation of a data set has no reality except as a set of trends, and the real individual we deal with should understand that, and that they can always “buck the trend”. The emotional component of providing students with feedback based on performance data was also discussed.
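To make the “representative student” point concrete, here is a deliberately tiny, hypothetical Python sketch (not from the event; the scores are invented): a cohort average can describe no actual student in the cohort at all.

```python
# Hypothetical cohort: five students' marks on one assessment.
scores = [35, 38, 40, 92, 95]

# The "typical" student a dashboard might show is just the mean.
mean = sum(scores) / len(scores)
print(f"Cohort mean: {mean}")  # 60.0

# Yet no one scored anywhere near 60: the cohort splits into a
# struggling group and a high-achieving group, so a visualisation
# built around the "representative student" would mislead both.
nearest = min(scores, key=lambda s: abs(s - mean))
print(f"Closest real score to the mean: {nearest}")  # 40
```

The point is not the arithmetic but the interpretation: a trend line summarises a distribution, and any individual student may sit a long way from it.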

So: my “take-away” from the day is mainly a resolve to properly get to grips with understanding data visualisations, in particular the ways in which information from our VLE and other learning technology systems can be used or abused.

Further reading

For more substantial blog posts, consider Martin Hawksey’s liveblog of the event and Sheila MacNeill’s always-informative posts.

CETIS publications on Learning Analytics

EDUCAUSE publications and research links on analytics and visualisations

A discussion of Cleveland’s Graphical Features Hierarchy

