
Continuing the Spotlight on LOUISA (Learn Optimised for In-course Assessment) series, Emma Horrell describes how the LOUISA team applied User Experience (UX) research methods and techniques to prioritise ways to improve assessment and feedback, highlighting the value of this type of data-driven, partnership approach to change. Emma is the User Experience (UX) Manager within Information Services Group.
What is UX?
User Experience (UX) is defined in an ISO standard on the ergonomics of human-system interaction and human-centred design for interactive systems. The standard describes ‘user experience’ as a ‘person’s perceptions and responses resulting from the use and/or anticipated use of a product, system or service’. Increasingly, UX is used as a catch-all term for a discipline that uses research data and evidence from the way people interact with services, systems and products to improve them by design.
UX research and design methods have a track record of guiding improvements to Learn, the University of Edinburgh’s primary virtual learning environment. In the multi-year ‘Learn Foundations’ project, a combination of UX techniques (including top tasks survey research, semi-structured interviews, card sorting and usability testing) was used to uncover which tasks mattered most to students and staff when using Learn, to identify ways to make it easier for them to complete these tasks, and to test and validate the resulting improvements. Follow-up projects used UX methodology to ensure Learn continued to support students and staff with their online learning and teaching tasks during the pandemic, and when Learn was upgraded to a new product, Learn Ultra.
UX design is commonly associated with UI (User Interface) design. Because user experiences typically play out on digital user interfaces, there is a misconception that creating a good user experience begins and ends with the interface alone. In reality, good user experiences are made up of multiple contributing factors. In the context of online learning and teaching, a good experience for students depends, for example, on content being consistently written in a way they can understand, while a good experience for staff depends on their having all the information they need to carry out routine set-up tasks efficiently. With this in mind, previous UX work on Learn has taken a holistic research approach, focused on building a deep understanding of the perspectives, working methods, expectations and perceptions of both students and staff before making recommendations for improvement. The UX research programme for the LOUISA project was no exception.
UX research shapes LOUISA
In contrast to previous projects, however, the LOUISA UX strand focused on one particular process – the end-to-end assessment and feedback workflow – beginning with preparation for assessment and ending when marks were confirmed and added to students’ academic records and feedback was made available to students. Initial research with staff used mapping to visualise the assessment and feedback procedures followed in different Schools, which revealed processes that had been refined, modified and customised over time to suit the needs of specific types of course and their related assessments.
To understand the student perspective, semi-structured interviews (in which a facilitator engages interviewees in a retrospective conversation over an interface to tease out details of past experiences) were conducted with 25 representative students. Together, these data gave an accurate picture of the current state of assessment and feedback, including details of the parts of the workflow that worked well and of the areas that could be improved.
Prioritisation is an important part of any UX research programme, ensuring that the data gathered are directed towards making the most-needed improvements. For LOUISA, prioritisation needed to happen from the start, to ensure an initial focus on improving processes for the most widespread types of assessment: text-based, group and media assignments. Later in the UX research programme, it was important that those with expert knowledge of the nuances of assessment and feedback processes guided prioritisation, to avoid the risk of making changes to fine-tuned processes that could have unwanted or detrimental consequences.
To achieve this, a LOUISA UX workstream was set up, comprising a group of staff with experience of assessment and feedback. Regular meetings of this workstream were facilitated by the UX team to keep the group informed about UX activities and, importantly, to enlist their expertise in synthesising and analysing the data from the student semi-structured interviews. This led to the prioritisation of three student-centred aspects of the assessment and feedback process to focus on improving in Learn:
- Finding information they need to prepare for an assessment
- Knowing where to submit their assignments
- Locating their marks and feedback.
With areas for improvement prioritised, the next step of the UX design process was ideation.
Developing ideas to enhance user experiences
For LOUISA, the UX team worked with members of the Learn service team to develop a prototype Learn course and three proposed assessment and feedback workflows for staff to follow (one each for text-based, group and media assessments), which together addressed the identified problem areas. It was important to test each of these suggested solutions with students and staff to find out whether they improved their respective experiences. Usability testing was used to assess the student experience: students representing different Colleges were recruited to use the prototype course to complete typical assessment and feedback tasks, including finding information to prepare for an assessment, submitting an assignment and finding their marks and feedback.
Testing the proposed workflow solutions with staff required a different approach, since staff in various roles (for example, Course Organisers and Teaching Office staff) typically carry out assessment and feedback tasks at different stages of the workflow. Instead of usability testing, therefore, staff were asked to review the workflow as a whole and to provide input and feedback on the aspects relevant to them.
Results of the student usability testing suggested that the proposed solutions had made it easier for students to know where to submit their text-based assignments and to find their marks and feedback. However, there was still scope for improvement in helping them find the information they needed to complete their assessments – in particular, their assignment question and marking criteria – and in submitting media-based assessments.
Staff feedback from the workflow reviews indicated that the suggested processes were largely aligned with the way staff tended to work. Reviewers also described additional good practices they had developed to smooth aspects of the process, which they felt could be included in guidance made available to all staff, to help colleagues completing the same tasks in different courses, Schools and Colleges across the University.
Making recommendations
Taken together, the data from testing with students and reviews with staff led the LOUISA UX research programme to make recommendations centred on improving assessment and feedback experiences through the adoption of consistent conventions for specific aspects of the process. Rather than advocating the removal of complexity where it had been shown to be necessary because of course- and School-level nuances and set-ups, the recommendations focused on seemingly small practices, such as uniform placement of assessment information and standardised labelling of folders, which were deemed to have a significant impact on the student experience of completing assessment tasks and finding marks and feedback.
To smooth the staff experience of facilitating assessment and feedback, the provision of clear, intuitively written guidance was recommended. This would help staff adopt consistent conventions efficiently, without having to rely on memory when implementing multi-step processes while working within established policies and time constraints.
Previous UX initiatives aimed at improving student and staff experiences of the University’s virtual learning environment have been followed by a period of early adoption, in which different teams have the opportunity to try out the proposed solutions in the context of real-life courses, enabling further tweaking and improvement. Given the underlying subtleties of assessment and feedback processes revealed in user research, the Early Adopters Programme for LOUISA presents a particularly important opportunity for continuous, iterative improvement of assessment and feedback workflows, using data from authentic courses and assessment contexts.
If you would like to get involved with the LOUISA project, contact learnfoundations@ed.ac.uk.
You can keep up to date with news from LOUISA on the Project News page.
Emma Horrell
Emma is the User Experience (UX) Manager within ISG. Her interests lie in user-centred design of content and related disciplines, including user research, content design, service design, systems thinking and development, data analytics and AI development. She has co-chaired the UCISA UX Community of Practice since it began in September 2021 and sits on the Drupal leadership team as UX Manager (Drupal Core) and UX Research Lead (Drupal CMS).

