Usability testing highlighted ways to improve students’ assessment and feedback experience in Learn

In May 2025, the User Experience (UX) Service continued a programme of user research for the LOUISA project by testing how easily students could use a prototype course in Learn to complete key tasks relating to assessment and feedback. This research helped prioritise areas for improvement. Previous student research and input from staff helped inform the structure of the prototype course.  

This blog post was written by Mel Batcharj and Katie Spearman from the UX Service.

LOUISA project background 

LOUISA (Learn Optimised for In-Course Submission and Assessment) is a project aimed at improving the experience of assessment and feedback in Learn (the University’s virtual learning platform). By adopting a set of foundational principles, the project focuses on removing unnecessary complexities and inconsistencies, as well as prioritising user-friendly processes and efficient systems. 

Find out more about the LOUISA project: 

UX research has been embedded in the LOUISA project from the start 

Building on the successful application of UX research methods in previous Learn projects, such as Learn Foundations, a programme of UX activities was defined for LOUISA. The key aim of the research was to uncover current problems and to use the acquired data to come up with potential solutions. Feedback from testing these solutions with students and staff could then be used to drive improvements of assessment and feedback experiences.  

Read more about the UX approach in a previous blog post by Catherine Munn: 

A UX strategy to improve the course assessment experience for staff and students 

Findings from student interviews provided direction 

UX research (in the form of student interviews) identified key pain points experienced by students in Learn’s assessment and feedback processes. A separate blog post explains more detail about the interview process and the associated findings: 

Conducting a second round of student research to understand students’ experiences of in-course assessment in Learn

Three stages of the assessment and feedback process were prioritised 

Student interview data was synthesised with support from School and College staff. This work identified and prioritised three stages of the assessment and feedback process that students found problematic and that should be improved:

  1. Finding the information they need to prepare for an assessment  
  2. Knowing where to submit assignments in Learn  
  3. Locating their marks and feedback in Learn 

Further information about the prioritisation process is detailed in this blog post: 

Working with University colleagues to prioritise student research findings for the LOUISA project 

Focusing on these areas, ideation took place between the UX and Learn teams to come up with ways to address the identified problems. A proposed prototype course was then built by the Learn team to be usability tested with students (facilitated by the UX team). 

Usability testing with students assessed proposed solutions 

Usability tests are a UX technique where participants are asked to imagine themselves in a scenario and to use an interface (in this case Learn) to perform particular tasks. Participants are observed by a facilitator as they interact with the technology to complete the tasks. This helps identify whether the technology adequately supports users or whether the user experience can be improved. Usability tests are typically carried out with around five participants, which is sufficient to identify areas that are working well and those to be improved.  

For LOUISA, the overarching goal of the usability tests was to identify any usability issues students encountered when preparing, submitting, and finding feedback in Learn as part of workflows for text-based (including essays), media and group assessments. More specifically, the testing aimed to:  

  • reveal more detail about students’ experience of in-course assessment and feedback in the context of them interacting with a prototype Learn course
  • identify if the proposed solutions improved the way students were able to interact with Learn when proceeding through typical assessment workflows. 

A prototype of a Learn course was built to test with 

The proposed solutions were encapsulated in a prototype course built by the Learn team. A generic topic (‘The Progression of Pop Culture’) was chosen to ensure it was understandable to all test participants. Students were asked to interact with the prototype course and were given a range of task-based scenarios relating to submitting, and finding feedback for, text-based, media and group assignments. Observing how students completed these tasks enabled the team to examine how effectively students interacted with the prototype course. Key features of the prototype course were informed by the most up-to-date good practice guidance and included: 

  • assessment folders arranged with each assignment in a separate folder containing assessment criteria and submission details 
  • the assignment question placed with the assessment criteria  
  • rubric information placed in the marking rubric section of Learn.

A two drop-box arrangement was used to differentiate between on-time and late submissions for the purposes of the usability tests. 

Usability testing of this prototype course with students allowed the team to simulate Learn assessment and feedback tasks, pinpoint usability issues and their frequency, and gather user feedback. 

Screenshot of the Learn prototype course showing an expanded assessment folder containing the Assessment and Feedback Information document and five assignment folders.

Student usability testing results 

Nine student participants (undergraduate and postgraduate, across the three Colleges, both on campus and online) were recruited for the tests with the help of colleagues within the LOUISA project. The participants were asked to imagine they were working through the prototype course and to complete the following eight tasks relating to assessment and feedback for the course.

  1. Find the date and time that an assignment (for a given course) was due  
  2. Find the assignment task or question for the same course  
  3. Find the marking criteria for the assessment  
  4. Submit a text-based assignment before the due date using the correct naming convention and check submission was successful 
  5. Submit a media-based assignment before the due date 
  6. Find marks and feedback for a previously submitted individual assignment 
  7. Submit an assignment past the assignment deadline with an extension confirmed 
  8. Find marks and feedback for a group assignment  

Students were able to complete five out of eight tasks without issue 

The student participants were able to complete the following tasks without any problems: 

  • Finding the due date and time for the assignment 
  • Submitting a mock text-based assignment using a Word document 
  • Submitting an assignment with an extension 
  • Finding marks and feedback for an individual assignment 
  • Finding marks and feedback for a group assignment 

Although students successfully submitted a mock text-based assignment during testing, many did not complete an additional part of the task, which required them to name the file correctly. Many overlooked the on-screen instructions, resulting in incorrectly named submissions.

Students found three tasks more difficult to complete   

Three tasks posed more difficulty for the student participants. 

Finding the assignment question 

The prototype course included the assignment question within the ‘Assessment Criteria’, following the most up-to-date guidance.

LOUISA good practice guidance (requires login) 

Six out of nine students completed this task, but three experienced difficulties:

  • Two navigated to the right area but couldn’t locate the question. 
  • One wasn’t familiar with this setup of folders since they were used to submitting assignments through PebblePad. 

These findings affirmed that the placement and location of critical information needs to be carefully considered. The fact that most students found the information indicated that the chosen placement was sensible and meaningful; however, the fact that not all participants found it affirmed that individual student expectations vary based on previous experience. 

Taken together, the results suggested that there was an opportunity to provide more nuanced support (if required and requested by Schools for specific courses), which could be more effectively addressed in authentic real-life contexts – for example, through engagement as part of the Early Adopters Programme.  

Finding the marking criteria for the assessment 

To complete this task successfully students needed to find the marking rubric section of Learn. Four out of the nine students were able to complete the task. Five were unable to find the marking criteria for the assessment, with four navigating to the University’s Common Marking Scheme, thinking this was the specific assignment marking criteria. 

This indicated that not all students were familiar with rubrics, possibly due to inconsistent use of rubrics within course assessments.

Submitting a media-based assignment 

To complete this task successfully, students needed to read the guidance (linked from the assignment instructions) on submitting a video for a Learn assignment and then talk through the process they would follow.  

The guidance provided step-by-step instructions on how to (i) upload a video to the student’s Media Hopper Create account and (ii) retrieve the video within the Learn submission area via the Content Market attachment option. 

Three out of nine students successfully completed the task – they noticed the link to the instructions and said they would follow them to submit a video assignment.  

Six out of the nine did not notice the instructions and therefore struggled to describe the process they would follow to submit a video assignment. Investigating the Learn interface, all six found the correct drop-box to submit to; however, they did not recognise Content Market as the correct option to select to successfully submit their video assignment. Prompting from the test facilitators, who asked how they might use Media Hopper Create in the submission process, did not help these students as they were unfamiliar with the system. 

These findings highlighted the importance of clear instructional guidance, placed prominently to support students at the submission point of the assignment process. The fact that the three students who read the instructions said that they could follow them suggested the guidance was clear; however, the observation that six students missed the link to the instructions suggested that this information could be placed more prominently, to encourage students to read and follow the instructions at the point of media-based submission.  

The testing results also indicated a need to replicate the end-to-end submission process in a real-life course context, to provide a greater level of confidence that students could follow the steps to submit a video assignment. 

Summary of findings and opportunities for improvement 

This latest round of student research builds on prior work that has shaped the current version of Learn. Insights from these usability tests have helped tease out aspects of the Learn assessment submission and feedback process that students find straightforward as well as identify opportunities to further enhance and refine the student experience. The findings can be summarised as follows.

Students found it easy to submit text-based assignments and retrieve marks and feedback  

Students seemed familiar with the process of submitting, and finding feedback for, text-based and group assignments. The findings from these tests suggested that Learn supports students in completing these tasks easily. 

Success of submitting media-based assignments depended on finding and reading instructions  

With the right guidance, students were able to submit media-based assignments, even if this was unfamiliar to them (based on their previous course experience). This highlighted the need for clear, concise, well-formatted, and well-placed instructions to support successful media submissions. 

Opportunities to update guidance on labelling and signposting assessment information in Learn  

The ease with which students could perform several tasks within the workflows depended on the way key pieces of assessment information were labelled and located in the Learn interface. This finding highlighted a need for: 

  • clearer, more concise support and guidance for staff (to cover areas like formatting and placement of assessment-related information within Learn) 
  • a consistent approach to use of layouts, language and terminology for assessment information within Learn (including for assessment-associated documents) 
  • better communication of assessment guidance and information to students, to ensure they were as prepared as possible when it came to beginning assessment and feedback workflows for their courses.  

Following the tests, the guidance was updated to provide:  

  • more detail on the placement of the assignment question (recommending it be placed within the assessment criteria and also in the submission drop-box)
  • clearer information on the placement of rubrics information.  

The test results emphasised the value of a consistent approach 

One student, enrolled on in-person and online courses across different Schools, remarked on inconsistent procedures for submitting assignments, which had initially affected their Learn experience before they had been able to familiarise themselves fully with the system:   

I could see why it would be overwhelming to start with […] I had different information for all three [courses]. 

— Student

This was echoed by staff (consulted with as part of additional project research), who noted that increased consistency would benefit students by setting clear expectations across courses and Schools. 

[…] if there’s a bit more consistency between how we do things […] and how they can expect to see them and experience them at undergraduate level, that can only be of benefit. 

— Staff member

Next steps – Early Adopters Programme 

The findings of the usability tests have been shared with the rest of the LOUISA project team for further prioritisation, aligned with the wider goals of the project. Looking ahead, the project team will seek opportunities to experiment with improvements in real-life, authentic course contexts by working closely with colleagues across the Schools and courses participating in the planned Early Adopters Programme.   

Learn more about the Early Adopters Programme on the LOUISA SharePoint (requires login) 
