Usability tests have revealed how students and staff use Learn for hybrid teaching and learning
As part of the programme of user experience research for the latest phase of the Learn Foundations project, the Learn Foundations team worked with colleagues from Edinburgh College of Art (ECA) and the User Experience and Digital Consultancy Service to conduct usability testing.
The programme of user experience research for Phase 3 of the Learn Foundations Project sought to understand how Learn was supporting students and staff as they moved to online and hybrid modes of teaching and learning as a result of the Covid-19 pandemic.
Overview of Learn Foundations user experience research
The research programme included four different elements:
- Card sorting activity (with students). This activity provided data on how students expected to see information arranged and organised in Learn, and whether this had changed in the light of the shift to hybrid and fully online working.
- Top tasks survey (with students and staff). This survey was designed to identify users’ top priority tasks when accessing Learn. Comparing the findings with data from a previous survey, carried out before the pandemic, showed whether priorities had changed in the context of hybrid and online learning and teaching.
- Semi-structured interviews (with students and staff). These were conducted to gain detailed insights into how Learn has been used to support hybrid teaching and learning.
- Usability tests (with students and staff). These tests sought to gain insights into how students and staff were interacting with Learn, how this had changed in the hybrid learning/teaching context, and whether the navigation and consistency of approach developed as part of previous Learn Foundations work were still supporting student and staff needs. The qualitative data gathered in the usability tests complemented the data collected in interviews with staff and students, as well as the quantitative data from the top tasks survey and card sorting activities.
Read more about Phase Three of the Learn Foundations project
Format of usability tests
Usability tests are one of the main techniques in user experience research. In a test, a participant is asked to complete tasks using a site or service while a moderator observes. As they complete the tasks, participants are asked to ‘think aloud’, describing the thinking behind their actions. Usability tests are traditionally carried out with small numbers of participants, as each additional test typically reveals the same issues with the site or service (‘Why You Only Need to Test with 5 Users’ – Nielsen Norman Group, 2000).
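As a rough sketch of the reasoning in the article cited above, which assumes the commonly quoted figure that a single participant uncovers around 31% of the usability problems in an interface, the proportion of problems found by n participants is approximately:

proportion found ≈ 1 − (1 − 0.31)^n

With n = 5 participants this comes to roughly 1 − 0.69^5 ≈ 85%, which is why testing with more than a handful of participants tends to surface mostly the same issues again.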
Testing sessions were conducted in early 2021 with students, academic staff and professional services staff from ECA. Three types of usability test were carried out: one with students (five participants), one with academic staff (six participants) and one with administrative staff (two participants). The different types of test were designed to explore different aspects of the hybrid teaching and learning environment, to build a full picture of how students and staff were interacting with Learn, and to show how Learn was supporting them. Some of the tasks for each type of test were drawn from the top tasks survey results, so that they reflected real-life scenarios, while other tasks were built around online tools that would typically be used as part of hybrid learning and teaching.
Participants were presented with a sample Learn course (based on one of two selected real courses) and given a scenario and associated tasks to complete.
Observations from the usability tests
Tests with students
Students were given four tasks in the test, each requiring them to find specific pieces of information:
- Find dates and times of upcoming online sessions
- Find recordings of lectures
- Find guidance on using online tools (e.g. Padlet, Miro, Collaborate)
- Find out how to submit a piece of work
Key findings from students’ usability tests:
- Students often took several attempts to find pieces of information. They seemed to feel there were many places in Learn they had to check in case they were missing something. This was evident in how often students scrolled through course announcements, which they had come to expect to be a back-up source of important information. This reliance on course announcements was mirrored in the staff usability tests, where staff tended to use the announcements tool frequently as a way to ensure students were kept informed during online and hybrid teaching.
I would probably look at the announcements and scroll back a bit.
– Student
- Students found task 3 (finding guidance on using online tools for a group activity) particularly difficult and took the most attempts to complete it. Contributing factors included unfamiliarity with the tools themselves (especially Padlet and Miro), uncertainty about where to find details of the group they were part of, and an expectation that guidance on the tools would sit alongside the specific activity they had been given. They didn’t look for this information in the ‘Help and Support’ section of the template, despite the card sort indicating that this was where students expected to find it. Furthermore, although the card sort indicated that students didn’t expect items relating to virtual classroom activities to sit in a separate group or category, task 3 made clear that, depending on how a virtual activity was set up, students might have to navigate to several different locations to pick up the pieces of information they needed. This hinted that adding an optional ‘Virtual Classrooms’ template item to house this information might be useful for some Schools.
Tests with academic staff
Academic staff members were also asked to complete four tasks, which complemented those in the students’ usability tests:
- Post dates and times of upcoming online sessions
- Post lecture recordings
- Post guidance (and links) for using online tools
- Post details of an assignment
Key findings from usability tests with academic staff:
- Some staff who took part in the tests were unfamiliar with Collaborate, having used Teams for their teaching. Others spoke of using Collaborate in different ways depending on the activity, for example using the ‘Groups’ tool, scheduling sessions, or using standalone MyEd Collaborate (instead of Collaborate within Learn). There was also variation in how staff communicated about Collaborate sessions: some said they would send links by email, others that they would place links in Course Materials.
- Announcements: Given the uncertainty caused by the pandemic, staff were relying on announcements as a ‘belt and braces’ way of making sure students always knew what was happening on their courses. Staff tended to use announcements and/or email even when adding material or information in the standard locations in a Learn course.
I like to post an announcement … especially now with everyone online and I just like to remind the students what we’re doing.
– Staff member
…you know you’re duplicating things, but you just want to make sure that no matter which way the student goes into it, they will find it.
– Staff member
- Staff also turned to announcements when they needed to group several pieces of information together, for example in task 1, where they needed to communicate the full details of an online session. This suggested there may be some benefit in gathering such information under a ‘Virtual Classrooms’ item in the left-hand menu, using one of the two additional items each School has the flexibility to add as part of the Learn Foundations template.
- Staff were aware of the need to label and arrange their lecture recordings thoughtfully so that they were presented to students in an ordered way. One staff member spoke of using a separate ‘lecture series’ folder structure within the weekly folder structure to arrange recordings. Another recognised that delivering lectures in small, chunked recordings meant there was more content for students to navigate: if a student was taking three courses, each with three lectures a week, and each lecture was divided into three chunks, that would be 27 individual recordings a week. Clear, meaningful labelling that indicated the topic of each recording was therefore important.
Tests with administrative staff
Administrative staff members were given three tasks to complete in the usability tests:
- Post details of a new tutor on the course
- Edit the formative assessment information for a course
- Create a submission box for a summative assessment
All three of these tasks had a defined solution and needed to be completed in a certain way. Professional Services staff were able to navigate the template easily and complete the tasks with minimal effort and no problems. This indicated that the template was well established within ECA, and staff commented that it was clear and easy to use. In particular, there was a high level of consistency within ECA in how assessments were organised and structured in Learn.
I actually really like the template because I find it so much easier when you’re setting up a Submission Box.
– Staff member
Some participants mentioned the use of announcements as reminders about submissions, which supported what had been observed in the usability tests completed by academic staff.
Conclusions from the usability tests
The usability tests were an important part of the latest phase of Learn Foundations research. Observing students and staff using Learn to complete tasks, and hearing their thoughts and reasoning as they did so, provided a clear illustration of what it was like to teach and learn in the hybrid environment.
Data from the tests also brought meaning to the numbers collected in quantitative research activities like the top tasks survey and the card sort. In some cases what was observed in the tests reinforced the quantitative data – for example, where the top tasks survey showed lecture recordings were a priority for both staff and students, the usability tests affirmed this when students and staff demonstrated familiarity with posting and finding recorded lectures. In other cases, data from the two types of study were contradictory – for example, where the card sort data showed students expected to find guidance on learning technology tools in a ‘Help and Support’ category, in usability tests students were observed looking elsewhere for this information.
The fact that students and staff were able to use Learn to complete all the tasks, even though some required several attempts, was a strong indicator that key elements of the Learn Foundations approach developed before the pandemic, such as the templates, were still valid in the hybrid teaching and learning context. No major changes to the templates were deemed necessary, but what staff and students said and did while completing the tasks provided useful insights to guide future Learn Foundations work, and the observation that staff took fewer attempts to complete their tasks than students suggested that improvements could be made in aligning the ways staff and students used Learn.
The key findings from each type of usability test, taken together with the findings from the semi-structured interviews with staff and students and the quantitative research (the top tasks survey and card sort), will help guide the future development of the different aspects of the Learn Foundations approach, which will continue to evolve as the nuances of hybrid teaching and learning take shape, both organically and as part of the ever-adapting pandemic response.