Regular student engagement informs iterative improvement of the beta
Testing what we develop with prospective students is fundamental to delivering a new website that is easy to use and meets their needs. I’ve been responsible for ensuring the team gets regular sight of students interacting with our work-in-progress throughout the development leading up to our beta release.
Fitting testing into the cycle of agile development sprints
Our agile development sprints are two weeks long. This is a challenging timeframe in which to plan, deliver and play back a round of usability testing. We decided that we would conduct a round of usability testing every other sprint.
In one sprint, I would work with the team to agree the session plan and goals of the usability testing. In the following sprint, I would run a round of usability testing and collect insight to playback to the team for a collaborative review. We would then agree the actions we wanted to take arising from what we had learned and decide what our next focus of testing would be.
What we tested in 2023
My work with students was focused on ensuring we validated our assumptions about what they would find useful and usable. This approach minimised the risk that we would build something that might need to be reworked.
Rounds of discussion, design and testing helped clarify our requirements. It gave our content designers confidence in what they were specifying and our developers confidence that they were building the right thing.
The areas of testing that I focused on during the sprints we ran from November 2022 to May 2023 were:
- in-page navigation features for mobile and desktop
- interaction design patterns for entry requirements
- changes in content format to improve the findability of information on the page
These areas all involved multiple rounds of design evolution and testing, and were often done in parallel, so I could recruit students to test multiple things at once.
In-page navigation features for mobile and desktop
We needed to design an in-page navigation component for both desktop and mobile that would help users understand what information was on the page and make it easy to access.
We began by testing different components from other universities that represented a range of different approaches. From this we learned which navigational elements worked well and which ones caused students problems.
Due to limitations of the University Design System and the (then) maturity of plans for website navigation in EdWeb2 (the corporate content management system we’re building on), we were unable to build the in-page navigation components that tested best on desktop and mobile. Instead, we designed and tested alternatives that balanced helping users find the information they wanted against the technical and design constraints we had to work within.
This process saw us conduct six rounds of usability testing with students and staff that helped us evolve from other universities’ designs to low-fidelity sketches of our own ideas and ultimately to the design we implemented in our beta programme pages.
Interaction design patterns for entry requirements
From prior research we conducted through our design sprints, we learned that presenting all entry requirements information to all students on a programme page made it difficult for them to comprehend which requirements they needed to meet.
When presented with a range of grades, UK students typically thought they could apply with the lowest ones (which is not always the case). They were also confused with the language we were using around standard and minimum entry requirements.
From international students, we learned that they often found it difficult to find the right information on entry requirements. We watched them complete confusing circular journeys as they navigated from the degree programme page to our external sites, only to be brought back to where they started without the information they were looking for.
We set out to design a new entry requirements interaction pattern that asked students to select the country they studied in and the qualification they studied so that we could present them with the information that was specific to what they had selected.
We went through three evolutions of design for this new pattern, each being informed by what we learned from testing the previous one. This helped us improve the layout of the interface and the language we used.
The most significant improvements we made were around the language we used for standard and minimum entry requirements. In early testing, we saw students select the minimum option because they thought they could apply with those requirements. By replacing this term with widening access, we saw every student select the correct entry requirements based on their circumstances.
We also improved the experience for international students by providing them with information specific to the country they selected, which removed the confusing circular journeys we previously saw.
Changes to content format to improve the findability of information on the page
Because we were unable to implement the in-page navigation component that tested the best, we had to present all programme information on a single page. We saw that students were having difficulty understanding where they were on the (very long) page, which made it challenging for them to find the information they wanted quickly.
The way that current University web pages are formatted contributes to this problem, as they don’t adhere to design best practice.
We designed a programme page that did adhere to best practice by following the standards set out by the Government Digital Service (GDS). The GDS approach to formatting content uses larger, heavier headings and more spacing around text, which makes it easier for users to scan through large amounts of information to find what they are looking for.
We tested two prototype pages, one with the current University format and one with subheadings following the GDS format.
We asked each student to complete some tasks on each page and alternated which page they saw first and the tasks they were asked to complete. We learned that all students preferred the GDS-style presentation. They noted how much easier it was to scan the page and identify its different sections.
Beyond the beta
We retired the beta in January 2024, but we have continued to use the pages for usability testing even though they are no longer publicly available. This research has focused on questions arising from our Performance Analyst Carla’s review of the data we collected while they were live.
This work is informing design enhancements that will be developed in upcoming sprints through to the summer:
- improvements to our in-page navigation component on mobile, which will help students more easily complete the most common information-finding tasks on a programme page
- changes to how we style a programme page to make it easier for students to identify which section of the page they are in
Learn more about the future provision
Our summary of findings arising from the release of the beta – Carla Soto’s presentation and summary