Pop-up usability testing: a beginner’s guide
Usability testing is a core activity in our team, and all content designers are expected to help with it. I only started a year ago, but I’ve already been involved in four rounds of pop-up testing, and each time I’ve learned a bit more and grown more confident with the technique. I want to share a recent experience, along with my seven top tips to help you get quick feedback on your website.
Context for the research
We recently did some in-person usability testing around campus as part of the development of a temporary Degree Finder front-end, which will involve changes to information architecture and navigation.
The critical change would be the removal of the side navigation bar.
We wanted to see if these changes would have a detrimental impact on students navigating through the site.
My colleague Nicole Tweedie and I had responsibility for recruiting students and executing the tests, then running a playback session for the team.
Planning
We had an introductory meeting with Neil Allison (Head of Student Web Content) and Aaron McHale (PSWC Software Analyst/Developer) to give the background to the project and set the scope for our role within it.
Neil provided the script a few days before the testing. This gave us enough time to edit it and familiarise ourselves with the tasks.
Tip 1: Practise the test script so you’re confident with what you have to say. Run a pilot session with a colleague who knows nothing about your work so you can iron out any problems before you test with students.
We used a Miro board to produce the images of the website changes that we would be using in the testing. Nicole and I also kept up a regular dialogue to decide on the practicalities of testing, for example choosing the location and time frame.
Tip 2: You don’t need a website to be able to run tests, although it’s easier to use a live website when you start out.
This was my first time testing with screengrabs of web pages, which introduced some new considerations. We would have to explain to participants that we were going to show them pictures of web pages, but that they should still point and tap just as they would on a real web page.
Executing
On testing day, we had an hour or so before we went out to make sure everything was prepped. This included making sure all the equipment was fully charged and working, and printing off consent forms.
Tip 3: It’s important to get consent from participants before you record them. This can be signed off on a pre-printed sheet, or you can record them agreeing to the statement you’ve read. Make sure you say why you’re recording, who will see the recording and how long you’ll retain it.
From our planning, we decided that the best place to find students would be the Main Library in George Square. However, with exams taking place, it was clear this wasn’t the best location to approach students. So we adapted and walked around campus to find willing participants.
Tip 4: Use an iPad with a screen recording app to perform the testing. We use “Techsmith Capture” and it’s free! You just hit ‘record’ and the app captures what happens on the screen and the conversation between the participant and the facilitator. After you’ve finished, the recording is saved to your device.
We structured the test as follows:
- Introduced ourselves and asked students if they would be interested in sparing five minutes of their time to “help improve the university website”. We phrased the question like this to avoid mentioning the word “test”
- Told our participants that our session would be recorded and got their consent
- Explained how the test would play out, including the background on what we would show them. We asked them to talk us through their decision-making while looking at the images
- Started recording, then showed the participants an image of the Undergraduate Study site and asked some introductory questions. We asked the students:
- to identify where on the University site they thought they were
- how they would search for a programme on that page
- Took students to the second page, a search results page, assuming they had chosen the search box on the previous page to find the programme
- When on the programme page, we asked the students:
- basic information, like “How long does it take to study this programme?”
- crucial navigation questions, like “What would you tap next to find accommodation information?”
- prompts back to the original page, like “If you wanted to get back to the first page you encountered so you can do this, what would you tap?”
Tip 5: Offer a small incentive – we offered chocolates to participants during this round of testing. It was a nice ice-breaker to recruit participants.
Reviewing
When we got back to the office we uploaded all our recordings to our team SharePoint.
The first part of the review process was listening to the recordings and taking notes. This allowed us to assess our findings properly and, in some cases, to revise the conclusions we had drawn while testing.
We then discussed our findings with a small group before presenting at the playback session. This opportunity to talk things through with other members of the team gave us clarity on what to present to the wider group.
Finally, we presented to the whole team as a playback session. We picked three example tests to present to the team, briefly outlined our findings and then fielded any questions from the team.
My experience
As someone relatively inexperienced at conducting usability testing, I felt I learnt a lot from the experience and realised I’ve got plenty of areas to improve on:
- I should avoid asking leading questions, as this can influence how a tester behaves
- I need to be more reactive when testing and listen carefully to what the participant is saying
- I also need to improve how I prepare before testing. For example, thinking of some follow-up questions could help me gather more insight from the participant
Tip 6: If you’re new to this, test as a pair. Having a colleague alongside you makes the experience less daunting. They can help you with recruiting, getting the device ready and asking any follow-up questions.
Overall, though, I think we stuck to the task well. Although we were rejected more times than we managed to recruit participants, we persevered and completed the number of tests we had set out to do.
Tip 7: Have fun! This exercise can be great for team building, and as someone who doesn’t interact with students very often in my role, I found it energising to have a change of scene.
Read Nicole’s blog post about what we learned from the testing