
Future student online experiences

Sharing the work of the Prospective Student Web Team

Reviewing our usability test facilitation skills

My team and I recently reviewed our usability testing facilitation skills through a playback of some of our test recordings. In this post, I recap why we ran the session, how we ran it and what we learned.

Why we ran a facilitation playback session

As we were coming to the end of our collaboration project with the Student Immigration Service, Neil suggested we make some time to review our facilitation skills as a team.

The immigration project was the first time my team of new content designers had conducted usability testing. During the project, we focused on watching back our test recordings to identify usability issues.

Following the project, we wanted to assess how we had facilitated the tests. This time, we would watch the recordings back and make notes on our facilitation rather than the usability issues.

This was a great chance to discuss as a team what each of us did well and where we needed to improve.

How we ran the playback session

I asked each of the content designers in my team to pick out one usability test recording from either the immigration project or a project we did for the University’s Finance team immediately after.

I asked them to pick the test they found most challenging to run.

I even dug out a recording of myself facilitating a test back in 2017. I thought it was only fair for my team to see what I was like when I started facilitating tests.

We followed this format at the playback session:

  • watch the test recording
  • make notes on the facilitation while we watched
  • share back both what the facilitator did well and where they could improve
  • invite the facilitator to share their own reflections after everyone else had shared their notes

What we learned

Our strengths and areas for improvement as facilitators

As intended, the session let us focus on specific examples of good practice and some areas for improvement. It also gave me some ideas for how I might better manage the testing process in future.

I was really pleased to see how calm and, for the most part, neutral each of the content designers was when facilitating. This is one of the most difficult things to achieve when testing, because it can feel unnatural and against our instincts to respond neutrally to someone.

We also saw lots of good follow-up questions, with the content designers asking students why they performed certain actions or felt certain ways about things they said.

For our areas of improvement, we noted instances where we were:

  • asking leading questions that encouraged a specific answer (for example, asking ‘Would you look at the visa centre first to begin the application process?’ instead of ‘Where would you look first to begin the application process?’)
  • asking closed questions that resulted in a yes/no answer (for example, asking ‘So that would be the first thing you do?’ instead of ‘What would you do first?’)
  • rushing participants along the expected path without pausing to ask them questions or let them work out what they needed to do themselves
  • unintentionally using phrases that made participants rely on personal experiences instead of the test scenario when answering (for example, ‘Do you have an idea of how you would do that?’)
  • sticking too rigidly to the script and not allowing the participant to continue along a path that would have resulted in task failure

What we would do in different situations

Each recording prompted productive discussions on how each of us would have approached different situations.

Example situation 1

Situation: If a participant says they would have Googled something instead of using the website you are testing, do we get them to show us what they would Google?

Discussion consensus: While this is an interesting insight to get during a test, it only makes sense to see this through if the testing is during a project’s discovery phase. The aim of our development and delivery phase testing is to see how our new content changes perform.

Example situation 2

Situation: What do we do if the participant keeps recounting their own experiences, rather than sticking to the test scenario and task?

Discussion consensus: Continue to remind the participant of the scenario and task. Acknowledge their own experience, but ask them to imagine themselves in the particular scenario completing the task.

Example situation 3

Situation: What do we do when a participant gives a visual design critique that has nothing to do with the content we developed?

Discussion consensus: Acknowledge the participant’s comments and move on. If a participant notes an accessibility issue with the design, that’s one thing. But we don’t want to dwell on someone’s comments about whether or not they like a colour scheme on a page.

Ways to improve the test scripts

While the session focused on facilitation skills, it became clear there was a major lesson for me in improving my test writing skills.

Be less prescriptive

Through watching the recordings back, I could see I was too prescriptive in the way I wrote the scripts.

As my team was new to testing, I intentionally wrote a very detailed script of the expected path the test would take, as well as the various prompt questions to ask at certain points.

I did this because I felt it would be easier for my team to have more detail to continually refer back to, rather than feeling pressured or worried if they could not think of what to say or do on the spot.

While I do feel this was helpful for a team of content designers entirely new to testing, watching the recordings back, I can now see there’s a balance to strike in how much detail I provide.

I saw how my very detailed script was a bit too overwhelming at points. There was a lot of pausing to work out where someone was in the script or which question to ask.

Similarly, writing out the same wording for certain prompt questions (like, ‘Is this what you expected to see?’) came across as repetitive and forced when spoken aloud.

State the research goals

I think the over-prescription issue can be solved by addressing another issue we noticed: not incorporating the research goals into the script.

I did this in our discovery user research, but I did not write out the goals in the scripts for our development and delivery phase testing. As a result, I can see how not externalising these left my team struggling to deal with more challenging test situations.

It’s easier to know what to do in these types of situations when you have the research goals in mind and you know what you’re looking to achieve. A script can’t account for every scenario that might happen. 

In future, before writing the script, it will be beneficial to work with my team and the subject matter experts in the service we are collaborating with, to come up with the research goals. This will help me write scripts that feel less like a prescriptive manual, and more like a guide that my team have contributed to creating. 

Moreover, being explicit about the research goals in the script can help the participant better understand what is expected of them.  

For instance, the situations in examples 1 and 2 described in the last section could have been avoided if I had been clearer in the script about what we were hoping to achieve with the research session.

In the script, I could have included things like:

  • “I’m going to ask you to find information about applying for a visa on a website a University department just created. We want to see how people find the experience of using the website. In your own life, you may have searched for visa information through other sources. However, for the purpose of this session, keep to using this website.”
  • (after reading the scenario) “I know you’ve been through the Student visa process before, but in this session, keep imagining yourself in the scenario where you are going through this process for the first time.”

With the goals outlined beforehand and integrated into the script, I think my team would have had a better steer of how to run the session and what to do in challenging or unexpected situations.

I fully intend to act on these lessons in my future scripts and can’t wait to see the results. I was already impressed with my team’s ability to dive into usability testing for the first time. I’m excited to see the ways they grow as facilitators going forward.

Learn more about the projects mentioned in this post

Immigration project blog posts

Finance project blog posts
