Lessons learned from guerrilla testing
I have been conducting rounds of usability testing to guide the redesign of EASE, the University’s online sign-in system. These tests needed to be completed quickly with no time or budget for recruitment. The following is an outline of what I did, and what I learned.
When gathering user feedback, we would love to have the time and resources to carefully recruit representative users to spend time interacting with our designs, but this is often not possible.
Guerrilla testing involves taking our designs out into the wild to ask users to give us feedback, as opposed to recruiting them and arranging to meet. Essentially, it involves going to public places where we might find representative users and asking them on the spot to perform tasks using a design we’ve created.
We undertake guerrilla testing regularly in the User Experience Service as we don't always have the time or budget to recruit participants and pay them incentives. Because we believe users should be involved throughout the development of a design by whatever means possible, guerrilla testing is an effective way to gather invaluable insight.
Although the method is meant to be time-saving and simple, there are challenges involved. Drawing on a recent example of my own guerrilla testing, I want to share the lessons I learned from the challenges I faced.
A recent example
This post is based on my experiences conducting guerrilla testing on the project to redesign EASE.
EASE is the University’s single sign-in service – used to provide access to various restricted resources (such as MyEd and Learn) for students, staff and visitors. It’s being redesigned to become more secure and aligned with the University brand guidelines and web standards (EdGEL).
I conducted 3 rounds of usability testing at different stages of the design to help ensure the process was focused on the experience of its users. These tests were fast-paced, and with no budget or time to recruit participants I had to use the guerrilla method to find and observe users.
For each round of testing, I went out into the wild to observe and record a mixture of potential applicants, current students and staff as they interacted with different design iterations of EASE.
Based on my own experience, I have created a list of the top things to consider when conducting guerrilla testing.
My advice focuses on challenges specific to guerrilla testing. If you are new to usability testing in general, I recommend reading some of the User Experience Service's information on how we conduct usability tests effectively.
My lessons learned
- Look the part. To increase the chances of people giving up their time, I found it’s important to build trust quickly and it helps if people can easily recognise that you are part of an official organisation – without you having to explain it to them. As well as having my badge showing, I found it helped to have an “official” pop-up station set up. I used a table in the library foyer and created a sign that said “Do you have 5 minutes to help us design University systems?” with the University’s logo attached. Passers-by seemed more interested and some curious people even approached of their own accord and asked to participate.
- Think carefully about location. Guerrilla testing is supposed to be quick so you don’t want to be hanging around waiting in areas where there are not many appropriate candidates to approach. I came across this problem in my first round of testing. I had decided to seek representative users (students) in Edinburgh College of Art’s café and I didn’t anticipate how quiet it would be at the time of year. It was deflating to return to the office without capturing what I set out to get. For the next round of testing I contacted colleagues working at the Library to ask if I could conduct my tests there. They allowed me to set up in the foyer – this proved much more effective.
- Offer a small incentive. One of the reasons to guerrilla test is due to budget limitations, but it should always be possible to find something to offer participants, however small. Even something as simple as having a box of chocolates on hand can make a big difference. In the second round of testing, I asked around the office and found a spare £10 gift voucher. I then told passers-by they would have a chance of winning it, which helped to generate interest.
- Know your kit. When the User Experience Service conducts usability testing, we aim to record the session (usually screen and audio) so we can run collaborative playback workshops for the project team. When testing EASE, I used a free mobile app called "Capture" created by TechSmith. The app seemed straightforward at first, but I soon realised I didn't know its functions quite well enough, and there's little worse than thanking a participant for taking part in a test only to realise it wasn't recorded. I learned the hard way. When doing rapid testing you need to know your tools inside out – there's less time for fiddling about.
- Plan what to say. Guerrilla testing is supposed to be fast and cheap, but to optimise your chances of success it's important to plan and rehearse how you are going to approach people. In my experience, when I approach somebody there's a very short window to interest them before they politely say "no sorry" and move on. When I started my tests on EASE, I didn't put much thought into this and I found it difficult to get people's attention. By the time I had blurted out something like, "Excuse me, I'm working as a user researcher for the User Experience Service at the University of Edinburgh and I'm conducting some usability testing to determine how representative users interact with…", the person was long gone. Over time my approach became more precise: a friendly greeting, a short summary of what I wanted, and what's in it for the participant. So I would say something like, "Hello, I work for the University. Do you have 5 minutes spare to interact with a web application we're designing? You could win a prize." This approach was better at getting people to initially engage with me. There's much more room to explain the details once you have their attention.
- Remember your motivation. Guerrilla testing can be deflating as some people are inevitably going to turn you down. I found this to be the biggest challenge when I started out. Asking people to go out of their way to do something for free can feel a bit cheeky and not everybody will be willing. This can make it hard to approach people. In my experience, to get over this you need to stay motivated and think about the bigger picture. Why are you conducting the tests? You're there to help the users. Often in usability testing I try to make participants feel at ease by saying something like: "Remember it's not you we're testing, it's the design." When conducting guerrilla testing I have learned to remind myself of something similar: it's not personal.
- Try not to push people. Perhaps the word 'guerrilla' is misleading, as it originates from a style of warfare in which smaller forces use irregular tactics to surprise and harass larger forces. However, we don't want to annoy people (I hope), so it's important to be respectful of people's time. Taking obvious steps such as not approaching people who look like they're working or on the phone should go without saying, but I also learned not to push people who say no. I found that participants who had to be convinced to do a test, after originally expressing that they didn't want to, were the least interested or engaged in completing the tasks effectively. On the other hand, the participants who genuinely want to help tend to get behind the scenario being presented to them.
- Have fun. Guerrilla testing can be a daunting prospect, but it shouldn’t be. The final lesson I learned was to enjoy the process. User Experience Design is about people more than anything, and it’s refreshing to be out of the office learning from the real people who encounter the products we design.
During my work on the EASE project, guerrilla testing proved to be invaluable as I needed to conduct fast-paced iterative testing, with no budget, throughout the life of the project.
Over the course of 3 rounds of testing, more than 30 people gave up their time to participate and I managed to record over 3 hours of footage. The footage was used in collaborative playback sessions in which the project team prioritised the usability issues they observed. This gave us the confidence to quickly build something that we believed would satisfy our users.
However, just because guerrilla testing is quick doesn’t mean that it’s plain sailing. There’s a lot to think about and it takes careful consideration and practice. Good planning leads to good research, and I hope the lessons I’ve learned help you in your future guerrilla endeavours.
More from this series
This is part of a series of blog posts showcasing some of the tools and techniques used by the UX Service. The aim is to inform the wider user-focused community that adopting these tools and techniques is simple and extremely beneficial.
Get in touch
If you’d like to find out more or bring the UX Service on board to help you better understand your users, get in touch.