How to avoid common mistakes in user engagement

Have you ever participated in a user engagement session that was meant to gather your views, but felt that you weren’t properly included, or that your views wouldn’t be acted on?

Most of us want to engage with our users and stakeholders. We all want to make sure our users have a voice in projects that will affect them. But the approach you take can have a major effect on the success or failure of your engagement.

It is tempting to reach for common techniques like focus groups and surveys: most people think they understand them, and find them easy to set up.

But if you use these approaches badly, you risk getting misleading information, wasting people’s time, or being seen to be just paying lip service to including users.

Erika Hall has written excellently about the dangers of badly planned surveys:

What makes a survey bad? If the data you get back isn’t actually useful input to the decision you need to make or if it doesn’t reflect reality, that is a bad survey. This could happen if respondents didn’t give true answers, or if the questions are impossible to answer truthfully, or if the questions don’t map to the information you need, or if you ask leading or confusing questions.

Often asking a question directly is the worst way to get a true and useful answer to that question. Because humans.

Survey fatigue is real.

For example, according to the 2020 National Student Survey results, 85% of University of Edinburgh students felt that they had the right opportunity to provide feedback on their course. But only 45% said it was clear how their feedback had been acted on.

How do we avoid this trap?

Of course, this doesn’t mean you shouldn’t do user research. But you do need to take a step back. Rather than creating another knee-jerk survey or focus group, consider the following:

  • Who are your users?
  • Why do you want to engage with them?
  • What do you need to find out about them?
  • What’s the best way of finding those things out?

Surveys or focus groups are often not the best way of finding out about our users. This is why we in the User Experience Service conduct user research the way we do.

Basic truths about human behaviour

There are some basic truths about human behaviour that we know from psychology and other social sciences. But in many projects, these basic truths tend to be ignored.

You are not your user

Many people make unfounded decisions on behalf of users they don’t adequately understand. When you do that, you tend to design for your ideal user: yourself.

But your users don’t think like you do. For example, teams often overestimate their users’ technical skills. They overestimate people’s inclination to learn how to use a new system. They overestimate users’ interest in their team or service itself, when most people just want to get something done.

This is how we end up with dozens upon dozens of differently-branded systems, with different names, different interfaces, and a totally disjointed experience. This is how we end up with web content being trapped in departmental silos, rather than being organised in a way that users can understand.

In short, this is how we end up bouncing our users from system to system, from department to department, making it difficult for them to complete basic tasks.

Don’t design based on what makes sense to you.

Do make your design accessible, inclusive and understandable to non-experts.

The distribution of users’ computer skills: Worse than you think — Jakob Nielsen

We are all prone to bias

There are over 100 cognitive biases. We are all susceptible to them. When we’re not careful, we bring our biases into requirements gathering, and users respond accordingly.

In research, it can be difficult to avoid leading questions. If you are too close to your designs, you can end up defending your work rather than using the opportunity to gather valuable information about your users. And we don’t notice when we are being biased; if we did, we could avoid the bias in the first place.

Even unintentional cues, such as saying “yes” or “good”, can suggest you are happy with a participant’s responses. This can unwittingly lead your participants further, skewing your findings towards what suits your own point of view.

Bias is impossible to avoid completely. The best thing you can do is be more aware of your biases, and plan your research carefully.

It can be particularly hard to control a discussion in a focus group or an interview. But you can use a script or conversation guide with carefully worded questions to keep yourself on track.

Don’t ask leading questions that only give you the answers that suit you.

Do be more aware of your own biases and work hard to mitigate their effects by planning your research carefully.

Design for cognitive bias — webinar by David Dylan Thomas

People don’t understand their own behaviour

Typically, we don’t have good insights into the reasons for our own behaviour. When we speak to users, they make up stories that reflect well on themselves. But we often don’t validate those stories against what actually happens.

People like to think that they make decisions on a rational basis, and that they can explain exactly why they behave the way they do. The reality is that people are a lot messier than that.

If you ask users if they would like to see a certain feature in your service, they almost always say yes. This is because you’ve put the idea into their head. They literally have to imagine using it in order to answer your question.

This is one way products become bloated with features. But this phenomenon can have even more extreme consequences.

In 2009, Walmart asked their customers in a survey: “Would you like Walmart aisles to be less cluttered?” Respondents answered overwhelmingly: yes. So Walmart spent hundreds of millions of dollars decluttering their stores.

The result? Walmart lost over a billion dollars in sales. It turned out customers didn’t want less cluttered stores at all. They just thought they wanted less cluttered stores.

Don’t ask people to explain their preferences and behaviour.

Do observe what people actually do.
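
If you have analytics or server logs to hand, you can check what people said against what they actually did. Here is a minimal sketch in Python of that comparison; every user, event and answer in it is invented for illustration, so treat it as a pattern rather than a recipe:

# Hypothetical survey answers to "Would you use an export-to-PDF feature?"
stated_interest = {
    "user1": "yes",
    "user2": "yes",
    "user3": "yes",
    "user4": "no",
    "user5": "yes",
}

# Hypothetical usage events logged after the feature shipped.
observed_events = [
    ("user2", "export_pdf"),
    ("user2", "export_pdf"),
    ("user1", "search"),
    ("user5", "search"),
]

said_yes = {user for user, answer in stated_interest.items() if answer == "yes"}
actually_used = {user for user, event in observed_events if event == "export_pdf"}

print(f"Said they would use it:     {len(said_yes)} of {len(stated_interest)}")
print(f"Actually used it:           {len(actually_used)} of {len(stated_interest)}")
print(f"Said yes but never used it: {sorted(said_yes - actually_used)}")

The gap between the two numbers is the interesting finding: it tells you how much weight stated preferences deserve.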

Ignore the customer experience, lose a billion dollars (Walmart case study) — Mark Hurst

Users can’t predict their own future

Inexperienced user researchers can fall into the trap of asking users to predict how they would use a feature in the future. But what people think they might do in the future is often very different to what actually happens.

Consider opinion polls in the run-up to an election. These are often relatively poor predictors of an election result. However, exit polls tend to be extremely accurate.

One of the reasons for this is that opinion polls ask people to predict how they might vote in the future. Meanwhile, exit polls simply ask people to repeat how they have just voted, seconds after they have stepped out of the polling booth.

This gives a clue as to what you can do instead of asking users to predict their future. The best way to understand what people will do in the future is to observe what they have done in the past. This way you can understand people’s real behaviour, not an idealised and inaccurate version of their behaviour.

Don’t ask people what they’ll do in the future.

Do understand what people have done in the past, to understand what improvements can be made for the future.
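
In practice, understanding past behaviour can be as simple as counting which tasks actually appear in your existing logs, and letting that ranking, rather than predictions, drive your priorities. A minimal Python sketch, again with invented task names and data:

from collections import Counter

# Hypothetical log of tasks users actually performed, e.g. extracted
# from analytics or server logs. Invented data for illustration only.
past_tasks = [
    "find_course_timetable",
    "submit_coursework",
    "find_course_timetable",
    "check_library_loans",
    "find_course_timetable",
    "submit_coursework",
]

# Rank tasks by how often people actually did them; this ranking, not
# a prediction survey, shows where improvements will help the most.
for task, count in Counter(past_tasks).most_common():
    print(f"{task}: {count}")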

The paradox of knowing — David Dunning

Users’ behaviour depends on context

People behave differently in different environments. We all face different pressures at different times in different circumstances. Our behaviour adapts accordingly.

If you wanted to understand an animal, you wouldn’t observe it in a zoo. You’d observe it in the wild.

The same is true of people. Bringing people to a usability lab or into your own office for a focus group automatically alters their behaviour.

Focus groups are also notoriously difficult to control. Bias and groupthink creep in all too easily, making the findings dubious.

Of course, any kind of social research suffers from the observer effect. But you should aim to minimise this as much as possible.

This means researching people in their own context — in their own space, on their own device — not in a focus group setting or usability lab.

The good news is that you can still get good results researching remotely.

Meeting the challenges of conducting user research remotely

Don’t bring users into your own office or a usability lab.

Do understand people in their normal context — their own space, their own device.

The user researcher’s field guide to psychology — David Travis

Find out more

For more about the lessons from psychology you should understand if you want to be truly human-centred, start with the resources linked throughout this post.

For further information on the user research techniques we recommend, and how we can help, visit the User Experience Service website:

User Experience Service

Thank you to Nicola Dobiecka, who originally provided the inspiration for this post.

Nicola Dobiecka’s website

One reply to “How to avoid common mistakes in user engagement”

  1. Jamie Cockburn says:

    My favourite thing about this post, which states that:

    teams often overestimate their users’ technical skills

    is that the comment box requires you to understand this:

    You may use these HTML tags and attributes: <a href="" title=""> <abbr title=""> <acronym title=""> <b> <blockquote cite=""> <cite> <code> <del datetime=""> <em> <i> <q cite=""> <s> <strike> <strong>

    Which I have had to type out as:

    My favourite thing about this post, which states that:
    <blockquote cite="https://blogs.ed.ac.uk/website-communications/how-to-avoid-common-mistakes-in-user-engagement/"> teams often overestimate their users’ technical skills</blockquote>

    is that the comment box requires you to understand this:

    <blockquote>You may use these HTML tags and attributes: <code>&lt;a href=&quot;&quot; title=&quot;&quot;&gt; &lt;abbr title=&quot;&quot;&gt; &lt;acronym title=&quot;&quot;&gt; &lt;b&gt; &lt;blockquote cite=&quot;&quot;&gt; &lt;cite&gt; &lt;code&gt; &lt;del datetime=&quot;&quot;&gt; &lt;em&gt; &lt;i&gt; &lt;q cite=&quot;&quot;&gt; &lt;s&gt; &lt;strike&gt; &lt;strong&gt;</code></blockquote>

    Which I have had to type out as:

    <code>
    recurse here
    </code>

    Including having to know how to HTML escape all the <code>&lt;</code>s and <code>&gt;</code>s (twice). <strong>And I have no idea if this will end up formatted as intended!</strong>

    Including having to know how to HTML escape all the <s and >s (sometimes twice). And I have no feedback to tell me if this will end up formatted as intended!
