Death to “signposting”
A term I hear too often in meetings about digital content is “signposting”. In my view it’s a cop-out, and it’s incredibly damaging to the student experience. We need to focus on students’ task success instead.
What is signposting?
You may have heard the phrase used yourself. You may even use it.
Here’s how it typically goes…
Colleague 1: “So we need to get that email out to [audience] by the end of the week”
Colleague 2: “Can we make sure that we include something on [secondary topic]?”
Colleague 1: “Yes, we can signpost to [department website]”
Me: “Why have we got all these links in this webpage when the point here is to encourage students to [complete specific task]?”
Colleague: “We thought that students might also want to do [other tasks] so it would be helpful if we signposted them from here.”
What’s the problem?
We want to be helpful. We want students to be independently successful. Right?
So what could be more helpful than letting them know about all the content they might want to read on the University website that might be useful or of interest to them? Actually, quite a lot.
The thing is, we’re task driven.
Our goal is never to just read University web content. It’s to get stuff done. To complete a task, to get a step closer to achieving a particular goal.
“Don’t manage the technology; don’t manage the content; don’t manage the information; and don’t manage the graphics. Manage the tasks.”
Gerry McGovern, Pioneer of top task management
Gerry McGovern Says “Manage the Tasks” – article on UIE.com
Aligning on the user outcome
When we write a piece of content – perhaps a page for our website, or an email we intend to send out – we should have our target audience in mind.
I think it’s easy enough for any of us to say we’re writing for prospective students, for applicants, for offer holders and so on. But we need to go further than this.
What’s the student’s context? What do they want or need to do? What do we want them to do?
While we often don’t have the luxury of a user researcher exploring the problem space and helping us to understand the student’s behaviours, attitudes and motivations, we can as a minimum externalise our assumptions.
We can state what we assume the student wants, and what we know we want them to do. And we can make sure that all stakeholders reviewing content are aligned on these assumptions.
At least then we are in a situation where we’re asking for feedback on whether the stakeholder/reviewer believes the content is likely to achieve the desired outcome in terms of the behaviour we assume we want to encourage.
And this then leads to agreeing what success looks like. Something that is measurable. Something we can review. Something we can learn from and inform our future practice.
My top tip: Align stakeholders on what you believe success looks like before you invite feedback.
What happens when we don’t do these things? When we don’t externalise our assumptions and align on a desired outcome?
It becomes all too easy to listen to a range of stakeholder voices suggesting extra content and extra links.
We end up with goals like “providing a signpost to…”
We’re bombarded with options and choices online all the time.
The human brain isn’t good at dealing with this, and it acts accordingly.
We don’t read things carefully from beginning to end and then make a measured decision on our next step. We often make a swift decision based on a limited amount of information. In usability expert Steve Krug’s terms, we satisfice.
…most of the time we don’t choose the best option — we choose the first reasonable option, a strategy known as satisficing…
Steve Krug, author of Don’t Make Me Think
Jakob Nielsen and his user research company Nielsen Norman Group have been writing about our online reading behaviour for over 20 years. Behaviour trends evolve, influenced by technological developments, but the fundamentals remain the same because we as humans aren’t evolving that fast.
Gazeplots generated from eyetracking studies show that typically, we dart about looking for the thing we think is most likely to help us towards our goal. Clearly what catches our eye depends upon the task we’re in the midst of, but that satisficing behaviour is pretty clear.
Text Scanning Patterns: Eyetracking Evidence – Nielsen Norman Group article
So far, so overwhelming. Students may become confused by our unfocused content trying to be all things to all people.
But it’s worse than that. If your reader isn’t sufficiently invested in completing a task, they are highly likely to abandon. Multiple psychological experiments have shown over many years and in a range of contexts that we are more likely to give up if presented with too many options.
This is often called choice overload or decision paralysis.
For example, in a now-famous study, people were more likely to purchase gourmet jams or chocolates when offered a limited array of 6 choices rather than a more extensive array of 24 or 30 choices.
UX Myth #12: More choices and features result in higher satisfaction – a high level overview with lots of examples including the jam experiment
So what do we do about this?
Above all: Stop signposting!
That’s easy for me to say, but how do you know when you’re doing it and what should you do instead?
(Caveat: this isn’t how the Prospective Student Web Content Team does things most of the time, but I recognise most University colleagues don’t have the expertise or remit we do. This is my ‘the least you should do’ advice).
In a nutshell:
- Be clear about the behaviour you want from someone interacting with your content
- Test your content quickly before you publish it with a 5 second test
- Ensure the content is likely to lead to student success with usability testing
- Make sure you have a means to measure engagement once it is published
- Improve your approach based on what you learn.
When you’re writing a piece of content for a web page or an email, be clear about your expectations for user behaviour. What do you want them to read, and what do you want them to do next?
Make sure that anyone who needs to review or sign off on your content is aligned with your thinking. Then it becomes less of a conversation about the words on the page, and more about whether your approach is likely to achieve the outcome you seek.
Five second tests – a quick check
When you’re clear, try your content out on a few colleagues using the 5 second test technique. This involves flashing the content up on screen, or putting a sheet of paper in front of someone, for 5 seconds and asking them: “What was the page about?” and “What was the most important information on the page?”. You might also ask a more specific question pertinent to the intended outcome.
If the most important goal of your content isn’t standing out and immediately clear, you need to redraft.
How to conduct a five second test – article on UIE.com
Why does this matter? Because research shows that people form an emotional response in milliseconds. They decide whether something is worth persevering with soon after that.
Web users judge sites in the blink of an eye – article on Nature.com
How long do users stay on webpages? – article by Nielsen Norman Group
Focus on student success – think beyond the click
Play out the scenario of the student opening your email. When they reach the call to action, what then? What do they need to do next? Is it obvious? Think beyond the piece of content you’re writing and design the journey through to the completion of the task.
This may involve collaboration. That’s to be expected in a big, complex organisation like a university. If you only care about the step on the journey that you’re working on, you’re investing in the output (your content) and not the outcome (student success).
This is so important. If you’re not invested in students successfully completing tasks, you’re just “signposting”. Sending students off to be someone else’s problem.
In an ideal world, you’d have a small number of students interact with the content and do what they thought they needed to for the scenario while you watched and didn’t help. (This is called usability testing).
In reality, we can’t always get access to students, and not all content we work on comes with a budget to pay students for their time.
But what you absolutely can do is call in a favour from a colleague who isn’t familiar with what you’re doing. Or ask a friend or your partner or your grandma to do it. A few fresh sets of eyes is all you need.
Do-it-yourself usability testing – seminar video with Steve Krug
Measure engagement and behaviour once live
You set out with some assumptions.
You did your best to eliminate the biggest risks of students not understanding or engaging with your content through some quick-and-dirty user research.
But you need to know what actually happened in the wild. When you published your content or sent that email.
The key thing here is to plan for measurement from the outset, not just think about it after the content has gone live. Make sure analytics review is baked into how you design your content.
If it’s a website, use Google Analytics to find out things like:
- What was the average time on page? Count the words and assume people read about 200 words per minute. How long would they need to have been on the page to read every last word? (They won’t have.)
- What percentage of page visits resulted in readers clicking through to the thing you wanted them to?
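To make the reading-time check above concrete, here’s a minimal sketch. The 200 words-per-minute figure comes from the point above; the page word count and average time on page are hypothetical numbers for illustration, not real analytics data.

```python
# Rough check: could visitors plausibly have read the whole page?
WORDS_PER_MINUTE = 200  # assumed average adult reading speed

def full_read_time_seconds(word_count: int) -> float:
    """Seconds a visitor would need to read every word at average speed."""
    return word_count / WORDS_PER_MINUTE * 60

page_words = 800         # hypothetical word count for the page
avg_time_on_page = 75    # hypothetical figure from analytics, in seconds

needed = full_read_time_seconds(page_words)
print(f"A full read would take ~{needed:.0f}s; the average visit was {avg_time_on_page}s")
print(f"Visitors stayed for roughly {avg_time_on_page / needed:.0%} of a full read")
```

If the average visit is a small fraction of the full reading time, that’s a strong hint your readers are scanning and satisficing rather than reading every word.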
Use other sources of insight:
- Other analytics tools are available. For example, Website and Communications can set up a short-term click analysis tool for you (called Crazy Egg) which will give you extra detail on the areas of the page students dwelled on and what they clicked.
- Enquiry analytics. Is your content prompting new enquiries? Was the point of your new content to encourage interaction via email? Using web forms on your website rather than email addresses will make this a viable way to monitor behaviour.
- Trackable links. The University’s URL shortener (edin.ac) is built on Bitly which means you can learn how many people clicked a link and when they did it. This is particularly useful when sending emails, although some corporate mailer tools like Dotmailer also have their own analytics included in the platform.
Learn and improve
Finally, accept that you won’t get things 100% right and you have the opportunity to improve.
The beauty of digital design and communication is that we have unprecedented access to insight into how effective we are. Take advantage of this.
When you plan your work, factor in time to review, appraise and evolve. That evolution might be on the things you’ve just released, or it might be evolution of your practices for next time. Or ideally, both.
Share your experiences
What did you think of all this advice? Have you tried to do all this? Share your experience in the comments and if you have any questions, get in touch!
(School Lost and Confused Signpost by Wonder woman0731 CC BY 2.0 )