Student Counselling Service – project summary
I’ve been blogging a lot about our work with Student Counselling – here’s how the big picture fits together.
We’ve just completed a big project for the Student Counselling Service – overhauling their site from scratch to improve the student experience and make the content easier for the service to manage.
So how exactly did we do this, and what did we learn along the way?
Workshop
As with all our projects, we kicked things off with a workshop where we brought all the relevant stakeholders together to collaboratively work through why the service exists, and how the website can best support that work, asking questions like:
- Who are you trying to reach through digital content?
- What are these users’ needs and goals?
- Where is your time being used up unnecessarily answering queries that could be better answered online?
The number of people attending these initial workshops really depends on the unit and how it’s run. Whereas we’ll often see attendees in double figures, only three people participated in Counselling’s workshop. The key thing is that these were the right three people, with no key figures missing.
Agree objectives
After the workshop, we went away and drew up a document detailing the site’s objectives. If the workshop has done its job, this step should in theory be a straightforward write-up of the discussion.
However, it’s often the case that when a unit sees these laid out in black and white, there are small changes they need to make. There’s nothing wrong with this – it usually reflects a clarification in thinking. It’s not that what was said in the workshop wasn’t true, or that the notes were wrong, but that, with some distance from the discussion, staff can realise that a certain audience is more important than another, or that the website isn’t the best place to achieve a certain task.
Process, not project
This underlines the point that the workshop is as much about process as product – we could perhaps have produced a document not too dissimilar from the final version by scribbling on the back of an envelope over coffee, but it would have been less robust, meaning:
- problems at the content stage are much more likely, as subtleties in objectives start to make themselves known
- the ongoing editing of the site is not cohesive, as the editors have not been part of the process
- the final content, even if signed off, may not meet users’ needs as thoroughly.
It’s an area where ‘broadly right’ can be ‘significantly wrong’ – the devil is in the detail.
Content analysis
Once we’re clear on what the site is meant to do, we move on to looking at how well it’s achieving that right now – there’s no point in throwing the baby out with the bathwater.
In Counselling’s case, this meant reviewing the content using three main tools:
- Google Analytics
- Sitebeam
- manual checks
This review looked at broad findings – such as spelling errors and the general popularity of the site, for benchmarking – as well as specific measures against the objectives. For example, Counselling wanted students to engage with the referral form and understand the referral process, as well as have clear access to self-help resources. What could the data suggest about this?
As is always the case, we came upon several instances where the data was not enough. It can suggest likelihoods, but can never give a clear answer. I wrote three blog posts over the course of this project going into some of the detail on this:
- Content review – Why automated tools only take you so far
- Google Analytics for the Counselling Service – the limits of data
- Sitebeam reports – combining data with insight
In an ideal world, we’d have moved on to fairly comprehensive user testing to validate these findings. However, we were limited by time, and the students were all away, so we decided to act on the clear findings for now. Some of these changes involved restructuring the content so that user journeys could be tracked far more effectively in future – as in the sketch below.
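The post doesn’t go into the mechanics of this, but as a hedged illustration: assuming a Universal Analytics setup, making a call to action ‘trackable’ can be as simple as sending an event when the link is clicked. The element id and event names below are invented for the example, not taken from the Counselling site.

```typescript
// Illustrative sketch only – assumes Universal Analytics (analytics.js);
// the element id and event names are invented for this example.
declare function ga(command: string, hitType: string,
                    category: string, action: string, label?: string): void;

const referralLink = document.getElementById("referral-form-link");

referralLink?.addEventListener("click", () => {
  // Record the click as an event, so that at review time clicks on the
  // call to action can be compared against overall visitor numbers.
  ga("send", "event", "Referral", "click", "counselling-referral-form");
});
```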
Red routes
Once we were clear on the objectives, and how they could be better met, we needed to be sure of the content we were putting in. At this stage, we decided we needed another short workshop to clarify some of the ‘red routes’ through their site, and what the ideal user journey was for some typical users.
- Student Counselling – What makes a workshop? – read more about this short workshop
- How to get a grip of your website (and then keep hold) – more on red route usability
Content and structure edit
Now that we had clear site objectives, and ratified information on how those objectives should be met, we moved on to what many might think of as the main part of the work on this kind of project – editing and structuring the web content.
What we do when we edit web content
As part of this editing work, I used MindGenius for the first time to show comparative web structures.
Using MindGenius to map out web structures
Build
Once the content was signed off, we built the site – the ‘big finish’ that gets all the attention but takes the least amount of work.
The site then goes back to the unit for sign-off, but if the job’s been done right so far, this is little more than a courtesy. The structure and content, built from the start with the site objectives at their core, have already been signed off, so the journey from build to publish need hardly take any time at all.
Review document
To declare the project ‘finished’ after building the website would, of course, be to undermine one of our key mantras.
A website is never finished.
However, thanks to everyone’s hard work and co-operation, the new site is much more measurable. The structure is now much simpler, and there’s tracking on some key goals. This meant that our final deliverable on this project was a three-page document listing key things to check and measure as part of a review.
The metrics in this document are wide-ranging, including:
- How many people are clicking on their key call to action (now a trackable link) and how does this compare to overall visitors?
- Where are visitors to certain pages based, compared to the spread at the last review?
- Have editing staff felt their time is being called upon less, or at least better used?
Although this last measure is subjective, it can add useful colour to a more numbers-based report. The first is simple arithmetic – there’s a sketch of the calculation below.
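To make that concrete, here’s a minimal sketch of the click-through comparison, assuming the event counts and visitor totals have been exported from the analytics reports. The figures are placeholders, not the project’s real numbers.

```typescript
// Minimal sketch of the call-to-action click-through comparison.
// The figures are placeholders; real numbers would come from the
// analytics export at each review.
function clickThroughRate(ctaClicks: number, visitors: number): number {
  return visitors === 0 ? 0 : (ctaClicks / visitors) * 100;
}

// e.g. 180 tracked clicks on the referral link against 2,400 visitors
console.log(`CTA click-through: ${clickThroughRate(180, 2400).toFixed(1)}%`);
// -> "CTA click-through: 7.5%"
```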
We’ve recommended doing this review every six months, but even if it’s an annual review, it’ll give great insight into how the site is performing against the service’s main objectives (better, we would hope!).
These regular reviews avoid the boom-and-bust cycle so many websites get into, so the benefits of the work we’ve done on this project should be felt for many years to come.
Interested in our process?
If you want to know more about making measurable improvements to support your site objectives, get in touch to discuss how we might help.