Content Audit Findings and the 100k Challenge
If you are involved in managing or updating content for a university website, we need your help.
To provide a better online experience for everyone at the University of Edinburgh, we’re calling time on out-of-date and low-quality content across all of our websites.
Our content urgently needs some care and attention
In 2020, following a tender process, we commissioned the digital marketing agency Fresh Egg to undertake a content audit of the entire University estate.
Conducting the content audit itself was no small task. Our estate consists of over 1,500 websites, managed separately across 400+ business areas. Whilst the majority (around 70%) are on EdWeb, the remainder span a variety of platforms: Drupal, WordPress and others.
Whilst we couldn’t expect Fresh Egg to manually review the quality of such a high volume of content, they were able to identify a number of issues that are common across the estate, including specific technical, SEO, and editorial issues that need to be addressed.
The agency used existing data from Little Forest (which holds a repository of all websites in our estate), as well as other sources, to assess sites at either end of the scale – the largest, highest-traffic sites and the smallest. In total, 200,679 of the 341,830 URLs in Little Forest’s database were assessed.
The 100k Content-Pruning Challenge
Most strikingly, Fresh Egg’s report recommends 100,000 URLs for removal across the estate.
We need site owners and content editors across the university to collectively step up to the challenge and help us ensure these 100k web pages are removed, unpublished or archived during the remainder of 2021, as well as ensuring Fresh Egg’s other recommendations are carried out.
How to use the content audit to make improvements
Fresh Egg have provided us with a wealth of data to help improve our content.
Google Data Studio Dashboard
The agency used Google BigQuery to combine data points from Little Forest, Google Analytics and Search Console in Data Studio.
How to use the dashboard:
- You can request access via website.support@ed.ac.uk
- Custom tables have been created for the main issues discovered in the audit, for example pages with missing titles, headers or metadata, and pages with multiple PDFs.
- Each page has contextual guidance explaining how to use the tables.
- You can drill down to see your web pages within EdWeb or other large sites.
- You can also create your own tables to explore the data across 50 different metrics. To do this, you need to request edit access.
Spreadsheets with specific recommendations
Fresh Egg also exported site data into 89 audit spreadsheets that contain important recommendations for every site owner.
The agency added data from Screaming Frog and Botify (web crawling tools with a focus on SEO) to understand sites’ technical performance. They then manually reviewed websites for which little or no data was available, and analysed the content within them to identify what content needed to be kept, removed or improved.
Local site owners and content editors are best placed to make final decisions on what should be removed, improved, or archived. However, we do ask you to take these recommendations very seriously.
The guidelines below on archiving and content retention may also be helpful:
- Guidance on archiving EdWeb content
- General retention guidelines for content from Records Management
Fine-tuning the data
The data in both the Google Data Studio dashboard and the spreadsheets comes from a snapshot of sites across the university estate taken in August 2020. It is therefore not a replacement for the live data held in Little Forest or Google Analytics.
There are some other known errors affecting the data that Fresh Egg highlighted in their report, for example node URLs created automatically by Drupal and historic issues with redirects. We are investigating these issues and would like to hear if you find any additional inconsistencies in the data.
The audit is by nature a work in progress and community feedback will be very important to help us fine-tune the results.
Quality content is critical
Improving online experiences is key to effective digital transformation for the university. And well-structured, concise, user-centred content is just as important as technology in achieving this.
When it comes to creating and maintaining quality website content, the principles outlined in Neil Allison’s classic blog post still hold true:
Whether your site is on EdWeb or a different platform, it is representing the University of Edinburgh across the world. Every choice you make about what content to publish (or remove/archive) will impact the global reputation of the university.
Caring for your content – next steps
- The findings and outputs of this audit will feed into migration planning for the new Web Publishing Platform as well as the longer-term website publishing guidance we issue.
- Content needs to be continually updated and refreshed – it’s not a one-time job. If you are on EdWeb, please don’t wait for the content migration to the new platform to begin; get your content into ship-shape condition now.
- We will continue to provide updates and guidance on best practice on how to care for your content, as well as timings around content migration.
Finally, we know that some site owners are already making great progress with improving and removing content! Please track the progress you are making and share it with the web and communications team via website.support@ed.ac.uk
We’d love to hear your success stories and we are very keen for site owners to share their experience and expertise at our monthly web publishing community sessions, so please do get in touch to let us know how you’re getting on.