(Re)learning Google Analytics 4: February’s Edinburgh Public Sector Digital Meetup
Love it or hate it, following the retirement of Universal Analytics in July 2024, Google Analytics 4 is now the main source of (relatively) accessible analytics information for many websites, including the University of Edinburgh's. I've not had much time to dig into our analytics property in my current role, so for something of a refresher, I recently attended a session hosted at the Scottish Government with Jono Ellis, Duncan MacGruer and Hazel Cargil. Here are my top takeaways from the session.
Google Analytics: kind of like learning to ride a bike
My last deep dive into GA4 was back in 2022, so naturally I assumed I’d have forgotten everything by now. However, I was pleasantly surprised by how much came back to me.
Provided you go deep enough the first time to grasp the basic principles, it’s unlikely you’ll forget everything even if you haven’t used it for a while.
By basic principles, I mean the following (there's a quick sketch of how they fit together after the list):
- Triggers
- Events
- Tools: Tag Manager, Analytics and Looker Studio (and how they interact)
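To make that concrete, here's a minimal sketch of how the three interact, assuming Google Tag Manager is already installed on the page: the page pushes a message onto the dataLayer, a GTM trigger listening for the custom event name picks it up, and the tag it fires sends an event on to GA4, where it eventually surfaces in Analytics and Looker Studio. The names 'cta_click' and 'link_text' are invented for illustration, not a fixed schema.

```typescript
const w = window as any;
w.dataLayer = w.dataLayer || []; // GTM's message queue

document.querySelectorAll<HTMLAnchorElement>('a.call-to-action').forEach((link) => {
  link.addEventListener('click', () => {
    w.dataLayer.push({
      event: 'cta_click',          // matched by a Custom Event *trigger* in Tag Manager
      link_text: link.textContent, // passed through to GA4 as an *event* parameter
    });
  });
});
```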
I used Google Academy the first time round, and during the session it was name-dropped again as a good place to start upskilling in Analytics. Plus, it’s free!
Google’s data analytics courses
Looker Studio offers a nice introductory route to your data
In my last role, I used to disseminate static Excel or PDF reports of Universal Analytics trends every quarter. I then 'upgraded' to Google Data Studio, a clear move towards more interactive reporting: you set up the sorts of things you wanted to track, then just adjusted the filters to spot changing trends.
In 2022 Data Studio was rebranded as Looker Studio, which is billed as a quick way to visualise and share reports on your analytics data.
Our analyst Carla built a Looker Studio report for the team before she went on secondment, and it has been a very useful tool for us to dip into for 'quick' trends and patterns of use.
The presenters from Scottish Government reiterated how handy this tool is, particularly as you can set it up to let people 'self-serve' by adding filter options for dates, subsites and so on. Alternatively, by 'forcing' particular ranges or sections of data in the background when you set it up, you can build a more stable report – useful if, say, you want to review the same sorts of trends each time.
With a little short-term work, and provided you know the types of trends your audience is looking for, it can be a very useful 'one and done' report that you can easily share with colleagues.
Everyone’s a bit lost, but some people already have a map
When GA4 replaced Universal Analytics, everyone had to start over with the new approach to measuring and reporting on data. Even a year on from Universal Analytics' retirement, there's still a lot of uncertainty over how to find and interpret results.
However, rather than start from scratch, reach out to your network to see how others are using it. While you won't be able to replicate their setup exactly, you might get some good ideas about what to measure and how to manage it. Looker is just one example and might suffice for certain audiences, but for a lot of things you'll still need to dig around in GA4's Explorations and Reports.
During the session on Tuesday, the presenters covered a few common tasks, including:
- measuring where people enter your site, how they get there, and where they go next
- on-page interactions like clicks, scrolls, and engagement with dropdowns, accordions and search filters (there's a sketch of capturing one of these after the list)
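On that second kind of tracking, here's a rough sketch of reporting accordion opens and closes to GA4 via gtag.js, assuming the standard GA4 snippet is already on the page. The selector, event name and parameters are invented for illustration; they're not GA4 built-ins.

```typescript
// Report accordion opens/closes to GA4.
// Assumes the standard GA4 snippet has already defined gtag().
declare function gtag(...args: unknown[]): void;

document.querySelectorAll<HTMLDetailsElement>('details.faq-item').forEach((item) => {
  item.addEventListener('toggle', () => {
    gtag('event', 'accordion_toggle', {
      accordion_label: item.querySelector('summary')?.textContent ?? '',
      expanded: item.open, // true when opened, false when closed
    });
  });
});
```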
They also spoke about a few typical 'quirks' in the data. One is the ever-present 'not set' (caused by someone loading then closing a page very quickly, or loading something like a PDF). Another is that Exits information (where someone left a site) is still not available in Looker Studio, so you need to use GA4's Explorations to puzzle that one out.
There’s always a catch: declining acceptance of cookies and caveating your data
One big discussion point during the Q&A was the downward trend in people accepting cookies. Due to privacy legislation, analytics cookies need to be 'opt in', and even the old trick of making the accept button more attractive than the decline button is steadily being phased out.
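In practice, 'opt in' tends to look like the pattern below: a minimal sketch using Google's Consent Mode alongside gtag.js, where analytics storage is denied by default and only granted after an explicit accept. The button IDs are invented for illustration.

```typescript
// Deny analytics storage before any tags fire; grant it only on explicit accept.
// Assumes the standard gtag.js snippet is present on the page.
declare function gtag(...args: unknown[]): void;

gtag('consent', 'default', { analytics_storage: 'denied' });

document.getElementById('cookie-accept')?.addEventListener('click', () => {
  gtag('consent', 'update', { analytics_storage: 'granted' });
});

document.getElementById('cookie-decline')?.addEventListener('click', () => {
  gtag('consent', 'update', { analytics_storage: 'denied' });
});
```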
The presenters addressed this declining acceptance with a few important points:
- Use a tool like server logs or a hit counter to gauge a 'baseline' figure for what percentage of people are accepting analytics cookies (there's a rough sketch of this after the list) – across different sites, the Scottish Government team are seeing between a third and a half of people accepting.
- Cookie acceptance is linked to whether people think you’re using cookies for marketing purposes or functional ones – they’re less likely to say yes if they think it’s just going to be used for advertising.
- Caveat your analytics: not just for an external audience, but also for internal audiences looking for information. Emphasise that it's for spotting patterns and needs to be used in tandem with other, more qualitative data when making decisions.
- Analytics is just one tool in your user research toolkit; it can demonstrate general trends, but it's less effective than actually watching users interact with your sites. It's a blunt instrument: good for showing overall patterns of behaviour, but it won't necessarily tell you whether your content is effective.
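On the first point, the arithmetic behind a baseline is simple enough to sketch: compare a cookie-independent count (server logs or a hit counter) against what GA4 actually records. The figures here are invented for illustration.

```typescript
// Rough baseline: what share of real traffic is GA4 actually seeing?
function estimateOptInRate(serverLogPageviews: number, ga4Pageviews: number): number {
  return ga4Pageviews / serverLogPageviews;
}

const rate = estimateOptInRate(120_000, 45_000); // e.g. monthly totals
console.log(`~${(rate * 100).toFixed(0)}% of visitors accepted analytics cookies`);
// ~38% – within the third-to-a-half range the Scottish Government team mentioned
```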
I found the last point most important. We're often asked questions about the analytics for [x] page or [y] section, but without knowing what problem we're trying to solve, the data on its own won't tell you anything.
If you're trying to work out whether a form works for users, the raw numbers of submissions versus form abandonment are not sufficient (see the sketch after this list for where that figure even comes from) – you need to consider them alongside, say:
- requests for support from people trying to use it
- the path they took to get there (are they getting to it unexpectedly? Are they coming via a directed route like an email link?)
- the number of erroneous submissions to the form (people who filled it out but shouldn't have)
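For context, the raw abandonment figure itself is usually just a ratio of starts to submissions. In GA4, counts like these can come from the form_start and form_submit events that enhanced measurement emits; the numbers below are invented for illustration.

```typescript
// Share of people who start a form but never submit it.
// Counts would come from GA4's form_start / form_submit events.
function abandonmentRate(formStarts: number, formSubmits: number): number {
  return (formStarts - formSubmits) / formStarts;
}

console.log(`${(abandonmentRate(800, 520) * 100).toFixed(0)}% abandoned`); // 35% abandoned
```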
Scottish Government caveat on analytics reporting
What I’ll be doing next
I've already put a little of what I learned to use by doing some digging in our analytics property. We've also done a final check of the tagging for our new degree finder (launching Monday), and identified some areas we could work on to improve our reporting and get the right triggers into GA4 to reveal more about how people use the interactive features on our pages.
I was also very interested in the caveats they publish alongside their analytics data and reporting. I'll be picking this up with our analyst and other teams. I'm also thinking of a go-to caveat to attach to any responses to 'can I get some analytics on…' questions!
I’ll also be doing a little digging to see about establishing a baseline of the percentage of users who are opting in to analytics cookies.
Thanks again to the presenters and organisers of the session!