
User Experience briefing to Digital Transformation Board

This week I gave an update on the pilot User Experience Services to the board overseeing all areas involved in the Digital Transformation portfolio. In this post, I’m sharing the slides, plus a transcript of what I talked through.

I talk about user experience practices and case studies quite a lot, but rarely with a senior management audience. My aim in this session was to update on what we’re doing and why, but also to highlight how adopting a more user centred culture presents challenges and opportunities to digital leadership.

The slides

User Experience Services update – Digital Transformation Initiative Board – University of Edinburgh from Neil Allison


In this session I’m going to give you an overview of the programme of projects collectively titled User Experience Services.

But before I do that I need to make sure we have a common level of understanding. What is User Experience Design – commonly abbreviated as UX – and what does it mean for our business, how we run projects, and more fundamentally how we decide the best ways to enhance the student digital experience?

What is user experience?

This circle model from Nielsen Norman Group provides a succinct summary of user experience, by outlining what makes a good user experience:

  • It starts by being useful to the intended audience, meeting their needs. It has utility.
  • It’s something that the intended audience can use easily, in specific contexts of use. It’s usable.
  • It looks and feels pleasing to the intended audience. It’s desirable.

The key point being made here is that user experience is a major element of the brand.

If we think for a moment about a priority area of business for the University – online distance learning – the experience of these students is almost exclusively digital. Their experience of our digital platforms and services pretty much is their experience of the University of Edinburgh.

But note that this isn’t necessarily about digital experience, although a digital element is present in almost all experiences these days. It’s about users, and how they interact with a business, its services and products.

The discipline of user experience research and design has grown and matured significantly over the past 10-15 years due to the rapid change in technology, and with it business opportunities and user expectations. Most fundamentally, the opportunity to engage with and monitor user interaction is so much more feasible.

The experience is the brand

Let’s think for a moment about some organisations with a strong brand reputation built on user experience. I’ve deliberately chosen these examples (Amazon, Apple, Ikea, McDonalds) to make the point that some great user experiences are primarily offline, with a focus on service process and human-to-human interaction.

What I’d like to do is briefly walk you backwards through a process to highlight how great user experiences are facilitated. When you encounter a great product or service, it hasn’t come into existence through the work of a genius designer, or through use of a particular technology.

Great experiences are
•…enabled by the organisational culture
•…generated by process
•…continuously improving

Standing still isn’t an option. Our audiences continue to adopt new behaviours and attitudes driven in part by evolving technology, regardless of whether we continue to evolve. Digital transformation isn’t a finite process, regardless of the duration of our current projects.

So let’s work backwards…

The Human Centred Design process

The product you use today won’t be the same in a few days or weeks or months. User expectations heighten. What delighted you about a product in the past is now a basic expectation. For example, self-parking car features are currently present only in high end models, but in a few years will likely become an expectation of standard models. Later still, the absence of such a feature will be a significant disappointment to customers.

But before the launch, as the product was being developed, testing with representative users will have been happening. Perhaps there was a limited release to a segment of users, perhaps A/B testing. Regular user interaction insight will have been shaping the development team’s priorities – to develop additional features or to refine and improve what is currently being tested? Prioritisation will have been happening throughout the delivery phase. This has to be done swiftly, responsively, based on the insight being gathered. The project team needs to do that. The user experience function facilitates it.
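As an aside for the technically minded: the A/B comparison described above typically comes down to a simple two-proportion significance check on conversion counts. Here is a minimal sketch in Python; the function name and the figures are illustrative, not drawn from our projects.

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: uplift of variant B over A, with a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the standard normal CDF, Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Illustrative figures: 100 of 1000 users converted on A, 150 of 1000 on B
uplift, p = ab_significance(100, 1000, 150, 1000)
```

A low p-value here only says the difference is unlikely to be noise; whether a 5-point uplift justifies shipping the variant is still a prioritisation call for the team.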

“Features, on their own, are not a measure of success”
– Jeff Gothelf @jboogie

We talk about delivering Minimum Viable Products these days, but in reality the business can only dictate what the minimum delivery is. Whether it’s viable is decided by the customer.

But before the team were developing the product they intended to release, how did they know what to build? It’s a very expensive business to build things that people don’t want to use. The team will have prototyped key concepts and ideas in their anticipated solution to ensure the ideas that worked best were the ones that progressed into production.

“It costs a lot more to build the wrong thing than it does to do experimentation and research”
– Melissa Perri @lissijean

But before they were testing prototype solution ideas, how did they understand the problem they were solving? Whose needs were being served? Whose problems were being solved? The users’ or the business’s? And where were these perceived needs coming from? The team will have been involved in a process of discovery; interacting with users or potential users to understand their needs, attitudes and behaviours and to synthesise these into an expression of a problem that they believed the business was in a position to potentially solve.

A design-thinking, customer-centric organisation is cross-functional in its approach. It involves the end user throughout its process.

“Strategy = who is your audience, what problem you solve for them and what makes you uniquely suited to that.
The rest is tactics and execution”
– Kim Goodwin @kimgoodwin

A question to ask ourselves – are we features and solutions focused in our approach to project initiation, or are we needs and outcomes focused?

Evolving our goals for digital transformation

I’d like to look briefly at our goals for digital transformation. Specifically Gavin’s statement of ‘What digital transformation looks like at the University’. And I’d like to propose an update.

The University’s goals for digital transformation – Digital Transformation Initiative website

I believe it’s not enough for project teams to undertake user experience research and design techniques:

  • At a service management level, we need to be designing with the user – particularly our students – at the heart of it.
  • At a strategic level, we need to adopt design thinking techniques. The way we conceptualise and commission projects needs to evolve.

My first suggested edit is to include ahead of everything else: Every leader is a digital leader.

And when we make decisions, I think we already do so with the best of intentions, using the evidence available. But that’s not enough. We don’t gather enough insight at any stage of our projects at the moment.

Think of every decision around a requirement as a hypothesis. We initiate projects, we provide direction because we believe it will contribute to an outcome. We’re saying that we believe an outcome will be achieved if a user can achieve their goal with a particular feature.

So my second suggested edit: instead of saying that we will make our decisions with the best evidence available, we acknowledge that every decision is a hypothesis to validate.

Requirements are hypotheses – my previous blog post
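To make the hypothesis framing concrete, here is one way a requirement could be written down as a testable belief – a hypothetical sketch in Python following the ‘we believe an outcome will be achieved if a user can achieve their goal with a feature’ shape described above. The field names and example values are my own illustration, not a real project template.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    """A project requirement framed as a belief to validate (illustrative structure)."""
    outcome: str  # the measurable business outcome we expect
    user: str     # who we believe benefits
    goal: str     # what they should be able to achieve
    feature: str  # the feature we believe enables it
    measure: str  # the evidence that would validate or refute the belief

    def statement(self) -> str:
        # Render the hypothesis as a single testable sentence
        return (f"We believe {self.outcome} will be achieved if {self.user} "
                f"can {self.goal} using {self.feature}. "
                f"We will know this is true when {self.measure}.")

h = Hypothesis(
    outcome="fewer calls to Helpline",
    user="students",
    goal="resolve common IT problems themselves",
    feature="redesigned self-service support pages",
    measure="self-service completions rise while call volume falls",
)
```

Writing requirements this way makes the validation step explicit: each one names the evidence that would tell us whether the belief held.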

Of course there are limits on the amount of research we can do. But we can curate our insight better for the benefit of future projects. And we can encourage greater openness about risk and our domain knowledge as a means to prioritise our research, essentially our evidence gathering effort.

What I’m saying with both these points, in essence, is that rather than focusing so much on feedback about deliverables, project and service governance functions should be challenging project and service leaders in other ways.

Here are just a few examples of the kinds of questions a governance board should be asking at various stages of a project:

  • During discovery
    • “What have you done to build a deep understanding of our customer? Do you understand their fundamental goals, behaviours, and motivations? How?”
    • “How will you keep these goals in mind during the product development process?”
  • During design
    • “How did you generate ideas to respond to our customer’s goals?”
    • “In this process, did you include a diverse group of stakeholders – including technical, marketing, sales, customer service or other departments who may bring unique and valuable perspectives to the discussion?”
  • During delivery
    • “How might we try out your team’s best ideas to determine if we’re on the right track, before investing a lot in development or deployment?”

Design Thinking for CEOs – this article is the source of these questions

Examples of our recent work with Helpline

I was asked to include examples of recent user experience research and development work in this session. And I want to use this opportunity to challenge a common misconception that user experience work is an interface design process.

Just to illustrate my points, I’d like to give you a couple of examples from some ongoing work where we’re collaborating with colleagues in Helpline, part of Information Services.

The outcome we’re working towards is greater levels of student self-service with IT problems, which in turn reduces our issue management overheads. In this project, we didn’t set out to deliver any specific content or functionality. We are focusing our attention on solving the problem: trying things out on a small scale as quickly and cheaply as possible, then ramping up the ideas that seem to be working well and backing out the ones that don’t. We are engaged in an ongoing cycle of research, analysis and development, and now in our fourth cycle we’re really beginning to see results.

A common misconception I encounter quite a lot is that our goal is to make the whole user journey look the same. That is part of the story, but it would be a mistake to conflate following design principles with delivering a good user experience.

Here’s a nasty experience we encountered in our service provision review, and backed up with usability testing. Leaving the website to complete a support form in our call management system interface caused students to abandon and use phone or email instead. Not great for them, and more costly for Helpline. So here a new design contributed to a solution, alongside re-thinking the service offering, adjusting the information architecture, rewriting copy and implementing some new technology.

Whereas in the next two examples, the use of EdWeb and the EdGEL means the interface design is identical. But changes in content and structure have fundamentally altered students’ interaction when attempting to self-serve. The changes are so unremarkable to the casual eye, I’ve marked the screen grabs as ‘before’ and ‘after’.

So you can see that a consistent design is not the only goal of improvement. In fact, it’s perfectly possible to have a terrible user experience that is consistently presented; accessible, responsive and ‘on-brand’.

This leads me to talk about the projects we’re currently undertaking…

Digital Transformation Initiative project within the UX Services Programme

User Experience Service pilot

Helping projects to adopt a new user centred methodology to deliver better products and services

  • A design process ensuring user insight at all project stages
  • Training to empower staff & demonstrate the value of new process
  • Consultancy to advise projects or directly deliver UX research & design services

EdGEL as a service and technology delivery proposals

Providing a common experience language, evolving continuously based on business need, user insight & changes in technology

  • A repository of guidance for all elements of an interaction experience
  • Accelerating development & ensuring legislative compliance
  • A platform enabling collaboration and innovation among developers & designers
  • A service supporting engagement and evolution
  • A means to appraise digital products

Student digital experience standards

Supporting adoption of user centred tools & processes, providing new insight into project governance

  • A process & service measuring how a project or service is set up and run
  • Built upon a commonly understood set of principles
  • A governance mechanism ensuring project teams:
    • Utilise the right skills & roles, working collaboratively
    • Build EdGEL-compliant products, informed by user insight

More about the User Experience Programme projects on the Digital Transformation website 

The UX Service could, and potentially will, exist in isolation. But these areas of activity will ideally become three interdependent services, because they are greater than the sum of their parts.

We are working with our newly arrived Head of Web Strategy & Technologies, and these strands are fundamental to his planned approach.

The UX Service pilot is funded for 3 years, and this will enable us to support a great deal of activity in a range of projects under the Digital Transformation and Service Excellence Programme Initiatives while evolving the offering, enhancing our understanding of what the University needs, selling the value and ultimately transitioning to a business as usual service.

In contrast, activity on EdGEL and Student Digital Experience Standards this year will deliver service proposals and, where possible, demonstrate their value on the UX Service projects. But as it stands, that’s as far as we can take them. We will work with colleagues in the coming months to build the case for transition into pilot services.

While it’s true to say that a UX Service can exist in its own right, without EdGEL we don’t have an easy-to-access and use repository of design elements that are built based on evidence of user need and interaction. And without the standards service, the UX research and design methodology will remain a nice-to-have, optional extra.

The Venn diagram illustrates the relationships between the three services.

Towards experience management maturity

My expectation, through the piloting and ultimately embedding of UX Services, is to see a significant step forward in the University’s cultural maturity with respect to user experience. There are many ways to express an organisation’s maturity with regard to user experience culture. I am using this particular model as I think it gives a good outline of the indicators, attitudes and behaviours that will help us establish how far we have come.

Questions, comments?

I’d love to hear what you think of our work, and the challenges I pose in this slide deck.

Leave a comment, or contact me directly.

Neil Allison’s contact details and staff profile


1 reply to “User Experience briefing to Digital Transformation Board”

  1. Lauren Johnston-Smith says:

    Really like the addition of point zero, we definitely need key leaders within the Uni to be ambassadors for this shift. However, not all leaders are digitally engaged. How do we bring them all to the same level? I often marvel at how many of our ‘systems’ and ways of working (for staff) are still quite old fashioned e.g. we’ve only just moved to digital payslips! Perhaps if the staff ‘end user’ experience is a more digital one then that might make us all better ambassadors for digital and help us think ‘outside-in’.


