Developing a Framework for Innovation Intermediation

My exciting journey with the Innovation Caucus started one rainy morning in Spring 2017, when by chance I spotted an advertisement for internship applicants doing the rounds over email. This was followed by an email from my supervisor, asking all of his PhD students whether we had seen the call and whether we were interested. Not being someone who declines an opportunity, my reply was immediate – yes!

Having found out about the Innovation Caucus and its work some months previously, when putting together a notice for the departmental newsletter about our engagement with policy, I was really excited by the opportunity to further translate my research interest into useful knowledge for policy-making. Having applied and made it through to the interview, I was ecstatic! Speaking to Tim and his team was interesting and inspiring, and once I was offered the internship, it took even less time than before to say “yes” and accept it.

As I am really passionate about my PhD research topic (the social aspects of technology development and innovation) and my subject matter (the space industry – yes, the stuff “up there”), I took quite some convincing to take on new challenges within the Innovation Caucus brief. In part, this was because I really wanted to create a new space of shared knowledge and sense-making, i.e. to challenge theoretical concepts with empirical findings and policy realities – and I could only envisage doing so within topics about which I was already somewhat knowledgeable.

However, in discussion with Innovate UK and the Economic and Social Research Council (ESRC), I did eventually reshape my interest into developing a broader framework and typology of innovation intermediation in any geographically bound sectoral system of innovation. This was of value to Innovate UK, since it supported the ongoing development of their portfolio of Catapults and Knowledge Transfer Networks, as well as other projects and policies.

This experience was a great lesson for me, not only in working with and delivering for a policy-making system, but also in expanding my own research interests into domains I did initially find uncomfortable. Presenting the headline findings of this work at one of the most prestigious innovation conferences in the world, DRUID (2018), helped me appreciate the power of broader generalisation of academic knowledge, in order to achieve more substantial societal impact.

The lessons learned and experiences from this project also enabled me to engage better with new concepts, unfamiliar settings and unknown stakeholders in my subsequent work. For instance, these skills have proved critical in working on a consultancy project for the OECD and as a Research Assistant in academia.

I have to express my big thanks to Tim and his team for their support and mentorship and to all involved with the Innovation Caucus, particularly Innovate UK and the ESRC teams involved with my internship. It was your determination and generosity that turned this project from a 3-month desk-job into a transformational professional journey.

This post was published in October 2018 on the Innovation Caucus blog: Developing a framework for innovation intermediation.

Find out more about Innovation Caucus.

“Means, Motive, Opportunity” – ER2

In order to frame this enquiry, let’s begin with a brief exploration of the motivations behind commissioning and performing evaluations in the first place. Though the examples here are from social research, they are easily compared with parallels in any intervention, including investment in the development of science, technology and business support facilities and services (for example STFC, 2014:5-7).

Firstly, an important part of evaluation research is process evaluation (Rossi, 1972:34), used in order to improve the delivery of the intervention, or – as beautifully put in an interview with the CEO of Waverley Care (an Edinburgh charity) – “what we need to stop doing, what we want to keep doing and what we are not doing that we should be doing”. When working along this strand of evaluation, it is crucial that the researcher provides recommendations that can be acted upon. The best way to carry out such an evaluation is often to focus on a specific, small area of the intervention, for example how an organisation collects feedback and implements changes reflecting the concerns raised by internal and external customers. Having said that, conclusions and recommendations can often be very general.

Process evaluation also provides a further check on identifying emerging needs and on the (geographical, social, economic) individualisation of the delivery of outputs. This is particularly important for social projects (such as Waverley Care’s), where there is significant variation across the different locales in which they work. However, it is also important in terms of the social and geographical inclusiveness of science and technology investment. Hence, evaluation research in this context can provide important checks on the “fairness” of the intervention whilst it is underway.

Then there is the often missed – but in my opinion very important – objective in evaluation: the inward-facing component, i.e. improving the morale of the people engaged in the programme/intervention/organisation by celebrating their success. It is very important for staff to appreciate the whole picture and “take a step back” to frame their work within a wider context. This is both a good motivation for future work and a huge morale boost, as staff can see how they, personally and as a team, are making a significant difference to people’s lives.

Finally, the primary motivation for impact evaluation is (always?) to understand the impact/difference an intervention/organisation is making. Evaluation is often considered important for funding applications, i.e. both assessing the need for the intervention and monitoring the delivery of outcomes (to evaluate the VALUE generated).

My research is similarly linked to the need for accountability when spending public money (Nutley, Walter and Davies, 2007:254) and in particular to the effectiveness of investment in natural sciences research, which is currently epitomised in cost-benefit analysis – but that is already the topic of the next post…