
Impact, what impact?

05 March, 2019

Kay Jenkinson

Reading time: 4 minutes

CREDS Knowledge Exchange Manager, Kay Jenkinson, talks about how researchers can demonstrate impact.

I’ve been pleased to attend a couple of great meetings organised by CREDS’ colleagues in the past month, where stakeholders from energy policy and the energy industry were present.

It was a testament to the hard work of the respective research teams that they brought together impressive assemblies of well-placed, informed stakeholders who found these events genuinely useful and informative.

Across the UK, programmes like CREDS are looking for ways to demonstrate the impact of research on a multitude of stakeholders, processes, norms and decisions. However, drawing the lines between research outputs and a policy decision or social innovation that is distant in time and sometimes (intellectual) space is, frankly, far from straightforward.

People like me, who are involved in knowledge exchange, monitoring and evaluation, or impact assessment, are always looking for better ways to demonstrate the effect of the research we champion.

Demonstrating impact is something that the academic world is still struggling with, although more formal guidance in the form of the planned Knowledge Exchange Framework is on the horizon.

To be fair, it’s not an easy thing to do: we’re not selling hair care products or financial services. Sometimes we’re scrabbling to attach our research group to the same bit of legislation or new standard as other groups of researchers. And our assessments often have to include a rag-bag of contacts, meetings, emails and even some best guesses (aka ‘expert opinion’). But there’s always some relevant data to collect which can be used to support our impact claims.

In CREDS we are fortunate to have a community of researchers who are definitely up for better understanding their routes to impact, learning from their own practice, and taking lessons from examples of what worked (and what didn’t).

To support them, we’ll be developing guidance and examples during CREDS’ five-year life, with a view to contributing more formally to the body of evidence emerging on how to deliver effective research impact.

As a taster, here are a few practical ideas based on the practice in CREDS and other research programmes I’ve observed. I’ve focused on meetings and events, and included some suggestions for each stage.

  1. At the planning stage, agree your objectives for the meeting (information sharing, information gathering, networking opportunities).
  2. Give your intended audience a really good reason to give up an afternoon of their time (first insights, new contacts, influencing research, listening to what they think).
  3. Keep track of the main interactions (who expressed interest, who attended, what was on the agenda, who contributed to the agenda) so that you can start to understand who is engaged, and where there may be gaps. Then you can take action to fill critical gaps (targeted approaches, adjusting the agenda).
  4. During and immediately after the meeting, collect data on audience feedback, interaction and opinion (emails, meeting notes, post-it notes, feedback forms, interactive surveys, e.g. Sli.do), making sure that these reflect the original objectives for your meeting or event.
  5. After the meeting, review any feedback and reflect on what you think went well or could be improved (have you acted on the advice of participants, have new contacts emerged, has there been positive interaction on social media/your website). Share it with your team, and come up with ways to improve your next event.

With these fairly straightforward planning processes, it’s not difficult to track the contribution to your objectives, and whether participants were engaged and interested. It feels good to be able to plan future activities with more confidence, backed up with your evidence, and it can be a useful way of evaluating novel approaches against more established ways of working.

If you’re interested in reading more, here are a few links to more information.

Banner photo credit: artefacti on Adobe Stock