
Impact of Research masterclass #1: Sarah Coombs

6 May 2021 | Blog, Masterclass

The first meeting in the Impact of Research masterclass series was hosted on 29 April 2021 by Sarah Coombs, a PhD student at Leiden University who works at Saxion University of Applied Sciences. Coombs is conducting a study on measuring the social impact of research by universities of applied sciences, and this, together with her position as Research Support Advisor at Saxion, made her a prime candidate to open the series of masterclasses.

Measuring social impact is complicated. The BKO contains a number of indicators designed to make impact measurable, but these are so ‘soft’ that rigorous evaluation of social impact remains problematic. Coombs’ study aims to develop a new model, a new framework, that facilitates measuring social impact earlier and more effectively: not only after the research process is complete, but also while the research is being conducted.

Of particular importance here is devising metrics to measure social impact. Research at universities of applied sciences is pre-eminently aimed at addressing societal issues. For researchers at these institutions, who often create products as research outputs rather than just manuscripts, it is important to ascertain whether those products are actually used. Coombs emphasises that her model, or framework, is inspired by previous attempts to understand impact, such as Contribution Mapping by Kok and Schuit, the PIPA model by Van Drooge and Spaapen, and the Dublin Research Impact Toolkit. In essence, her framework ties together early planning of impact with stakeholders, agreeing on indicators that can be measured relatively easily and quickly, and gradually monitoring that impact and gathering evidence along the way, so that afterwards the research’s achievements can be communicated more clearly and unambiguously.

The discussion following the presentation illustrated that this does not resolve all questions relating to the social impact of research at universities of applied sciences. For instance, there is still a choice to be made about the breadth at which impact is assessed, ranging from the level of individual projects to that of research lines, or of coherent research lines within Centres of Expertise.

Another angle is the measurability of outputs. Many methods currently in use have relatively simple indicators, but these mainly concern the input side (e.g. how much time was put into a project) and much less the output side. The latter is all the more complicated because not all impact of a research process takes the form of directly measurable outputs; it may also consist of changes in the work processes of organisations or companies, or other process effects that are difficult to put a number on.

Existing ways of defining impact could be used where possible. NWO, for example, has set about making impact measurable with its Theory of Change and the impact pathways it uses. When asked, Coombs indicated that this could be a useful method, although, again, it is easier to apply to typical research university output than to the output of universities of applied sciences.

A related issue is how to aggregate the individual outputs of a collection of projects in a given field into outputs of the overarching research line. Simply stacking the individual components does not always lead to a coherent whole.

Finally, the issue was raised that tracking impact takes time, time that researchers on a project often simply do not have or have not budgeted for. Coombs pointed out that starting to track the expected indicators of impact in good time can prevent major efforts at the end of a project, and thus reduce the total time spent on evaluating impact.

Workshop participants further agreed that the discussion should not be obscured by the various impact labels in circulation, such as ‘continuous effects’ (impact for universities of applied sciences), relevance, valorisation and knowledge utilisation, to mention a few. Although these may differ in their details, they should not divert attention from the bigger issue.

Article by Sarah Coombs: Towards Evaluating the Research Impact made by Universities of Applied Sciences (2021).

Esther Tielen