Evaluation in government: weaknesses in the learning cycle

We’ve all heard, too many times, that ‘you can’t manage what you don’t measure’. The need for learning in policy – understanding what impact different interventions have in different contexts – has become a hot topic, and it is one element of the evidence-based policy movement.

However, one of the most recent reports from the National Audit Office (NAO) gives cause for concern. The report, entitled Evaluation in Government, was released before Christmas last year and so may not have received the attention it deserves. It sets out the quantity and quality of evaluation work carried out across central government, and the numbers do not look good. For example, according to the report there are plans to evaluate only £90 billion of the £156 billion of planned spending on major projects. Of the 261 impact assessments reviewed, only 40 made any explicit reference to evaluation evidence, showing the weakness of the learning cycle across government.

Beyond those headline numbers there is a more subtle, but possibly more worrying, trend: the lower-quality studies appear to make the stronger claims. Using the Maryland Scale, 34 evaluations across four policy areas were assessed for the quality of their evidence base and the strength of the claims made within each evaluation. The figure below summarises the findings (figure 11 from the NAO report).

[Figure 11 from the NAO report: strength of claims plotted against quality of the evidence base for the 34 evaluations assessed]

Granted, 34 evaluations is still a relatively small sample, but the top-left grouping in the figure is worrying: 15 of the 34 evaluation reports make very strong claims with no or very weak caveats. Policy interventions are being assessed as highly effective on very weak foundations, so policies that lack a strong evidence base are potentially being reviewed favourably and becoming part of the policy landscape.

Reading the report reinforces the need for courses such as our MPP and others across the UK, which are working hard to increase the quality and use of evidence throughout the policy process. Within the Cambridge MPP we are very grateful to RAND Europe for working with us to deliver an intensive workshop on evaluation methods. Having expert practitioners involved in these practical elements of the course is very powerful and ensures students are given the most up-to-date perspectives on tools and techniques that will be of value to them as they progress in their careers.

However, this is a problem that needs to be brought into the light so that we can have more effective evaluation, used more often, and a step change in the learning cycle for policy in the UK. Policy cannot be left to whoever has the best narrative or media strategy. We need to generate clearer evidence of what we are doing, even if that is incredibly difficult and currently undervalued.
