The role of official statistics in evaluation | Insight project

Insights into the use of official statistics in policy evaluation

Grace Pitkethly, Insight and Evaluation Manager at OSR, writes about how official statistics can strengthen evaluation, an essential tool for the effective functioning of government.

Evaluation is an essential tool to ensure the effective functioning of government. In the words of the Evaluation Task Force: “Government departments are expected to undertake comprehensive, robust and proportionate evaluation of their policy interventions in order to understand how government policies are working and to ensure the best value of public money”.  

In recent years there's been good progress in setting up structures and providing guidance from the top down to help departments conduct good-quality policy evaluations. We fully support this at OSR – our Director General, Ed Humpherson, has written about how good evaluation supports (and is supported by) the Code of Practice for Statistics.  

At OSR, we also want to help from the bottom up – enabling the people conducting evaluations to do so as effectively as possible, using statistics and data that serve the public good. Supporting policy evaluation at different levels – cross-government, departmental or team-level – helps enable efficient, good-quality evaluations. One way we can do this is by supporting the use of official statistics in policy evaluations, focusing on the value of the statistics in meeting society's need for information from evaluations. NAO guidance based on the Magenta Book says that existing data from administrative and monitoring systems, or from large-scale, long-term surveys, should be considered first as data sources. But is that what happens in practice? 

We carried out a quick exploration of how official statistics and their underlying datasets are currently used in evaluations, and of how OSR can support statistics producers to make their statistics more valuable for evaluations. Even though time and resource constraints limited our scope, our conversations surfaced a variety of monitoring and evaluation programmes which draw on official statistics. 

Crucially, we did not identify any evaluations which rely significantly on published official statistics alone. This doesn't mean that examples don't exist – but none were raised in our conversations with five OSR regulators, individuals in eight policy departments involved in carrying out or enabling evaluation, and five other teams across ONS and the Cabinet Office. 

We found that the most common way official statistics are used in evaluation is through the analysis and linkage of the data which underpin them. In some cases, these data are linked to, or analysed alongside, primary data collections designed specifically for the evaluation. This overcomes barriers such as data gaps (where official statistics are not produced for all outcomes of interest) and granularity (where official statistics do not break down to the geography or group of interest). One example is DLUHC's Supporting Families programme evaluation, which linked existing data sources from multiple departments (many of which feed into official statistics) and Local Authority data with additional primary data collection. 
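To make the idea concrete, here is a minimal sketch in Python of this kind of linkage. All dataset names, identifiers and columns are hypothetical, invented for illustration only – real evaluations would work with de-identified records inside a secure environment.

```python
import pandas as pd

# Hypothetical extract of an administrative dataset that underpins
# published official statistics (e.g. programme participation records).
admin = pd.DataFrame({
    "person_id": ["A001", "A002", "A003", "A004"],
    "local_authority": ["E06000001", "E06000002", "E06000001", "E06000003"],
    "employment_outcome": [1, 0, 1, 0],
})

# Hypothetical primary data collection designed for the evaluation,
# filling a gap the official statistics do not cover (e.g. a survey
# measuring wellbeing among programme participants).
survey = pd.DataFrame({
    "person_id": ["A001", "A002", "A004"],
    "wellbeing_score": [7.2, 5.1, 6.4],
})

# Link the two sources on a shared identifier. A left join keeps every
# administrative record and makes gaps in survey coverage visible.
linked = admin.merge(survey, on="person_id", how="left")

# The linked dataset now supports analysis at a granularity that the
# published statistics alone could not provide.
print(linked.groupby("local_authority")["wellbeing_score"].mean())
```

The point of the sketch is simply that the value comes from the join: neither source on its own answers the evaluation question about outcomes for a particular group or geography.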

However, our conversations highlighted other potential barriers to linking data in this way: the inability to access data held by other departments securely; the difficulty of cultivating relationships within and across departments to get buy-in; and data matching issues arising from a lack of harmonisation. These barriers exist not only at departmental level but also for the individuals conducting or involved in evaluations. This shows the importance of combining top-down cross-government evaluation guidance with a bottom-up approach that starts directly with the people producing and using the data, to create the right conditions for successful evaluations. 
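As a small illustration of the harmonisation point, the sketch below shows the kind of cleaning step often needed before two departments' records can be matched. The field formats are hypothetical, chosen only to show the problem.

```python
import pandas as pd

# Two hypothetical departmental extracts recording the same people,
# but with identifiers and dates captured in different formats.
dept_a = pd.DataFrame({
    "nino": ["qq123456c", "QQ 654321 B"],
    "start_date": ["01/03/2021", "15/06/2021"],  # day-first strings
})
dept_b = pd.DataFrame({
    "nino": ["QQ123456C", "QQ654321B"],
    "start_date": ["2021-03-01", "2021-06-15"],  # ISO strings
})

# Harmonise before matching: a shared identifier format and date type.
for df, dayfirst in ((dept_a, True), (dept_b, False)):
    df["nino"] = df["nino"].str.upper().str.replace(" ", "", regex=False)
    df["start_date"] = pd.to_datetime(df["start_date"], dayfirst=dayfirst)

# Without the harmonisation step, this merge would find no matches at all.
matched = dept_a.merge(dept_b, on="nino", suffixes=("_a", "_b"))
print(matched)
```

Where sources are harmonised at the point of collection, this step disappears – which is exactly why harmonisation makes official statistics data more valuable for evaluation.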

Although these are high-level findings, they highlight key questions that OSR can explore to support the use of official statistics in evaluation:  

  • Do statisticians consider key policy questions and the data needs of evaluation when developing statistical outputs? And do OSR regulators support them to do this? 
  • Are the data suitable for linking to other datasets? 
  • Is there effective analytical leadership in place to support finding, accessing and sharing official statistics for evaluation purposes? 

These are just some of the areas arising from this work that we will explore. What's certain is that evaluation is only growing in importance and visibility across government, and OSR can play a role in its success.

Insight: seeing the bigger statistical picture

Insight sounds self-explanatory…

Many of us have a clear idea of what we mean when we refer to insight – for example, having a good understanding of a potentially complicated issue. However, there is no clear, one-size-fits-all approach to gathering organisational insight. What insight looks like for your organisation can vary widely depending on factors like your purpose, size and target audience.

Insight is invaluable for any organisation – identifying trends across your work programme, encouraging big-picture thinking from your team and even helping shape your strategy. An analogy for this could be a dot-to-dot picture (bear with me). You know there will be a clear picture after connecting the dots but can’t quite see what it is yet. But these hypothetical dots aren’t numbered so there are multiple ways to connect them which you must test in order to uncover the final masterpiece. This flexibility is exciting and gives lots of room for innovation but can also seem daunting.

How is OSR approaching insight?

This is where my new role as Insight and Evaluation Manager comes in. When I joined OSR four months ago, I was impressed by the quality and breadth of our work programme for such a small team. I saw immediately that team members were encouraged to get involved in work outside their own area where possible. There was a clear focus on highlighting cross-cutting themes and building insight across the UK statistical system – through the State of the Statistical System report, our work on data gaps, and our promotion of greater coherence and transparency in the communication of statistics. By bringing these outputs together we can build something bigger and interconnected – how we do this makes up a large part of my role.

Before I joined OSR, as part of our insight development, we had conversations with other organisations to learn how they define and gather insight. Thank you to those who took the time to share their experiences. We are now using what we learned through that work to shape our insight programme around the following themes:

Information

Our biggest aim for this year is to develop a more sophisticated use of evidence. We will identify what data we need to gather at different project stages and capture this in a way that can be easily restructured, analysed and visualised to show key themes.

Audience

Once we have strengthened our internal capability, we will be in a stronger position to build and share insights quickly across the statistics system. The greatest value lies in this next step – providing the big picture to help producers learn from each other. We must communicate effectively with our audience through a variety of channels, share good practice and areas of concern, and listen to their needs.

Embedding

Both themes above depend on an embedded approach to insight across OSR, both in our mentality and processes. Because we are a small team, it’s down to everyone to play their part. If we all see insight as inherent in our work, not a separate afterthought, it is much easier to build a self-sustaining system to generate insight. This is the ultimate goal!

 

We may be small, but we can provide powerful insights with impact across the whole UK statistical system and beyond. I'm excited about the year ahead and will see you in the summer for the State of the Statistical System report 2021/22.


If you would like to speak further, feel free to contact Grace Pitkethly.