Trust in Statistics: Launching the Refreshed Code of Practice

OSR asked the CEO of the Royal Statistical Society (RSS), Dr Sarah Cumbers, to reflect on the refreshed Code of Practice in a guest blog. The RSS works closely with OSR to support its regulatory work, including partnering with us on the annual TQV (Trustworthiness, Quality and Value) award for voluntary application, which celebrates organisations that visibly adopt these principles in their daily practice.

The Code is the essential foundation of our statistical system: by clearly communicating what matters when working with data, and by driving up standards, it underpins the public trust on which society relies.

The RSS has been closely involved in the Code’s review process. We were pleased to host two dedicated roundtables with the OSR, one in November 2023 and another in late 2024, to give our members an opportunity to share their views and engage with OSR on how the Code could evolve.

It’s genuinely encouraging to see how many of the issues raised in those sessions are reflected in the revised Code, including the need for stronger user engagement. This direct line from member input to policy change shows the strength and influence of the statistical community when we speak together. We also submitted a formal response to the consultation, underlining our call for users to be placed at the heart of decision-making.

The Code is much more than a set of standards on paper; it’s a guide to best practice that supports better decision-making across the UK.

Independence is a central theme. In a period when the relationship between statisticians and politicians is under scrutiny globally, the Code’s emphasis on protecting impartial professional judgement is crucial. The integrity of official statistics depends on statisticians being able to work freely and professionally, and the Code provides a vital safeguard for us in the UK.

There’s always more work to do to ensure statistics fully serve the public good, and the RSS is particularly keen to see further progress on user engagement. The revised Code’s call to put users at the centre of the system provides a valuable catalyst for this, and we are looking forward to working with OSR, ONS and the wider government statistical service to discuss and agree exactly what that should mean in practice across the statistical system.

The Code matters because, above all, trust in statistics matters. The refreshed Code is a great step forward, and the RSS looks forward to supporting its use and continuing this conversation with both the OSR and our members.



Quality Data, Shared Purpose: World Statistics Day 2025 and the Refreshed Code of Practice

In our latest guest blog, Rochelle Tractenberg explores how ethical statistics and the refreshed Code of Practice can help build public trust this World Statistics Day…

Every five years, World Statistics Day celebrates statistics’ and data science’s global contributions to evidence-informed decisions, democratic accountability, human dignity and flourishing, and sustainable development.

The theme of World Statistics Day 2025 is “quality statistics and data for everyone”, and the Day coincides with the Office for Statistics Regulation (OSR)’s refreshed Code of Practice for Statistics. The timing and orientation of the refreshed Code both highlight its place in promoting ethical statistical practice to help achieve this goal for the people of the UK.

While high-quality and widely available statistics and data certainly exist and are to be celebrated, these don’t just happen; they require diligence, care and competence at all levels by professionals in statistics and data science. World Statistics Day 2025 presents an opportunity to consider how we can increase the visibility of this commitment and work, and their accessibility and utility for everyone.

An under-appreciated challenge to public trust in official statistics and the statistical profession is “drift” – changes in the properties of data over time. Drift can occur in the data source, in its associated meaning, and in its ability to accurately represent a concept. Drift can mean that data become less reliable or less consistent over time, affecting their value and trustworthiness. As such, regularly reviewing statistics to check whether they still meet relevant standards – and, critically, identifying when they do not – is crucial to promoting, and in some cases renewing, trust.
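To make the idea concrete, here is a minimal sketch of one way drift can be surfaced in practice: comparing the distribution of the same measured variable across two collection periods. The data and threshold below are invented for illustration, and are not drawn from the Code or any official source.

```python
# Illustrative sketch only: flagging possible distributional drift between
# a baseline period and a later period of the "same" data series.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Hypothetical data: one variable measured in two collection periods.
baseline = rng.normal(loc=50.0, scale=5.0, size=2_000)  # earlier period
current = rng.normal(loc=53.0, scale=7.0, size=2_000)   # later period (drifted)

# Two-sample Kolmogorov-Smirnov test: could both samples plausibly come
# from the same distribution?
statistic, p_value = ks_2samp(baseline, current)

if p_value < 0.01:  # illustrative threshold, not an official standard
    print(f"Possible drift: KS statistic={statistic:.3f}, p={p_value:.2g}")
else:
    print("No strong evidence of drift between periods")
```

A check like this says nothing about why the data changed – a new source, a changed definition, or genuine change in the world – which is why the regular human review described above remains essential.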

I was excited to hear about the refreshed Code of Practice for Statistics – the guiding framework for all official statistics in the UK – which will be live on the Code website from 3 November 2025.

OSR has updated the Code to include broader support for anyone working with or communicating statistics, and to reflect technological advances. The Code’s core principles – Trustworthiness, Quality and Value – have not changed, but its guidelines have been made clearer and more relevant. All official statistics in the UK must meet the requirements of the Code to ensure that they serve the public good. Those that do so are granted accredited official statistics status, indicating that the statistics, and the underlying data, are of high quality. The Code is also useful for those working with data and statistics who want to voluntarily apply it as a practical framework to increase public confidence in statistical work. It can help anyone build public trust in statistics and data science.

Engagement with the Code is worthwhile at the start of, and regularly throughout, data collection and analysis, for anyone who wishes to more actively and transparently step onto the path towards ethical statistical practice. To see my full reflections on building trust in statistics and data science through ethical practices, and how the Code can contribute, please see my recent article. For additional perspectives on the importance of World Statistics Day, please see the International Statistics Institute’s statement.


Rochelle E. Tractenberg is a tenured professor at Georgetown University (Washington, DC). Her applied ethics work focuses on strengthening trustworthiness in statistics and data science across research and policy settings. A biostatistician since 1997, she serves on the UK National Statistician’s Data Ethics Advisory Committee (NSDEC), the ISI Advisory Board on Ethics, and the Association for Computing Machinery Committee on Professional Ethics. She has written two books on ethical practice, and has contributed to standards for statistics, data science and mathematics – as well as the forthcoming UN guide, Ethics in Official Statistics: Foundations and practices to foster public trust.



Why migration statistics still matter

In our latest blog Statistics Regulator Gillian Fairbairn discusses the complexities of tracking migration and the importance of migration statistics…

What do Canada geese, wildebeest and humpback whales all have in common?

They migrate. Every year, Africa’s wildebeest herds travel across the continent in search of greener pastures, humpback whales move from one end of the planet to the other between feeding and breeding grounds, and Canada geese fly south to warmer climates for the winter months. For these animals, migration reflects a move towards resources and away from threats.

We also see migration of people. Whilst most people will live close to where they were raised, some will move across the world, seeking out employment, education or personal connections, or simply exploring the world we live in. Others move away from difficult circumstances, political instability, war or a lack of resources.

All of this moving around doesn’t just impact the individual migrants. It also affects the areas being migrated to and from. Increases in population through immigration may increase demand for local services, for example, healthcare or education, as well as increasing local economic activity. Conversely, decreases in the population through emigration may reduce the demand and staff available to local services or have economic impacts, such as changing the local skills mix. As a result, official statistics on migration attract a lot of attention and are often covered by the media and discussed by government officials. As such, the regulation of these statistics is a priority at the Office for Statistics Regulation (OSR).

Regulating migration statistics

Back in 2022, OSR published its review of migration statistics produced by the Office for National Statistics (ONS). At the time, ONS, the producer of long-term international migration estimates, was in the early stages of its plans to change the data sources for these statistics. Whilst ONS had plans to move towards estimating migration using administrative data, the COVID-19 pandemic brought a temporary stop to the International Passenger Survey (IPS), which was the main data source for migration estimates at the time. This added urgency to the move to a different data source to meet a pressing user need for high-quality migration estimates. Our report highlighted ONS’s credible and robust approach to developing migration estimates, but it also noted that ONS should improve how it engages with users and ensure clear and coherent communication of its plans for migration statistics. Since we published our report, ONS has made significant progress towards meeting our recommendations, particularly with its user engagement. However, it still faces challenges in the production of migration estimates, particularly around revisions and communicating uncertainty. In December 2023, we published a follow-up to our earlier review, which recognised ONS’s progress and closed most of the recommendations that we set out in our initial review.

ONS revises its estimates after publication to reflect both updated assumptions, as actual travel data become available, and additional methodological developments. Some of these revisions have been significant and, in line with our expectations, ONS is working to understand and explain them to users. ONS is also working to address concerns with measuring the migration of British nationals, which currently relies on the IPS. The IPS arrivals survey was terminated last year, so ONS is exploring alternative data sources for measuring these migrants.

Casework on migration statistics

Alongside ONS’s long-term international migration estimates, the Home Office produces a wide range of statistics describing the operation of the UK’s immigration system, including statistics on people arriving in the UK through irregular migration, asylum and visa applications. We support and engage with the Home Office regularly, particularly through our casework function, where we respond to queries or concerns about the production or use of statistics raised by anyone, including organisations and members of the public.

A theme in our casework relating to Home Office statistics has been transparency. For example, figures were sometimes being used in the public domain by government officials ahead of publication. This falls short of the expectations set out in the Code of Practice for Statistics, under which statistics should be released in an open and transparent manner that promotes public confidence. Following a constructive letter exchange between our Director General and the Permanent Secretary at the Home Office, we have seen improvements to practice, with statistics being communicated in a clear and transparent way and interim official statistics publications used where appropriate.

Migration is an important topic at the centre of much public debate, and it is important that statistics are used to get under the surface of complex policy questions. But sometimes we see instances where Home Office statistics are used incorrectly or unclearly in public debate, which can lead to misunderstanding or misrepresentation of the data. In addition to regulating the production of statistics, we use our voice to stand up for the appropriate use of statistics. So, for example, when concerns were raised with us that ministers had used incorrect asylum backlog statistics, we investigated and publicly responded to stand up for the correct use of statistics in the public domain.

Final thoughts

Combined, ONS and the Home Office are providing users with a wealth of information on who is moving in and out of the UK and why. Whilst OSR recognises the good work carried out by these official statistics producers, it is OSR’s role to ensure that these official statistics are of sufficient trustworthiness, quality and value and that they are used appropriately to inform public debate. In addition to our publications, our statistics regulators engage regularly with statistics producers to provide ongoing support and guidance, often before any issues arise. This direct engagement enables us to have a positive impact on official statistics.

Tracking the migration of any species has its challenges. Imagine trying to attach a tracking device to a humpback whale that covers thousands of kilometres of remote and inaccessible habitat. Perhaps the expectation is that measuring human migration should be more straightforward. However, complex visa systems, entry routes and unexpected political events can create significant challenges that might encourage even the most knowledgeable migration statisticians to go fishing instead!

Advancing data linkage: key takeaways from the Administrative Data Research UK conference

Helen Miller-Bakewell, OSR’s Head of Development and Impact, reflects on the Administrative Data Research UK (ADR UK) 2025 Conference, and the takeaway messages for her and the Office for Statistics Regulation (OSR).

Recently I attended the ADR UK conference in Wales, From records to research: Harnessing administrative data to enhance lives.

It was an impressive event, with over 250 presentations across four themes, which spanned insights from research using linked administrative data; technical and social aspects of data access and linkage; and ethics and governance.

It was hugely stimulating, and, as I tried to order my thoughts on the train home, four key takeaways emerged for me:

1. Linked data are a powerful tool to help evaluate impact and inform positive action

There were so many presentations about research findings from the analysis of linked administrative data that have obvious relevance to current and future government policies. I personally heard presentations on topics as varied as the impact of Scout and Guide attendance on cognitive ability trajectories in childhood and adult health; patterns and predictors in A-level science choices in Wales; and the effects of Daylight Saving Time clock changes on health outcomes in England. And there were many more, some related to health and education, others to crime, economics and the environment.

It was exciting to see the step being made from ‘interesting analysis’ to ‘analysis with tangible impact’. Practical demonstration of the impact that data can have will doubtless remain important to secure future buy-in (and funding!) for the continued use of linked administrative data. As others noted at the conference, the current cross-cutting Government Missions offer researchers a clear opportunity to demonstrate this practical impact.

2. Data are not just numbers

Across the conference, there was widespread recognition that data are not just numbers. Rather, the numbers in data records represent individual people, sometimes at challenging points in their lives. It was encouraging and inspiring to hear about the variety of ways in which those working in the administrative data space are engaging with members of the public and involving them in decisions – at all stages of the research process, and in discussions around data access and governance.

As we become ever more ambitious about what is possible with data, the words of the Rt Hon Mark Drakeford MS, who gave an excellent keynote talk, hit home: “Taking the public with us is not something we can take for granted”. Public engagement on topics relating to data and statistics will become increasingly important.

3. We’ve come so far – but there’s still a long way to go

Those presenting at the conference recognised and celebrated the huge amount that’s been achieved so far in terms of enabling secure administrative data research. But several talks and workshops also highlighted new challenges ahead – such as being able to link data held in different secure data environments for single research projects; using secure data to train AI models; and the extraction and onward use of these trained models from secure research environments.

These ambitions raise new ethical and technical challenges – in some cases highlighting existing challenges that still need to be solved, such as reducing the time it takes to agree even ‘simple’ data access requests and conduct output reviews.

4. We still need to work on how to link data

Among the presentations I went to, few talked about the challenges associated with actually linking data. In Q&A sessions, however, these challenges – and the limitations they place on the scope of research and research findings – did sometimes come out.

Performing high-quality data linkage can still be very challenging. So, it’s important to keep going with things like the agreement and adoption of data standards and metadata, as well as the development of linkage methodologies. I heard an interesting presentation from the Office for National Statistics on its Reference Data Management Framework, which highlighted for me how much work is still going on to try to make linkage easier and better.
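As a toy illustration of why this is hard – with invented records, not any producer’s actual data or methods – the sketch below shows how exact joins miss true matches when fields are recorded inconsistently, which is exactly the gap that data standards and probabilistic linkage methodologies aim to close.

```python
# Toy illustration of a core record-linkage difficulty: without a shared
# identifier, exact joins miss true matches when fields differ slightly.
import unicodedata

records_a = [
    {"name": "Siobhán Murphy", "dob": "1985-03-12"},
    {"name": "John Smith", "dob": "1990-07-01"},
]
records_b = [
    {"name": "Siobhan Murphy", "dob": "1985-03-12"},  # accent dropped
    {"name": "Jon Smith", "dob": "1990-07-01"},       # first name misspelt
]

def normalise(name: str) -> str:
    """Crude normalisation: lowercase and strip accents."""
    decomposed = unicodedata.normalize("NFKD", name.lower())
    return "".join(c for c in decomposed if not unicodedata.combining(c))

def link(key):
    """Join the two files on a name key plus exact date of birth."""
    return [(a, b) for a in records_a for b in records_b
            if key(a["name"]) == key(b["name"]) and a["dob"] == b["dob"]]

# Exact join on raw fields finds nothing, despite two true matches.
print(f"Exact matches: {len(link(lambda n: n))}")     # 0

# Normalisation recovers one match; the misspelling still defeats it,
# which is where fuzzy and probabilistic methods come in.
print(f"Normalised matches: {len(link(normalise))}")  # 1
```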


Mark Drakeford delivers an opening address at the ADR UK conference 2025

What does this mean for OSR?

I left the conference interested and enthused, wondering how OSR can best provide support in all these areas. Two key things came to mind:

  1. The refreshed Code of Practice for Statistics, due for release later this autumn, will include several practices that relate directly to data sharing and linkage. Together these practices aim to make official statistics based on linked data more common, and to make data used in official statistics more suitable and more available for linking by others as well. They touch on process, technical and social factors. Including these practices in the Code will give OSR a firm basis to catalyse data sharing and linkage across the UK statistical system.
  2. We will continue to work with existing partners, such as ADR UK and PEDRI, which unites different UK organisations working with data and statistics, to improve how we all work with the public. And we’ll remain open to new partnerships in this space – especially where these have the potential to help make government data more widely available to accredited researchers beyond government.

Alongside these activities, we’ll continue to use our platform to champion greater data linkage, in a secure way, in research and statistics. Because, ultimately, this is an increasingly evidenced way to help data live up to their potential to serve the public good.

We want to hear from you

OSR is always delighted to hear about and champion work that demonstrates or enables effective data sharing, access and/or linkage. If you have a case study or would like to discuss our work in this area, please get in touch: regulation@statistics.gov.uk.



Efficiency in Action: Improving Productivity in Public Services

In our latest blog, Head of OSR Ed Humpherson discusses measuring public services productivity in a more sophisticated way.

Efficiency reviews are a recurring feature of public administration in the UK. HM Treasury runs spending review processes that typically require departments to consider how they might become more efficient. Governments have also periodically commissioned reviews from senior business people, tasking them with finding efficiency savings. Examples include the 2004 Gershon review and the 2010 Green review.  

These reviews usually consider efficiency in a rounded way – recognising that efficiency can emerge either as a reduction in costs to deliver the same level of output, or as the delivery of a greater level of output for a given cost.

But the reviews are often remembered for their focus purely on cost cutting. This is partly because of brutal fiscal arithmetic: when money is scarce, the Treasury needs to impose cost reductions. It is also because the headlines that emerge from the reviews focus on the opportunities to cut costs. When then-Prime Minister David Cameron welcomed the Green review in 2010, he said the report ‘goes into how much money we have wasted…over the last decade’. Similarly, the Gershon review adopted a very direct focus on cost cutting through its headline: “Releasing Resources to the front line”.

Perhaps cost cutting makes for a simpler message than the more complex “efficiency can be more outputs for the same cost” story. 

But in the long run, focusing on cost provides little insight into whether public services are becoming more or less productive. On this, the macro picture appears to be disappointing, with evidence pointing to slow growth in public sector productivity. For example, this piece by EY highlights the difference between private and public sector productivity. 

To judge productivity properly, we need measures which relate the resources used to what’s delivered – or, to put it in economic terms, measures which relate inputs to outputs.

That’s what some new work by the Office for National Statistics (ONS) focuses on – how to measure public services productivity in a more sophisticated way. 

The traditional way to count the contribution of public sector output in many of the world’s economies is an “inputs equal outputs” approach – that is, what’s produced is valued at the cost of production. If society puts £30 billion into education, then society gets £30 billion of value back. The trouble with this approach is that it fails to account for any increases or decreases in public sector productivity – whether the resources being put in are being put to better or worse use – and this is a topic of clear public interest: are public servants delivering increasing value over time?

For this reason, since the 2000s, the ONS has sought to measure the public sector’s contribution by better valuing the outputs produced, both in terms of the number of outputs delivered and how their quality has changed over time: so, for example, if the NHS delivers more procedures over time, has the average patient outcome also improved as a result of these procedures?
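As a stylised illustration of the difference between the two conventions – with invented numbers, not ONS methodology or real data – the sketch below shows how an “inputs equal outputs” measure is flat by construction, while a quality-adjusted output measure can actually move.

```python
# Stylised illustration with invented numbers - not ONS methodology.
# Contrast "inputs equal outputs" with a quality-adjusted output measure.

inputs_gbp_bn = {"year 1": 30.0, "year 2": 31.5}         # resources used
procedures = {"year 1": 1_000_000, "year 2": 1_060_000}  # activity count
quality_index = {"year 1": 1.00, "year 2": 1.02}         # outcome quality

for year in ("year 1", "year 2"):
    # Convention 1: output is valued at the cost of production, so measured
    # productivity (output / input) is 1 by construction and never changes.
    naive = inputs_gbp_bn[year] / inputs_gbp_bn[year]

    # Convention 2: output is activity scaled by a quality index, so
    # productivity can rise or fall as delivery and outcomes change.
    quality_adjusted_output = procedures[year] * quality_index[year]
    productivity = quality_adjusted_output / inputs_gbp_bn[year]

    print(f"{year}: naive productivity = {naive:.2f}, "
          f"quality-adjusted output per GBP bn = {productivity:,.0f}")
```

On these invented figures, quality-adjusted productivity rises by around 3 per cent between the two years – a movement the “inputs equal outputs” convention would never show.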

This quality dimension is really important: looking at education, ministers rarely give interviews about the number of students sitting exams, but they regularly focus on whether those students earn better results. Delivering public services is rarely simply about the number of outputs: quality dominates reporting and political debate.  

This approach sets the ONS apart from many other national statistical institutes, and the direct measurement of public sector activity is one factor in why the UK’s GDP levels fell further than other countries’ during the COVID-19 pandemic – see the OECD/ONS report, International comparisons of the measurement of non-market output during the COVID-19 pandemic.

For many countries, at least in the quarterly estimates, the value of output is equal to the input; if teachers are being paid, then that is the value of output. But for the UK, the ONS was more realistic: if schools were closed with some provision being made online, then teachers were generally producing less education, even though they were likely still being paid their usual salary. Overall, COVID-19 restrictions meant education output fell, and the ONS approach reflected this. 

The ONS has now built on this approach through its public sector productivity review. This review extends the approach of valuing outputs from health and education to other sectors of government activity, including welfare administration, tax collection and public safety.  

The ONS figures show a fascinating picture – one where, for example, NHS productivity has fallen since the pandemic in the light of more-challenging workloads, but welfare administration has improved its productivity, reflecting the introduction of universal credit as a single benefit to replace multiple benefits. This is a classic productivity-improving policy when, broadly, the same output (the value of benefits paid) requires fewer inputs to administer.  

Measuring quality improvements for public sector outputs is inherently tricky. It is hard enough for statisticians to measure quality improvements for products that are bought and sold in markets, like mobile phones and computers, and there is a long tradition of methodological challenges in doing so. These challenges are all the greater for public sector outputs where, typically, there are no traded markets, and where prices cannot reveal the way consumers value the product. 

Putting that inherent difficulty to one side, though, for me, there are three benefits to the ONS approach: 

  • First, it is good to see ONS being a world leader in methodological developments.  
  • Second, the ONS productivity figures tell a more plausible story than a simple “inputs equal outputs” approach. The decline in NHS productivity seems to confirm what many other indicators tell us – a point made very well by the Institute for Government’s Gemma Tetlow in a recent appearance at the Treasury Select Committee (Q34 in this transcript).  
  • Third, and most importantly, this ONS work could pave the way for a more rounded approach to public sector productivity which gets away from the crude arithmetic of focusing on cutting expenditure. 

Let me unpack this third point further. What might a “more rounded approach” involve? The ONS figures can provide a solid base of measurement that enables government, Parliament and others, such as the National Audit Office, to track progress through time. In allowing a through-time focus, it might be possible to avoid the periodic upheaval of major reviews which have punctuated the recent past of public administration.  

And this approach might also, by its consistent nature, move away from regarding transformation as a special project, something separate from the day-to-day delivery. It might instead lead to a view of increasing efficiency and changing services as part of an ongoing and iterative process of improvement. This echoes recent evidence I presented at the Public Administration and Constitutional Affairs Select Committee hearings on the UK Statistics Authority. I was talking about the approach to transforming statistical production in the ONS and the risk that core activities can get neglected within a high-profile drive for transformative change. I explained the risk as follows: 

“Transformation is a bit special and gets all the attention; the core is seen as less exciting. … The strong message is, ‘Do not have separate, shiny projects and programmes off to one side; align continuous improvement to the core and do it iteratively.’” (Q153 in this transcript)

This diagnosis of the challenges the ONS has faced may also provide insight at the macro level of government as a whole. Perhaps the best approaches to improvement and efficiency are iterative and embedded within ongoing delivery.

From Trend to Tool: Elevating Dashboards with Trustworthiness, Quality and Value

The world of statistics is not immune to the rise and fall of fashionable trends. Every few years a shiny new toy emerges that will solve all of our problems…or so we think.  

One example of such a trend is dashboards. Brought to public fame during the COVID-19 pandemic, the dashboard aims to provide information in a clear and easy-to-use format. The success of the UKHSA COVID-19 dashboard in conveying data to the public during the pandemic demonstrated how effective this method can be for data dissemination.

Dashboards also have other benefits, such as acting as good early warning indicators that show data changing as part of a wider picture. I like to imagine this much like the control panel in the cockpit of a plane where each button and light serves a specific function, allowing the pilots to address issues and ensure a safe and efficient flight. 

However, it is important to remember that dashboards also have some drawbacks. For example, the format often makes it harder to communicate uncertainty and can result in complex data being oversimplified. 

Whilst the initial hype around dashboards has diminished somewhat, especially now that interest has surged in AI, it is clear that they will continue to be used across the statistical system. Over the past few years OSR has provided advice to official statistics producers on dashboards. We have now developed this further and produced our own dashboard guidance, which aims to ensure that dashboards are produced to the highest possible standard.

How to get the most out of dashboards 

A dashboard can be a great addition to a statistical release alongside a bulletin or article, in order to give users bespoke access to data and to help them visualise trends.   

User needs should be put at the forefront of any decision about whether a dashboard is needed and for how long it will be updated and maintained. Our ‘dashboard questions’ target the most prominent issues and challenges with dashboards and are intended to help statistics producers to consider the value of a dashboard for any statistical output. 

It is important that statisticians and analysts work closely with dashboard developers and data visualisation teams to ensure that the final product is produced to a high standard.  

Visual space can be limited within dashboards, and so there is a balance to consider between the level of information that should be included upfront and not overwhelming the user. To help with this, we have developed a three-layered communication framework that we encourage producers to adopt which is outlined in our guidance. 

Dashboards are also often used to bring together multiple sources of data into one place. A dashboard can be a useful communication tool where there is a clear user need for a dashboard as a product. Understanding how the statistics are used, and by whom, is therefore important. 

The importance of TQV 

Many of you will be familiar with the principles of the Code of Practice for Statistics: Trustworthiness, Quality and Value (TQV). We consider that these principles are universal and can be applied to almost any circumstance, including dashboards.   

Dashboards should be trustworthy (T) and present high-quality statistics (Q) that are valuable to users (V). Our new guidance, in line with TQV, is intended to support statistics producers in their development and use of dashboards. It doesn’t have all of the answers, nor cover all scenarios that producers may encounter when a dashboard is requested – but it does set out key principles that producers should adopt when making a public-facing official statistics dashboard.  

We want to encourage producers to think about the Code and its pillars of Trustworthiness, Quality and Value when they weigh up the advantages, disadvantages and risks of using a dashboard to disseminate data. This, combined with data visualisation guidance from the Government Analysis Function, should give better clarity on best practice.


We would like to thank everyone who provided views and feedback that helped to shape this work. We remain open to feedback on this guidance, so if you would like to discuss it, please get in touch via regulation@statistics.gov.uk.



OSR’s emerging strategic themes for 2026-29

This blog sets out how our discussions with stakeholders, the recent focus on the performance of the Office for National Statistics (ONS) and the resulting evidence sessions at the Public Administration and Constitutional Affairs Select Committee (PACAC) have led us to reflect on our regulatory approach.

The core of our work at OSR is assessing whether individual statistical outputs comply with the Code of Practice for Statistics. ONS, which produces some of the most important statistics in the UK, has been experiencing a period of significant challenge. A key element of our role, as the statistics regulator, is providing support and challenge to ONS as it works to recover its economic statistics.

However, ONS is just one producer of the official statistics that we assess. We are responsible for providing assurance across the whole UK statistical system, which includes around 800 accredited official statistics. Only 15 per cent of these are produced by ONS. Our State of the Statistical System report, which we published in July, highlighted our conclusion that the system as a whole remains robust.

We have had conversations with a wide range of stakeholders about our role and our approach as we develop our new strategy, and PACAC has added an important and highly influential voice to our thinking. We have also discussed these emerging ideas with the UK Statistics Authority Board’s Regulation Committee, which provides strategic guidance and oversight to OSR.

The key elements of our proposed strategy, which will run from 2026 to 2029, are that:

  • we should continue to act as a rigorous regulator
  • we should take a systemic perspective and identify opportunities for system-wide improvement
  • we need to enhance our work to support the integrity of evidence in public debate

The Code of Practice at the heart of our work

The core of our regulatory model is setting clear standards for official statistics through the Code of Practice for Statistics, and forming judgements as to whether individual sets of statistics comply with the Code. We are currently updating the Code, with plans to publish a new version later in 2025. The new version will set clearer expectations on what statistics producers must do; will contain a better-defined statement of the framework of TQV (Trustworthiness, Quality and Value); and, for the first time, will include standards for the public use of statistics, data and wider analysis (what we call “intelligent transparency”).

We will use this new Code to underpin our judgements on whether statistics merit the status of accredited official statistics (AOS). These judgements are then reviewed and approved by the Regulation Committee, which provides crucial input into regulatory decision-making.

We take a balanced stance in forming our judgements. We don’t see our core purpose as being to criticise or undermine producers; in fact, we often highlight positive developments in statistics. Doing so can be just as effective as exposing weaknesses – because the endorsement empowers teams to continue to improve and innovate, and because it provides an evidence base of good practice from which others can draw.

We have assessed a wide range of statistics for designation, including, for example, ONS’s statistics from the Crime Survey for England and Wales, the Census statistics (in separate reports covering Northern Ireland, Scotland, and England and Wales) and, most recently, Northern Ireland’s public transport statistics.

But equally, we are willing to identify when statistics do not meet the standards of the Code of Practice. We have done this for several key statistics, for example migration statistics (2019), employment statistics (2023), the Wealth and Assets Survey (2025) and, most recently, National Records of Scotland’s healthy life expectancy statistics.

Rigorous regulator

So what lessons are there for us to learn from stakeholder feedback, including the PACAC sessions?

Our first reflection is that we have made sound and appropriate decisions. These include withdrawing the status of accredited official statistics where appropriate.

As a result, we want to continue to act as a rigorous regulator, with the Code of Practice as our guiding light. By “rigorous”, we do not mean being harsh or critical as a performative stance. We mean that we form judgements through a clear and thorough consideration of the evidence, and that we will articulate our judgement on compliance with the Code in a clear and accessible way. The importance of clarity is something the Regulation Committee has emphasised to us in its oversight of OSR. So being a rigorous regulator does not mean that we will retreat from highlighting effective work. This recognition remains an important lever for us.

But we need to do more to make sure that the rigour of our judgements is clear in our publications. It also means that the requirements we set in our reports should be specific and actionable. We have already made significant changes here. After we commissioned a review of our work from Professor Patrick Sturgis, we tightened the specificity of our requirements and our follow-up. For example, in our systemic review of ONS’s economic statistics published in April 2025, we required immediate action from ONS: the publication of a survey recovery plan and an overarching plan for economic statistics. ONS published both on 26 June, alongside the Devereux review. And our report Admin-based Population Estimates for England and Wales, published in July 2024, also required an action plan from ONS within three months, which ONS duly published in October 2024.

This all points to a need for OSR to continue to be a rigorous regulator, by:

  • putting our judgements at the heart of our publications, press statements and other communications
  • setting clear requirements and showing clearly how we follow up on them
  • making it easier to see a running list of our outstanding recommendations

System catalyst

We don’t just look at individual sets of statistics. We look at the whole system of statistics, produced by people based in UK Government departments and agencies and in Scotland, Northern Ireland and Wales.

Our coverage of the system is reflected in our regulatory work and our State of the Statistical System (SoSS) reports. The reports in 2021 and 2022 recognised – and in the 2022 report, celebrated – the statistical system’s response to the COVID-19 pandemic. By 2023 we were sounding a more concerned note – highlighting that resources were increasingly under pressure; that transformations of statistics should not come at the expense of quality; and the growing risk from declining survey response rates. By July 2024, our SoSS report was describing a system under strain, pointing out that the decline in response rates to household surveys was becoming critical and that there was a need to consider whether economic statistics were meeting user needs.

Our 2025 report recognises the quality challenges facing ONS, alongside other producers encountering challenges with survey responses. But this report also emphasises that the system as a whole remains robust.

The 2025 report also describes opportunities to enhance the production, communication and ultimately the value of official statistics, through a continued focus on improvement. This improvement focus includes exploring data linkage and the potential use of AI in official statistics.

In terms of our systemic impact, one of our most notable achievements has been to embed the recognition of TQV across the UK Government analytical system. This framework encourages statisticians and analysts to think not just about the data they are collecting, but also about giving assurance on their approach to producing them, and about how users will want to use the statistics.

These concepts are not just a luxury. They are the underpinnings of public confidence in statistics, and they can offer a compass for statistics producers as they adapt their statistical estate to new demands.

We therefore want to continue to support the statistical system in the UK, both by highlighting key risks and identifying opportunities to adapt, innovate and improve.

Champion of evidence integrity

It is often remarked that we live in an age of misinformation and data abundance, in which public confidence in official sources of evidence may have eroded significantly.

OSR plays a role in addressing this environment of declining trust. The Code of Practice is itself a bulwark against the challenge of poor information. The Code’s new standards for the use of statistics set a new bar for how government makes information available publicly (formally called the Standards for the Public Use of Statistics, Data and Wider Analysis). We also have a casework function. Anyone can write to us voicing their concerns about statistics being potentially misrepresented or misused, and we will seek to clarify the appropriate interpretation of the statistics. And alongside our regulatory work, we conduct research to better our understanding of how statistics serve the public good.

But we are not the only organisation that addresses these risks, and official information is only part of the overall information ecosystem. So, we need to work with a range of other organisations to help support the integrity of evidence, such as the Royal Statistical Society, Admin Data Research UK and Sense about Science. And through our voluntary application approach, we support organisations who apply the principles of TQV beyond official statistics – contributing to evidence integrity more broadly.

Looking ahead

We see three strategic imperatives for OSR arising out of the current circumstances:

  • demonstrating that we are a rigorous regulator in our judgements and our communication of those judgements
  • being a catalyst for systemic improvement across the statistics system
  • continuing to support the integrity of evidence in public debate (through our work on intelligent transparency and our casework)

We will start to refine our strategy based on these three drivers, and propose a fully worked-up strategy to the Regulation Committee in autumn 2025.

As we flesh this strategy out, we would very much welcome feedback. Do these sound like the right themes? Have we identified the right lessons? What might we be missing? If you have any thoughts, please do get in touch with us via email: regulation@statistics.gov.uk

Good statistics are never done: why producers should never stop improving their statistics

In our latest blog, OSR Assessment Programme Lead, Siobhan, looks at the importance of continuous improvement of statistics.

Here at the Office for Statistics Regulation (OSR), we independently assess whether official statistics comply with the standards of Trustworthiness, Quality and Value in the Code of Practice for Statistics. If they do, they will be labelled accredited official statistics.

But I often get asked: I’m planning on improving my statistics. What does this mean for my accredited official statistics status – is it OK to do this? Can I keep the accreditation?

Firstly, keep improving your statistics. It’s essential to ensuring they stay relevant. Improvement can take various forms, from adding new data sources – as the Office for National Statistics (ONS) did when it introduced rail fares and advertised prices of second-hand cars to improve the accuracy of its consumer price and retail price inflation statistics – to introducing new questions in a survey or testing out new, more efficient systems and processes. For example, the Ministry of Justice and HM Courts & Tribunals Service aligned their methodologies to improve the coherence of Crown Court official statistics and management information. The importance of ensuring that statistics continue to have relevance and value for users is why continuous improvement and development feature prominently in the Code – both in its current form and in the new, updated version that OSR is currently developing.

Secondly, there is always an element of risk when improving things; you never quite know what you’ll find. But that’s not a reason not to try. When you knock down a wall to renovate your house, you can’t always anticipate the gas pipe you’ll find and have to cap off. The key is to be transparent about what you find and to respond appropriately.

Thirdly, listen to, communicate clearly with, and engage users, the public and experts from across the UK and internationally as part of your continuous improvement work, to help generate ideas and to test out your planned improvements. For example, National Records of Scotland engaged with international census experts to successfully change and improve how it calculated its final Census 2022 estimates and communicated uncertainty. More recently, I was lucky enough to attend the ESCoE conference in May, where I heard about some of the improvements to methods, data sources and definitions currently being considered by academics and government statisticians to improve the relevance of statistics for users. Earlier this year, I also observed the fascinating discussions at the UK Statistics Assembly, which brought together producers, users and potential users of statistics to discuss and advise on the statistical and data priorities for the UK.

Back to the question on accreditation…

Improving your statistics may or may not impact on your statistics’ accreditation status. If the changes are substantial enough, you will need to consider relabelling them as ‘official statistics in development’ after receiving agreement from us to remove the accreditation. If the statistics are not currently accredited, you can decide to relabel them as ‘official statistics in development’ under the guidance of your Head of Profession for Statistics.

For official statistics in development, you would need to run a development project to manage the design, implementation and evaluation of these changes. For example, the Department for Energy Security and Net Zero developed new statistics, labelled initially as official statistics in development, to monitor the Boiler Upgrade Scheme. The statistics’ journey from official statistics in development to official statistics is set out in a blog.

But many alterations will not be substantial enough to require a change to your statistics’ accreditation status. Our guide on producing official statistics in development can help you determine whether a change in status is necessary and the process to follow when developing official statistics. We also have some information on scenarios that could impact on your accreditation status.

Most importantly, as I said at the start – keep improving those statistics. You can never quite know what lies ahead, but continuous improvement is vital to keep statistics relevant and to help all of us to better understand the world around us and hold government to account.

If you have any questions about improving your official statistics, please just get in touch with us by emailing regulation@statistics.gov.uk, speaking with your domain team or contacting your Head of Profession for Statistics.


Welcoming the new Evaluation Registry

At OSR, we support good evidence that informs the public.

Our main focus is on official statistics. But we also recognise the role that evaluation plays in providing insight into what works in government policies and programmes.

That’s why we welcome the brand new Evaluation Registry, launched by the Evaluation Task Force in March 2025. The site will provide a single home for evaluations across Government.

There is at present a lot of great evaluation work taking place across Government, led by researchers, economists and other analysts. This work is commissioned by departments to look at what’s being done, and to create an evidence base that helps refine, improve and challenge policy.

The issue with this, though, is twofold. First, it can be difficult to know where to find and access evaluation evidence. That in itself is a huge pity. It means that the good evaluation that gets done can sometimes languish in obscurity, and the knowledge it represents may not be accessible to a wide range of people. This inaccessibility is also not in line with the intelligent transparency that we advocate in OSR – it can help underpin public confidence if the evidence that informs decisions is made fully available.

The second issue is that there is scope to increase the coverage of evaluations across Government. In 2019, the Prime Minister’s Implementation Unit reported that only 8% of the Government Major Projects Portfolio had robust evaluation plans in place (here’s a link to the report). This has now increased to 34% in the 2023/24 GMPP portfolio (here’s a link to the report); however, there is still considerable work to be done to improve the quality and quantity of evaluation on our Government’s most complex and strategically significant projects. By making the process and practice of evaluation more transparent, the website will drive greater commissioning and take-up of evaluations.

As the Evaluation Task Force’s blog published today says, the Evaluation Registry brings together evaluation plans and reports in a single, accessible site – which, as of June 2025, already contains over 1,750 entries!

We have a strong partnership with the Evaluation Task Force and will work with them to support high-quality evidence. In particular, while the Evaluation Task Force will maintain and oversee the Registry, the Office for Statistics Regulation will engage with Departments where there are delays in publishing evaluations. In this way, we will support transparency and access to the knowledge base provided by evaluations. And our partnership with the Evaluation Task Force will support Departments to use the Evaluation Registry, and thereby provide maximum value to the public.

As my blog from March 2022 states, we love evaluation in OSR. So we’re delighted to be able to support the advent of the Registry.


GDP beyond the bottom line: measuring what matters most

For all the ESCoE Conference 2025 highlights, including reflections in a blog from Paul Schreyer and other overview materials, please visit the ESCoE website.

In our latest blog, Yente, a regulator in the Office for Statistics Regulation (OSR)’s Economy, Business and Trade domain, summarises a panel discussion about ‘beyond GDP’ at the recent ESCoE Conference on Economic Measurement.

At the Economic Statistics Centre of Excellence (ESCoE) Conference on Economic Measurement, which took place at King’s College London from 21–23 May, OSR’s Director General, Ed Humpherson, chaired the panel session ‘GDP beyond the bottom line: measuring what matters most’. The engaging and thought-provoking panel brought together Diane Coyle (University of Cambridge and ESCoE), Richard Heys (Office for National Statistics) and Chris Giles (Financial Times) to discuss the future of gross domestic product (GDP) and the broader landscape of economic statistics. Each panellist offered their distinct perspective on how we measure economic progress – and what we might be missing.

Richard Heys: Complement, Not Replace

Richard Heys started off by stating that while GDP remains a vital indicator that reflects the ‘supply side’ of the economy quite well, it’s increasingly clear that it doesn’t capture the full picture of economic well-being. He argued for a complementary measure (or measures) to strengthen our understanding of the demand side and the wider range of potential purposes of government spending, and to guide smarter policy decisions. However, he noted that alternative approaches currently lack the integrated framework that makes GDP so powerful.

Richard pointed to the new UN System of National Accounts (SNA 2025), which aims to address some of GDP’s blind spots, as a pivotal development, and to further UN work via a new High-Level Expert Group. He also highlighted the Office for National Statistics (ONS)’s efforts to create a dashboard of national well-being indicators and acknowledged the complexity involved.

Diane Coyle: GDP Is Falling Short

Diane Coyle offered a more critical view, noting her fading affection for GDP over the past decade. While she acknowledged its utility for monetary policy, she increasingly questioned its ability to reflect the real state of the ‘economy’, especially in light of major changes over the past 20 years, like the digital economy.

Diane pointed to several gaps in what GDP measures that she discusses in her recent book The Measure of Progress: Counting What Really Matters, such as unpaid digital labour, cloud infrastructure and data valuation, and hybrid work and AI-driven productivity.

She contemplated how best to give people and policymakers a sense of how the economy is doing. She praised the ONS’s work on inclusive wealth, which includes a broader range of economic activities and assets than GDP, such as unpaid household services and ecosystem services.

Chris Giles: In Defence of GDP

Chris Giles pushed back against the other panellists’ positions, defending GDP as a robust and meaningful measure. He criticised the overuse of the famous Robert F. Kennedy quote about GDP measuring “everything except that which makes life worthwhile”, arguing that GDP correlates strongly with other indicators of well-being, such as the Human Development Index, and with important non-economic outcomes, such as life expectancy and child mortality – suggesting that increases in GDP have been pivotal in driving improvements in such outcomes.

Chris warned against politicising GDP by embedding environmental or social values directly into it. He argued that GDP’s strength lies in its clarity and neutrality as a monetary measure of what’s produced that allows for debate and interpretation, rather than embedding values into the metric itself.

Audience questions

The panellists engaged in an open and lively discussion, despite their differing viewpoints. The panel also responded to a range of insightful audience questions, which helped to broaden the discussion.

One question focused on whether GDP should be updated to better reflect the digital economy. Chris supported including digital content in GDP, while Diane and Richard highlighted measurement challenges. Richard also reiterated the challenge of making many complementary measures digestible.

The conversation also touched on whether the current debate is driven by low GDP growth. Diane pointed out that concerns about GDP’s limitations have been around for years, while Chris and Richard highlighted broader societal and international efforts to rethink economic measurement.

One audience member challenged Chris’s earlier point about not bringing political values into GDP. They posed that the importance of including environmental measures in GDP isn’t ‘political’ because climate change affects everyone. Chris countered that GDP is not the right mechanism with which to draw attention to climate disasters, and that the news would focus on the environment and climate anyway – probably more so than on GDP.

Final thoughts

It seems that this debate is far from being resolved. The discussion about ‘beyond GDP’ and considering what economic measurement should look like in the 21st century was a key theme of the ESCoE conference, this panel being one of multiple sessions that discussed alternative ways to measure the economy.

Overall, the diverging viewpoints of the three panellists, who respectfully challenged one another throughout, made the session engaging and the discussion rich. As Ed pointed out, they could at least agree on one thing: the Robert F. Kennedy quote has got to go.