From Trend to Tool: Elevating Dashboards with Trustworthiness, Quality and Value

The world of statistics is not immune to the rise and fall of fashionable trends. Every few years a shiny new toy emerges that will solve all of our problems…or so we think.  

One example of such a trend is dashboards. Brought to public fame during the Covid pandemic, the dashboard aims to provide information in a clear and easy-to-use format. The success of the UKHSA Covid dashboard in conveying data to the public during the pandemic demonstrated how effective this method can be for data dissemination.  

Dashboards also have other benefits, such as acting as good early warning indicators that show data changing as part of a wider picture. I like to imagine this much like the control panel in the cockpit of a plane where each button and light serves a specific function, allowing the pilots to address issues and ensure a safe and efficient flight. 

However, it is important to remember that dashboards also have some drawbacks. For example, the format often makes it harder to communicate uncertainty and can result in complex data being oversimplified. 

Whilst the initial hype around dashboards has diminished somewhat, especially now that there is a surge in interest in AI, it is clear that they will continue to be used across the statistical system. Over the past few years, OSR has provided advice to official statistics producers on dashboards. We have now developed this further and produced our own dashboard guidance, which aims to ensure that dashboards are produced to the highest possible standard.  

How to get the most out of dashboards 

A dashboard can be a great addition to a statistical release, sitting alongside a bulletin or article, giving users bespoke access to data and helping them visualise trends.   

User needs should be put at the forefront of any decision about whether a dashboard is needed and for how long it will be updated and maintained. Our ‘dashboard questions’ target the most prominent issues and challenges with dashboards and are intended to help statistics producers to consider the value of a dashboard for any statistical output. 

It is important that statisticians and analysts work closely with dashboard developers and data visualisation teams to ensure that the final product is produced to a high standard.  

Visual space can be limited within dashboards, and so there is a balance to strike between the level of information included upfront and the risk of overwhelming the user. To help with this, we have developed a three-layered communication framework, outlined in our guidance, which we encourage producers to adopt. 

Dashboards are also often used to bring together multiple sources of data in one place. A dashboard can be a useful communication tool where there is a clear user need for it as a product. Understanding how the statistics are used, and by whom, is therefore important. 

The importance of TQV 

Many of you will be familiar with the principles of the Code of Practice for Statistics: Trustworthiness, Quality and Value (TQV). We consider that these principles are universal and can be applied to almost any circumstance, including dashboards.   

Dashboards should be trustworthy (T) and present high-quality statistics (Q) that are valuable to users (V). Our new guidance, in line with TQV, is intended to support statistics producers in their development and use of dashboards. It doesn’t have all of the answers, nor does it cover every scenario that producers may encounter when a dashboard is requested – but it does set out key principles that producers should adopt when making a public-facing official statistics dashboard.  

We want to encourage producers to think about the code and its pillars of Trustworthiness, Quality and Value when they weigh up the advantages, disadvantages and risks of using a dashboard to disseminate data. This, combined with data visualisation guidance from the Government Analysis Function, should give better clarity on best practice. 


We would like to thank everyone who provided their views and feedback that helped to shape this work. We remain open to feedback on this guidance, so if you would like to discuss it, please get in touch with us at regulation@statistics.gov.uk.



OSR’s emerging strategic themes for 2026-29

This blog sets out how our discussions with stakeholders, the recent focus on the performance of the Office for National Statistics (ONS) and the resulting evidence sessions at the Public Administration and Constitutional Affairs Select Committee (PACAC) have led us to reflect on our regulatory approach.

The core of our work at OSR is assessing whether individual statistical outputs comply with the Code of Practice for Statistics. ONS, which produces some of the most important statistics in the UK, has been experiencing a period of significant challenge. A key element of our role, as the statistics regulator, is providing support and challenge to ONS as it works to recover its economic statistics.

However, ONS is just one producer of the official statistics that we assess. We are responsible for providing assurance across the whole UK statistical system, which includes around 800 accredited official statistics. Only 15 per cent of these are produced by ONS. Our State of the Statistical System report, which we published in July, highlighted our conclusion that the system as a whole remains robust.

We have had conversations with a wide range of stakeholders about our role and our approach as we develop our new strategy, and PACAC has added an important and highly influential voice to our thinking. We have also discussed these emerging ideas with the UK Statistics Authority Board’s Regulation Committee, which provides strategic guidance and oversight to OSR.

The key elements of our proposed strategy, which will run from 2026 to 2029, are that:

  • we should continue to act as a rigorous regulator
  • we should take a systemic perspective and identify opportunities for system-wide improvement
  • we need to enhance our work to support the integrity of evidence in public debate

The Code of Practice at the heart of our work

The core of our regulatory model is setting clear standards for official statistics through the Code of Practice for Statistics, and forming judgements as to whether individual sets of statistics comply with the Code. We are currently updating the Code, with plans to publish a new version later in 2025. The new version will set clearer expectations on what statistics producers must do; will contain a better-defined statement of the framework of TQV (Trustworthiness, Quality and Value); and, for the first time, will include standards for the public use of statistics, data and wider analysis (what we call “intelligent transparency”).

We will use this new Code to underpin our judgements on whether statistics merit the status of accredited official statistics (AOS). These judgements are then reviewed and approved by the Regulation Committee, which provides crucial input into regulatory decision-making.

We take a balanced stance in forming our judgements. We don’t see our core purpose as being to criticise or undermine producers; in fact, we often highlight positive developments in statistics. Doing so can be just as effective as exposing weaknesses – because the endorsement empowers teams to continue to improve and innovate, and because it provides an evidence base of good practice from which others can draw.

We have assessed the designation of a wide range of statistics, including, for example, ONS’s statistics from the Crime Survey for England and Wales, the Census statistics (in separate reports covering Northern Ireland, Scotland, and England and Wales) and, most recently, Northern Ireland’s public transport statistics.

But equally, we are willing to identify when statistics do not meet the standards of the Code of Practice. We have done this for several key statistics, for example migration statistics (2019), employment statistics (2023), the Wealth and Assets Survey (2025) and, most recently, National Records of Scotland’s healthy life expectancy statistics.

Rigorous regulator

So what lessons are there for us to learn from stakeholder feedback, including the PACAC sessions?

Our first reflection is that we have made sound and appropriate decisions. These include withdrawing the status of accredited official statistics where appropriate.

As a result, we want to continue to act as a rigorous regulator, with the Code of Practice as our guiding light. By “rigorous”, we do not mean being harsh or critical as a performative stance. We mean that we form judgements through a clear and thorough consideration of the evidence, and that we will articulate our judgement on compliance with the Code in a clear and accessible way. The importance of clarity is something the Regulation Committee has emphasised to us in its oversight of OSR. So being a rigorous regulator does not mean that we will retreat from highlighting effective work. This recognition remains an important lever for us.

But we need to do more to make sure that the rigour of our judgements is clear in our publications. It also means that the requirements we set in our reports should be specific and actionable. We have already made significant changes here. After we commissioned a review of our work from Professor Patrick Sturgis, we tightened the specificity of our requirements and their follow-up. For example, in our systemic review of ONS’s economic statistics published in April 2025, we required immediate action from ONS: the publication of a survey recovery plan and an overarching plan for economic statistics. ONS published both on 26 June, alongside the Devereux review. And our report Admin-based Population Estimates for England and Wales, published in July 2024, also required an action plan from ONS within three months, which ONS duly published in October 2024.

This all points to a need for OSR to continue to be a rigorous regulator, by:

  • putting our judgements at the heart of our publications, press statements and other communications
  • setting clear requirements and showing clearly how we follow up on them
  • making it easier to see a running list of our outstanding recommendations

System catalyst

We don’t just look at individual sets of statistics. We look at the whole system of statistics, produced by people in UK Government departments and agencies and in Scotland, Northern Ireland and Wales.

Our coverage of the system is reflected in our regulatory work and our State of the Statistical System (SoSS) reports. The reports in 2021 and 2022 recognised – and in the 2022 report, celebrated – the statistical system’s response to the COVID-19 pandemic. By 2023 we were sounding a more concerned note – highlighting that resources were increasingly under pressure; that transformations of statistics should not come at the expense of quality; and that there was a growing risk from declining survey response rates. By July 2024, our SoSS report was describing a system under strain, pointing out that the decline in response rates to household surveys was becoming critical and that there was a need to consider whether economic statistics were meeting user needs.

Our 2025 report recognises the quality challenges facing ONS, alongside other producers encountering challenges with survey responses. But this report also emphasises that the system as a whole remains robust.

The 2025 report also describes opportunities to enhance the production, communication and ultimately the value of official statistics, through a continued focus on improvement. This improvement focus includes exploring data linkage and the potential use of AI in official statistics.

In terms of our systemic impact, one of our most notable achievements has been to embed the recognition of TQV across the UK Government analytical system. This framework encourages statisticians and analysts to think not just about the data they are collecting, but also about giving assurance on their approach to producing the statistics and about how users will want to use them.

These concepts are not just a luxury. They are the underpinnings of public confidence in statistics, and they can offer a compass for statistics producers as they adapt their statistical estate to new demands.

We therefore want to continue to support the statistical system in the UK, both by highlighting key risks and identifying opportunities to adapt, innovate and improve.

Champion of evidence integrity

It is often remarked that we live in an age of misinformation and data abundance, in which public confidence in official sources of evidence may have eroded significantly.

OSR plays a role in addressing this environment of declining trust. The Code of Practice is itself a bulwark against the challenge of poor information. The Code’s new standards for the use of statistics set a new bar for how government makes information available publicly (formally called the Standards for the Public Use of Statistics, Data and Wider Analysis). We also have a casework function. Anyone can write to us voicing their concerns about statistics being potentially misrepresented or misused, and we will seek to clarify the appropriate interpretation of the statistics. And alongside our regulatory work, we conduct research to better our understanding of how statistics serve the public good.

But we are not the only organisation that addresses these risks, and official information is only part of the overall information ecosystem. So, we need to work with a range of other organisations to help support the integrity of evidence, such as the Royal Statistical Society, Administrative Data Research UK and Sense about Science. And through our voluntary application approach, we support organisations that apply the principles of TQV beyond official statistics – contributing to evidence integrity more broadly.

Looking ahead

We see three strategic imperatives for OSR arising out of the current circumstances:

  • demonstrating that we are a rigorous regulator in our judgements and our communication of those judgements
  • being a catalyst for systemic improvement across the statistics system
  • continuing to support the integrity of evidence in public debate (through our work on intelligent transparency and our casework)

We will start to refine our strategy based on these three drivers, and propose a fully worked-up strategy to the Regulation Committee in autumn 2025.

As we flesh this strategy out, we would very much welcome feedback. Do these sound like the right themes? Have we identified the right lessons? What might we be missing? If you have any thoughts, please do get in touch with us by email at regulation@statistics.gov.uk.

Good statistics are never done: why producers should never stop improving their statistics

In our latest blog, OSR Assessment Programme Lead, Siobhan, looks at the importance of continuous improvement of statistics.

Here at the Office for Statistics Regulation (OSR), we independently assess whether official statistics comply with the standards of Trustworthiness, Quality and Value in the Code of Practice for Statistics. If they do, they will be labelled accredited official statistics.

But I often get asked: I’m planning on improving my statistics. What does this mean for my accredited official statistics status – is it OK to do this? Can I keep the accreditation?

Firstly, keep improving your statistics. It’s essential to ensuring they stay relevant. Improvement can take various forms, from adding new data sources – as the Office for National Statistics (ONS) did when it introduced rail fares and advertised prices for second-hand cars to improve the accuracy of its consumer price and retail price inflation statistics – to introducing new questions in a survey or testing new, more efficient systems and processes. For example, the Ministry of Justice and HM Courts & Tribunals Service aligned their methodologies to improve the coherence of Crown Court official statistics and management information. The importance of ensuring that statistics continue to have relevancy and value for users is why continuous improvement and development feature prominently in the Code – both in its current form and in the new, updated version that OSR is currently developing.

Secondly, there is always an element of risk when improving things; you never quite know what you’ll find. But that’s not a reason to not try. When you knock down a wall to renovate your house, you can’t always anticipate the gas pipe you’ll find and have to cap off. The key is to be transparent about what you find and to respond appropriately.

Thirdly, listen to, engage with and clearly communicate with users, the public and experts from across the UK and internationally as part of your continuous improvement work, to help generate ideas and to test your planned improvements. For example, National Records of Scotland engaged with international census experts to successfully change and improve how it calculated its final Census 2022 estimates and communicated uncertainty. More recently, I was lucky enough to attend the ESCoE conference in May, where I heard about some of the improvements to methods, data sources and definitions currently being considered by academics and government statisticians to improve the relevance of statistics for users. Earlier this year, I also observed the fascinating discussions at the UK Statistics Assembly, which brought together producers, users and potential users of statistics to discuss and advise on the statistical and data priorities for the UK.

Back to the question on accreditation…

Improving your statistics may or may not affect their accreditation status. If the changes are substantial enough, you will need to consider relabelling them as ‘official statistics in development’ after receiving our agreement to remove the accreditation. If the statistics are not currently accredited, you can decide to relabel them as ‘official statistics in development’ under the guidance of your Head of Profession for Statistics.

For official statistics in development, you would need to run a development project to manage the design, implementation and evaluation of these changes. For example, the Department for Energy Security and Net Zero developed new statistics, labelled initially as official statistics in development, to monitor the Boiler Upgrade Scheme. The statistics’ journey from official statistics in development to official statistics is set out in a blog.

But many alterations will not be substantial enough to require a change to your statistics’ accreditation status. Our guide on producing official statistics in development can help you determine whether a change in status is necessary and the process to follow when developing official statistics. We also have some information on scenarios that could impact on your accreditation status.

Most importantly, as I said at the start – keep improving those statistics. You can never quite know what lies ahead, but continuous improvement is vital to keep statistics relevant and to help all of us to better understand the world around us and hold government to account.

If you have any questions about improving your official statistics, please just get in touch with us by emailing regulation@statistics.gov.uk, speaking with your domain team or contacting your Head of Profession for Statistics.


Welcoming the new Evaluation Registry

At OSR, we support good evidence that informs the public.

Our main focus is on official statistics. But we also recognise the role that evaluation plays in providing insight into what works in government policies and programmes.

That’s why we welcome the brand new Evaluation Registry, launched by the Evaluation Task Force in March 2025. The site will provide a single home for evaluations across Government.

There is at present a lot of great evaluation work taking place across Government, led by researchers, economists and other analysts. These evaluations are commissioned by departments to look at what’s being done, and to create an evidence base that helps refine, improve and challenge policy.

The issue with this, though, is twofold. First, it can be difficult to know where to find and access evaluation evidence. That in itself is a huge pity: it means that the good evaluation that gets done can sometimes languish in obscurity, and the knowledge it represents may not be accessible to a wide range of people. This inaccessibility is also not in line with the intelligent transparency that we advocate at OSR – making the evidence that informs decisions fully available helps underpin public confidence.

The second issue is that there is scope to increase the coverage of evaluations across Government. In 2019, the Prime Minister’s Implementation Unit reported that only 8% of the Government Major Projects Portfolio had robust evaluation plans in place (here’s a link to the report). This has now increased to 34% in the 2023/24 GMPP portfolio (here’s a link to the report). However, there is still considerable work to be done to improve the quality and quantity of evaluation on our Government’s most complex and strategically significant projects. By making the process and practice of evaluation more transparent, the website should drive greater commissioning and take-up of evaluations.

As the Evaluation Task Force’s blog published today says, the Evaluation Registry brings together evaluation plans and reports in a single, accessible site – which, as of June 2025, already contains over 1,750 entries!

We have a strong partnership with the Evaluation Task Force, and we will work with them to support high-quality evidence. In particular, while the Evaluation Task Force will maintain and oversee the Registry, the Office for Statistics Regulation will engage with Departments where there are delays in publishing evaluations. In this way, we will support transparency and access to the knowledge base provided by evaluations. And our partnership with the Evaluation Task Force will support Departments to use the Evaluation Registry, and thereby provide maximum value to the public.

As my blog from March 2022 states, we love evaluation in OSR. So we’re delighted to be able to support the advent of the Registry.

Related links: The Evaluation Registry: a new home for Government evaluation


GDP beyond the bottom line: measuring what matters most

For all the ESCoE Conference 2025 highlights, including reflections in a blog from Paul Schreyer and other overview materials, please visit their website page.

In our latest blog, Yente, a regulator in the Office for Statistics Regulation (OSR)’s Economy, Business and Trade domain, summarises a panel discussion about ‘beyond GDP’ at the recent ESCoE Conference on Economic Measurement.

At the Economic Statistics Centre of Excellence (ESCoE) Conference on Economic Measurement, which took place at King’s College London from 21–23 May, OSR’s Director General, Ed Humpherson, chaired the panel session ‘GDP beyond the bottom line: measuring what matters most’. The engaging and thought-provoking panel brought together Diane Coyle (University of Cambridge and ESCoE), Richard Heys (Office for National Statistics) and Chris Giles (Financial Times) to discuss the future of gross domestic product (GDP) and the broader landscape of economic statistics. Each panellist offered their distinct perspective on how we measure economic progress – and what we might be missing.

Richard Heys: Complement, Not Replace

Richard Heys started off by stating that while GDP remains a vital indicator that reflects the ‘supply side’ of the economy quite well, it’s increasingly clear that it doesn’t capture the full picture of economic well-being. He argued for a complementary measure (or measures) to strengthen our understanding of the demand side and of the wider range of potential purposes of government spending, and to guide smarter policy decisions. However, he noted that alternative approaches currently lack the integrated framework that makes GDP so powerful.

Richard pointed to the new UN System of National Accounts (SNA 2025), which aims to address some of GDP’s blind spots, as a pivotal development, and to further UN work via a new High-Level Expert Group. He also highlighted the Office for National Statistics (ONS)’s efforts to create a dashboard of national well-being indicators and acknowledged the complexity involved.

Diane Coyle: GDP Is Falling Short

Diane Coyle offered a more critical view, noting her fading affection for GDP over the past decade. While she acknowledged its utility for monetary policy, she increasingly questioned its ability to reflect the real state of the ‘economy’, especially in light of major changes over the past 20 years, like the digital economy.

Diane pointed to several gaps in what GDP measures that she discusses in her recent book The Measure of Progress: Counting What Really Matters, such as unpaid digital labour, cloud infrastructure and data valuation, and hybrid work and AI-driven productivity.

She contemplated how best to give people and policymakers a sense of how the economy is doing. She praised the ONS’s work on inclusive wealth, which includes a broader range of economic activities and assets than GDP, such as unpaid household services and ecosystem services.

Chris Giles: In Defence of GDP

Chris Giles pushed back against the other panellists’ positions, defending GDP as a robust and meaningful measure. He criticised the overuse of the famous Robert F. Kennedy quote about GDP measuring “everything except that which makes life worthwhile”, arguing that GDP correlates strongly with other indicators of well-being, such as the Human Development Index, and with important non-economic outcomes, such as life expectancy and child mortality – suggesting that increases in GDP have been pivotal in driving improvements in such outcomes.

Chris warned against politicising GDP by embedding environmental or social values directly into it. He argued that GDP’s strength lies in its clarity and neutrality as a monetary measure of what’s produced that allows for debate and interpretation, rather than embedding values into the metric itself.

Audience questions

The panellists engaged in an open and lively discussion, despite their differing viewpoints. The panel also responded to a range of insightful audience questions, which helped to broaden the discussion.

One question focused on whether GDP should be updated to better reflect the digital economy. Chris supported including digital content in GDP, while Diane and Richard highlighted measurement challenges. Richard also reiterated the challenge of making many complementary measures digestible.

The conversation also touched on whether the current debate is driven by low GDP growth. Diane pointed out that concerns about GDP’s limitations have been around for years, while Chris and Richard highlighted broader societal and international efforts to rethink economic measurement.

One audience member challenged Chris’s earlier point about not bringing political values into GDP. They suggested that including environmental measures in GDP isn’t ‘political’ because climate change affects everyone. Chris countered that GDP is not the right mechanism with which to draw attention to climate disasters, and that the news would focus on the environment and climate anyway – probably more so than on GDP.

Final thoughts

It seems that this debate is far from resolved. The discussion about ‘beyond GDP’, and what economic measurement should look like in the 21st century, was a key theme of the ESCoE conference, with this panel being one of multiple sessions that discussed alternative ways to measure the economy.

Overall, what made the session so engaging, and the discussion so rich, were the diverging viewpoints of the three panellists, who respectfully challenged one another throughout. As Ed pointed out, they could at least agree on one thing: the Robert F. Kennedy quote has got to go.

Mission possible? Statistics and data in the UK Government’s mission-led approach to government

In this blog, Ed Humpherson, Head of OSR, explores the role of statistics in the missions set out in the Government’s Plan for Change, reflecting OSR’s core focus on transparency of statistics and data use in Government.

The UK Government has launched an ambitious programme of mission-led government, built around five missions and three foundations. This blog considers these missions from the perspective of statistics and data to inform the public.

The five missions and three foundations are set out in the Government’s Plan for Change. The foundations are:

  • Economic Stability
  • Secure Borders
  • National Security

The missions are:

  • Kickstarting Economic Growth
  • Safer Streets
  • An NHS Fit for the Future
  • Make Britain a Clean Energy Superpower
  • Break Down Barriers to Opportunity

The purpose of these missions is to orientate and coordinate government around high-level ambitions. The missions are big, high-level goals, as opposed to specific election manifesto delivery commitments. As such, they provide a focus for action and require collaboration across Departmental boundaries.

Each of the missions is underpinned by statistics. Statistics are used to: define the problem and set milestones for progress; monitor progress; and identify links and key drivers of outcomes.

At OSR we are interested in how the missions draw on and use statistics. In particular, we are keen to understand the role of statistics in the detailed mission plans that underpin each mission; and the quality and accessibility of statistics and data. This reflects our core focus on transparency of statistics and data use in Government.

The first point to make about the missions is that while they are a new tool for the UK Government, programmes for Government have already existed in Scotland, Wales and Northern Ireland for some time. These programmes are often underpinned by statistical frameworks.

For example, the Northern Ireland Executive has the Programme for Government, with four cross-cutting Missions (People, Planet, Prosperity and Peace). It’s supported by the PfG Wellbeing Framework and the award-winning NISRA Wellbeing Dashboard. All the indicators used in the Framework are official statistics.

Similar resources are available for Scotland in the Programme for Government 2025 to 2026 on gov.scot, and for Wales at https://www.gov.wales/programme-government.

Secondly, we have considered the role of statistics and data in each of the five missions and for the Secure Borders foundation, drawing on our principles of intelligent transparency: asking whether the statistics are available and accessible to the public, and whether it is clear what the statistics cover.

Applying these questions to the five missions:

  • The growth mission draws heavily on economic statistics produced by the Office for National Statistics (ONS). It sets two measures of economic growth as milestones: real GDP per head and Real Household Disposable Income (RHDI) per head. These measures are both reported in ONS’s GDP quarterly national accounts. Both make use of ONS population estimates and projections (see this blog by Mary Gregory of ONS) to calculate the “per head” figure (a stylised sketch of this calculation follows this list). For RHDI per head, a further question is how easy it is for a member of the public to access and understand these figures. We are discussing both issues with ONS.
  • In addition, the growth mission sets a target for housebuilding based on official statistics. The mission targets the creation of 1.5 million new homes by the end of this Parliament. There is a clear statistic, published by the Ministry of Housing, Communities and Local Government (MHCLG), called Net Additional Dwellings, which provides a sound basis for measuring progress. The most recent release, published in November 2024, covers housebuilding in England between April 2023 and March 2024. In addition, for monitoring progress, including specifically for the parliamentary term, Government has developed a more timely, now-cast style indicator using Energy Performance Certificate data, which MHCLG now publishes in its quarterly indicators of new housing supply statistics.
  • Net zero – there is a wide range of official statistics available. Progress against this mission will be measured by the Government being on track to achieve at least 95% low carbon electricity generation by 2030, in line with its Clean Power 2030 action plan. The Department for Energy Security and Net Zero publishes accredited official statistics on Energy Trends each quarter, which do track the extent of low carbon electricity generation in the UK. However, this is quite a technical area, and users at this year’s UK Statistics Assembly expressed a general appetite for statistics relating to monitoring net-zero progress and climate change to be clearer and made more easily accessible to the public.
  • Safer streets – there are lots of available data from the Crime Survey for England and Wales, and from policing statistics. But this area of the missions is seeing the most activity in terms of developing new official statistics, including the development by ONS of new measures associated with Violence Against Women and Girls. In addition, ONS is developing a new survey to understand the prevalence of child abuse among children and young people aged 11–25 years, which may also inform the Opportunities mission. The Home Office is developing new official statistics to measure progress against the target to recruit 13,000 more officers into neighbourhood policing roles (the Neighbourhood Policing Guarantee). Note that the main measures focus on England and Wales, rather than the whole of the UK.
  • Opportunities – a range of existing official statistics are available from the Department for Education (DfE) and across government to measure the four pillars that underpin this mission, although some metrics will require more detailed breakdowns of the data. DfE is also publishing additional content to support analysis and public understanding, such as ad-hoc statistics releases (including on Schools Eligible for RISE intervention) and new management information releases (including on schools in the breakfast clubs early adopters scheme). The mission has connections across the UK – in key areas like child poverty and in developing and exchanging best practice. The education elements focus primarily on young people in England, reflecting the fact that education is a devolved matter.
  • Health – there is a wide range of health system metrics already available. We will consider undertaking compliance reviews of the key metrics over the next couple of years. As with safer streets and opportunities, the main milestone set out in the Government’s Plan for Change covers NHS waiting time performance in England, and therefore does not cover other parts of the UK. And it is important to bear in mind the points made in my recent blog on the difference between population health and system performance.
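
As promised in the growth bullet above, here is a stylised sketch of the “per head” calculation. It is purely illustrative – every number below is hypothetical rather than real ONS data – but it shows why revisions to the population denominator move a per-head milestone even when the aggregate itself is unchanged:

```python
# Stylised sketch of a "per head" milestone calculation.
# All numbers are hypothetical and for illustration only - the real figures
# come from ONS's GDP quarterly national accounts and population estimates.

def per_head(aggregate_gbp_millions: float, population: int) -> float:
    """Convert an aggregate (e.g. real GDP or RHDI, in GBP millions) to a per-head figure."""
    return aggregate_gbp_millions * 1_000_000 / population

real_gdp = 600_000           # hypothetical quarterly real GDP, GBP millions
pop_initial = 68_300_000     # hypothetical mid-year population estimate
pop_revised = 68_800_000     # hypothetical revised estimate after new data arrive

print(f"GDP per head (initial population estimate): £{per_head(real_gdp, pop_initial):,.0f}")
print(f"GDP per head (revised population estimate): £{per_head(real_gdp, pop_revised):,.0f}")

# The aggregate is unchanged, but the per-head figure falls when the population
# denominator is revised upwards - which is why the quality of population
# estimates and projections matters for these milestones.
```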

In terms of the Secure Borders foundation, the focus on migration has seen lots of statistical developments in recent years, which we have covered in our regulatory work – for example, here. And the Government’s recent announcements on migration draw heavily on published statistics, as can be seen in the recent White Paper Restoring Control over the Immigration System, published in May 2025.

Conclusion

The missions are an important feature of the UK Government’s programme. They represent a key set of commitments from UK Government, and it is important that they are underpinned by reliable and accessible data. We will do further work to provide assurance on the relevant statistics over the coming years. As this blog highlights, the missions mix some policy areas that are UK-wide with others that focus on England (housing, health and education) or England and Wales (crime/safer streets). With forthcoming elections in Wales and Scotland in mind, we will also consider how best to help voters navigate data and statistics in the run-up to these elections, as we did last year with the UK General Election – see our Election 2024 hub for an overview of our work during that campaign period. We will also consider the questions raised by this blog for the new programmes for Government in Wales and Scotland after next year’s elections.


Demand on the NHS is a poor proxy for understanding the UK’s health

I recently spent the most interesting 20 minutes of my week when I was asked to review OSR’s recently published article on health inequalities. The article itself is an excellent read, but it also led me to play around with the Fingertips tool, an online platform that allows users to easily access and analyse a range of public health data. While exploring this tool, I generated this chart:

The chart shows the prevalence of obesity in children entering reception year (those aged 4 or 5) in England since 2007/08, organised by decile of multiple deprivation. It tells a clear story: there is an obvious spike in obesity in 2020/21, and the spike is more pronounced for children living in the most deprived areas. (The data come from the Office for Health Improvement and Disparities’ Health Inequalities dashboard. To recreate the chart, go to the child health domain; select Reception: Prevalence of obesity; and select Deprivation as the inequality indicator).
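
For readers who like to see the shape of such a chart in code, below is a minimal, self-contained sketch. To be clear, every number in it is made up purely to mimic the pattern described above (a 2020/21 spike that is larger in the most deprived decile); the real figures live in the Fingertips tool and the Health Inequalities dashboard.

```python
# Illustrative sketch of the chart described above, using made-up numbers.
# The real data come from the Fingertips tool / Health Inequalities dashboard;
# everything below is hypothetical and for illustration only.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
years = [f"20{y:02d}/{y + 1:02d}" for y in range(7, 23)]  # 2007/08 .. 2022/23

# Hypothetical prevalence (%) for the most and least deprived deciles, with a
# spike in 2020/21 that is larger in the most deprived areas.
most_deprived = 12 + rng.normal(0, 0.3, len(years))
least_deprived = 6 + rng.normal(0, 0.3, len(years))
spike = years.index("2020/21")
most_deprived[spike] += 4       # bigger pandemic-era jump in the most deprived decile
least_deprived[spike] += 1.5

df = pd.DataFrame({"Most deprived decile": most_deprived,
                   "Least deprived decile": least_deprived}, index=years)

ax = df.plot(figsize=(10, 5))
ax.set_ylabel("Prevalence of obesity (%)")
ax.set_title("Reception-age obesity by deprivation decile (illustrative data)")
plt.tight_layout()
plt.show()
```

The shape, not the numbers, is the point: a persistent gap between deciles, then a pandemic-era jump that widens it.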

These data, and the story they tell, are valuable in and of themselves. But they also illustrate the richness and value of this public health data set. There is a wealth of data available on health inequalities in the UK – and that’s what I want to talk about.

There is perennial interest in health data among policymakers, the media and the public. The performance of the NHS, reflected in data, is a mainstay of political party manifesto commitments. Relevant figures are frequently quoted to illustrate the challenges and successes of the UK’s health systems.

For example, when the Department of Health and Social Care announced the abolition of NHS England, the Secretary of State for Health highlighted the delivery of 2 million extra appointments since the 2024 General Election and a reduction in waiting lists by 193,000.

Similarly, in Scotland, the Scottish Government has made new commitments to improve NHS performance. These commitments include ensuring that no one waits more than 12 months for a new outpatient appointment or inpatient case and the delivery of over 150,000 extra appointments and procedures in the coming year.

Metrics of health system performance also feature heavily in the UK Government’s missions. Its health mission is explicitly framed in terms of the performance of the NHS (in England), and its leading milestone focuses on reducing waiting times for elective treatment, with the aim that 92% of patients in England should wait no longer than 18 weeks for elective treatment.

Evidently, the focus on the metrics of NHS performance is widespread, and understandably so: these system metrics are important. Many people are concerned about their experience as a patient in health services, so it’s not surprising that so much public conversation focuses on their delivery.

Yet while these metrics say a lot about the NHS as a system of service delivery, they say less about the general health of the population – other than perhaps offering the sense that growing demands on the NHS’s services may reflect underlying health conditions in an ageing population.

But regarding demand on services as a proxy for underlying health is a poor measure at best. It doesn’t indicate whether the UK’s health is improving or worsening for different age groups, in different places, at different levels of income. (Indeed, one of the Labour Party’s 2024 manifesto aspirations was to halve the gap in life expectancy between the richest and the poorest in society, which focuses more on health outcomes as opposed to NHS performance.)

Moreover, demographic breakdowns can add significant value to statistics about the health system. In fact, these are something we often hear users request. But such breakdowns are missing from the standard metrics that summarise aggregate performance. These focus on the big national numbers – the total number of people waiting, or the total number of operations.

In short, focusing on the NHS’s effectiveness in delivering specific outputs doesn’t highlight health inequalities very well. And there is a risk that focusing on system metrics drives interventions that focus on improving these numbers, not underlying population health.

That brings me back to OSR’s review of health inequalities, which focuses on this wider question of data about health inequalities. Our article shows that there are many data sources available across the UK, as well as the Fingertips tool with which I generated the school-age obesity chart. These include:

  • the online profiles tool for Scotland, which provides access to a huge range of indicators of public health, including drugs, alcohol and tobacco use, mental health, and life expectancy, and in the future will include physical exercise
  • the annual Wellbeing of Wales report, which places milestones like healthy life expectancy and healthy lifestyles alongside other indicators like income and education
  • Northern Ireland’s health inequalities statistics, which provide an annual update on health inequalities in Northern Ireland

The variety of available data tools and sources indicates that, despite the demands to report on NHS performance, health bodies are able to carve out sufficient time and resource to provide clear analysis of how health differs across the population. However, for these public health data resources to meet their full potential, they need to do more than merely exist; it’s also important that they are used and referred to in debates about health. Government, citizens and the media certainly want to understand how the NHS is performing, and there are good data that can help us determine this across the UK. But to enhance the underlying health of the nation, we need broader data that focus on people and their health – not just systems.

So, at OSR we aim to continue to focus on the high-profile system metrics – but to balance this focus with a wider perspective on public health. In particular, we want to celebrate the value and power of the available tools to understand, analyse and address the health of the population.

Because, as my graph on obesity in reception-age children shows, you can discover the fundamentals of the population’s health without ever going near a hospital performance league table.

How official statistics shape personal decisions

In our latest blog post, OSR’s Head of Research describes a recent project which explored how individuals use statistics in their personal lives, leading to practical recommendations for statistics producers.

Like many expectant parents, I have spent a lot of time over the past few months trying to decide what to name my son. Given that I work at the Office for Statistics Regulation (OSR), you won’t be surprised to hear that I turned to baby name statistics to help guide my decision. What may surprise you, however, is that it’s not just people who work at the statistics regulator who do this! In fact, our recent research has shown that I am far from alone in using official statistics to make personal decisions.

Real-life examples

Our study, conducted in collaboration with the Policy Institute at King’s College London and the Behavioural Insights Team, revealed numerous instances where individuals used official statistics to inform personal decisions. For example, many expectant parents, like me, consulted baby name lists to find the perfect name for their children. Some sought names that were not too popular, while others looked for names that had stood the test of time. Our research found that use of official statistics extends beyond baby names. For example, people also relied on crime statistics and school performance data when choosing where to live or which school to apply to for their child, demonstrating the broad impact of official statistics on personal decision-making.

Unseen influence

Interestingly, while half the respondents in the survey part of our research reported having used official statistics to make a personal decision at some point in their lives, the interview part of our research found that this may be an undercount: people often used official statistics without realising it. We heard that when participants searched for information online, they frequently encountered official statistics through Google searches or social media, even if they didn’t recognise them as such. This means that you may have personally used an official statistic to help with your own decision making, even if you didn’t know it at the time.

Potential influence

Beyond the hidden influence of official statistics is their potential influence: when we showed individuals official statistics, they often spoke about how they would have wanted to use them, if they had been aware of them. For example, statistics about patient experience with GPs were welcomed as an ‘objective version of online reviews’. This shows how important it is to make sure official statistics are promoted to all audiences that could benefit from them.

When people used official statistics

It won’t surprise you to hear that not everyone who participated in our research used official statistics in their decision making, and even those who had used official statistics didn’t use them all the time. Often people used statistics in combination with personal experience or advice from family and friends. We saw that people used statistics when:

  • decisions were particularly important to them
  • the choice wasn’t too emotional
  • the statistics were very relevant to their personal and local context
  • they trusted official statistics
  • the statistics were easy to understand and clearly explained.

Perhaps this is why I, and many others, found baby name statistics so valuable – we heard from one participant that they put lots of effort into choosing a baby name because it is such a big decision:

‘It was quite hard because you just think that’s the name that the baby has to have for the rest of his life.’

OSR’s wider research programme

This research project is part of OSR’s broader research programme, which aims to identify what kinds of changes are needed, and by whom, to ensure that statistics are fully serving the public good. In this context, serving the public good means ensuring that statistics are a tool that can benefit everyone, rather than something that is exclusively for public bodies. By exploring how individuals use statistics in their personal lives, we can better support the development of statistics that truly meet the needs of the public.

Next steps

The findings from this research have led to several recommendations for statistics producers to help make official statistics suitable for everyday decision-making by members of the public. These recommendations could help producers to:

  1. increase public awareness of official statistics
  2. improve the relevance of official statistics
  3. improve trust in official statistics
  4. improve the clarity of official statistics
  5. balance detail with simplicity
  6. make the case for official statistics in personal decision-making

At OSR, we are committed to supporting statistics producers to take these recommendations on board.

Overseeing this project while I choose my own son’s name has really brought to life for me the role of statistics in serving the public good when navigating life’s most significant moments, and I hope that sharing this has helped illustrate it for you too. After all, the power of statistics lies not only in shaping policy decisions that may feel distant, but also right on our doorstep, by enriching our everyday lives.

Beyond GDP? Yes – but how far?

In our latest blog, OSR’s Jonathan Price introduces his recently published think piece on value and GDP.

Ed’s recent blog exploring GDP prompted me to look again at ONS’s recent inclusive income release. Inclusive income aims to complement GDP, which most would regard as inadequate as a single measure of society’s progress. For instance, GDP does not capture many kinds of worthwhile activity that are not paid for (caring for a relative, for example) but does capture activities that many would not regard as beneficial (like trading in illicit drugs).

ONS states that “Inclusive income estimates provide a broader measure of the economic welfare of the UK population. They reflect the economic value of both paid activity, included in gross domestic product (GDP), and unpaid activity, which includes ecosystem services and unpaid household services. The result is measures of economic progress that include activity and assets beyond those currently included in GDP.”

This got me thinking about the activities we do in life that are unpaid, and the personal activities that add value to our lives but cannot be measured – at least not directly. My reflections made me recall the first line of the poem “Leisure” by the Newport poet W H Davies:

“What is this life if, full of care, we have no time to stand and stare.”

This line, and indeed the whole poem, encourages us to stand back from the pressures of everyday life and to take time to observe the world we inhabit. The poem singles out the natural world as being particularly worthy of our full attention.

As a result of my reflections, I have written a think piece about the value of some personal activities and aspects of our environment.

The think piece explores the value we get from personal activities, including just standing and staring, and from aspects of our engagement with the natural world, and how far this value can be properly measured and expressed in monetary terms.

It does not pretend to be an academic study and does not include a review of other writing or research on the topic (which is extensive). Instead, the piece is a personal reflection that starts from first principles and is intended to prompt thought and discussion rather than to arrive at definite conclusions.

I’d very much welcome comments, and indeed criticisms, on my reflections. As I hope is made clear, I think the issues discussed are ones where reasonable people can reasonably disagree. Please email your thoughts to me at regulation@statistics.gov.uk.

The Code, The Key and (for fans of 90s dance music) The Secret

In our latest guest blog, Paul Matthews, Head of Profession for Statistics in the Scottish Government, responsible for the capability and capacity of the statistics profession, talks about his passion for improvement and how the system can make the statistics it produces better and more impactful. This blog coincides with the closing of our consultation on proposed changes to the Code of Practice for Statistics, for which we plan to present our findings in the coming months.

I hear a lot of myths about the Code of Practice for Statistics. I hear things like:

  • ‘I know [insert topic here] is very relevant at the moment, but we haven’t preannounced so we can’t publish for at least 4 weeks, because that’s what the Code says’, or
  • ‘We will have issues with the Code of Practice and trustworthiness if we break a time series’, or
  • ‘We need to publish this as a management information release because the Code won’t allow us to publish as official statistics due to quality’.

In these examples, we are thinking of the Code as telling us what we can’t do. I’m not sure why that is. Maybe we tend to think of it as the rule book that we must obey. Maybe it’s because having the ‘rules’ is comforting for us as statistics producers and can give defence if we are challenged.

A key, not a lock

Rather than seeing the Code as telling us what we can’t do, I see it as an enabler to tell us what we can. In other words, it is a key that facilitates the practical release of statistics that provide value for society rather than a lock that prevents us from being responsive and innovative. And this is equally true for the existing version of the Code of Practice and the draft Code 3.0.

Thinking of the Code as a key isn’t carte blanche for us to do whatever we want. There are still risks we need to work through. But in my experience, the Code tends to be supportive of sensible pragmatic things for users that help build trust and transparency rather than being about protocol for protocol’s sake.

Using the Code as a key

I spent a lot of time looking at the Code of Practice when I developed statistical strategic priorities for the Scottish Government Statistics Group. The priorities are about how we can improve statistical work to focus on what provides the greatest value in producing statistics for the public good. It means that there are things we will need to deprioritise given our finite resources.

Much of this is informed by the enabling nature of the Code. For example:

  • Using user engagement to help inform what users want, what we can discontinue or deprioritise, and being transparent with analysis plans to convey what we’re doing.
  • Greater clarity and impact of communications to enable publications to be focused and streamlined.
  • Greater use of data sources where timeliness trades off against accuracy, or greater use of granular-level data where appropriate to provide useful new analysis that is fit for purpose for users’ needs.

We have had great support and advocacy for what we’re trying to do in Scotland from everyone in OSR, and it gives us confidence that how we’re innovating is in line with how the Code was designed. As Ed Humpherson said in his response to us on the priorities:

“We support your approach and there are several features that we regard as best practice, including the identification and communication of priorities for each analytical area; the involvement of users; and the openness about the potential for suspensions or changes to some of your current outputs… we certainly would not want to require you to keep all of your current range of statistical outputs if they were no longer aligning with user need”.

When Code 3.0 is finalised, all statistics producers should read it carefully and use it as a key to enable opportunities in the statistics they produce.

That’s the secret, after all!