Transparency: bringing the inside out

In our latest blog, Director General for Regulation Ed Humpherson discusses the divergence between internal positivity and external scepticism about analysis in Government, and how transparency is key to serving the public good…

Seen from within Government, these are positive times for analysis. There is an analysis function, headed by Sir Ian Diamond, which continues to support great, high-profile analytical work. There are strong professions, including economists, statisticians, operational researchers and social researchers, each with strong methods and clear professional standards. There is an Evaluation Task Force, which is doing great things to raise the profile of evaluation of policy. And data and analysis are emphasised by Ministers and civil service leaders like never before – exemplified by the 2023 One Big Thing training event focused on use of data and analysis in Government.

Yet the perspective from outside Government is quite different. The Public Administration and Constitutional Affairs Select Committee has been undertaking an inquiry into Transforming the UK’s Statistical Evidence Base. Several witnesses from outside Government who’ve given evidence, and some of the written evidence that has been provided, highlight concerns about the availability of analysis and how it’s used. In particular, witnesses questioned whether it’s clear what evidence sources inform policy decisions.

What explains this divergence between internal positivity and external scepticism?

In my view, and as I said in my own evidence before the Committee, it all comes down to transparency. By this I mean the way in which analysis, undertaken by officials to inform Ministers, is made available to external users.

This is highly relevant to the Committee’s inquiry. A key question within the inquiry is the way in which external users can access analysis undertaken within Government.

These questions are very relevant to us in OSR. We have developed the principle of Intelligent Transparency. You can read more here, but in essence, Intelligent Transparency is about ensuring that, when Government makes statements using numbers to explain a policy and its implementation, it should make the underlying analysis available for all to see.

As I explained to the Committee, we make interventions when we see this principle not being upheld – for example, here and here. When we step in, departments always respond positively, and the analysts work with policy and communications colleagues to make the evidence available.

My basic proposition to the Committee was that the more Government can comply with this principle, the more the gap between the internal insight (there’s lots of good analysis) and the external perception (the analysis isn’t used or made available) will close. This commitment to transparency should be accompanied by openness – a willingness to answer questions raised by users, and a willingness to acknowledge the inherent limitations and uncertainties within a dataset.

In terms of what we do at OSR, I wouldn’t see any point, or value, in us going upstream to consider the quality of all the analysis that circulates within Government.

Our role is about public accessibility and public confidence – not about an internal quality assurance mechanism for economics, operational research, social research and other types of analysis undertaken in Government. We are not auditors of specific numbers (i.e. a particular figure from within a statistical series) – something we have to reiterate from time to time when a specific number becomes the focus of political debate. We have neither the resources nor the remit to do that. But we DO have both the capacity and the framework to support the appropriate, transparent release and communication of quantitative information.

This is the heartland of our work on statistics, and it’s completely applicable to, say, economic analysis of policy impacts, or evaluations of the impact of Government policy. There are good arrangements for the quality of economic analyses through the Government Economic Service (GES), and the quality of evaluations through the Evaluation Task Force (ETF); and similarly for the other disciplines that make up the Analysis Function. The ETF is a new kid on this particular block, and it is a great innovation, a new force for driving up the standards and openness of Government evaluations.

Where we add value is not in duplicating the GES, or ETF, or similar professional support structure within Government. Indeed, we already work in partnership with these sources of support and professional standards. Our expertise is in how this quantitative information is communicated in a way that can command public confidence.

In short, then, it really does come down to a question of transparency. As I said to the Committee, it’s like a garden in the early morning. Some of it is in the sunlight already, and some of it still in shade. Gradually, we are seeing more and more of the lawn come into the sunlight – as the reach of transparency grows to the benefit of the public.

Intelligent transparency – or how the sausage gets made

In our latest blog, Regulator Anna discusses our intelligent transparency campaign and our recently updated guidance.

There’s a line in the musical Hamilton which goes:

“No one really knows…How the sausage gets made. We just assume that it happens, but no one else is in the room where it happens.”

It’s about a meeting between three of the founding fathers of the United States to agree a political compromise. But it also reminds me of OSR’s intelligent transparency campaign. 

When you purchase a packet of sausages you might well want to know how they were made. What are the ingredients and where are they from? Who made the sausage, and did they follow rigorous food and hygiene standards? The answers to these questions might be easy to find, or they might be difficult or even impossible to track down. And the answers might matter a great deal – the sausages could contain allergens which mean they should not be eaten by some people.

When you hear a number being quoted by a Minister on the radio or see a figure used in a government press release, you may well have similar questions. Where does that number come from? How was it calculated and who did the calculation? Are there any warnings which should come with the number? As with the sausages, the answers to your questions could matter a great deal. They could impact a decision you are going to make based on that number, like whether the bathing water quality at your local beach is safe for swimming today, where you will send your child to school, or who you are going to vote for at the next election.

At OSR, we believe that you shouldn’t have to be in the room where it happens to have a good understanding of, and confidence in, the numbers used in the public domain by government. Government should make it easy for people to understand and scrutinise the data it uses to make decisions and inform the public. This is what is at the heart of our intelligent transparency campaign. 

To achieve intelligent transparency, government must: 
1. Ensure equality of access

Data used by government in the public domain should be made available to everyone in an accessible and timely way.   

2. Enhance understanding

Sources for figures should be cited and appropriate explanations of context, including strengths and limitations, should be communicated clearly alongside figures. 

3. Enable independent decision making and leadership

Decisions about the publication of statistics and data, such as content and timing, should be independent of political influence and policy processes. 

Our guidance on intelligent transparency, which was updated today, provides practical advice on how to implement these three principles. It highlights the role that all those working in government play in achieving this and now includes the following list of questions which you can ask yourself if you are using statistics and data publicly:  

  • Is the source for the figure in the public domain?
  • Are there limitations or caveats which impact how the figure should be used?
  • Is there context about the figure which impacts its interpretation?
  • Could this figure be misinterpreted or misused if taken out of context?
  • Would further support help to ensure intelligent transparency is achieved?

Whether you are a producer or user of statistics, we would love to hear from you. You can get in touch with us for further advice and guidance, or if you are interested in working with us on our intelligent transparency campaign, via regulation@statistics.gov.uk. You can also keep up to date with all our work via our newsletter. Finally, if you are concerned about a lack of transparency in government use of data and statistics, you can raise a concern with us. 

What is intelligent transparency and how can you help?

Statistics Regulator Siobhan Tuohy-Smith discusses what we mean by intelligent transparency and how you can be an advocate for intelligent transparency across government and official data, statistics and wider analysis.

So what is intelligent transparency?

Everyone, I think, has a fairly clear idea of what transparency means. It means being open or clear – getting the data or statistics out there. But what do we mean when we talk about intelligent transparency?  

At its heart intelligent transparency is about proactively taking an open, clear and accessible approach to the release and use of data, statistics and wider analysis. As set out in our regulatory guidance on transparency, intelligent transparency is informed by three core principles: equality of access, enhancing understanding and analytical leadership. It’s about more than just getting the data out there. Intelligent transparency is about thinking about transparency from the outset of policy development, getting data and statistics out at the right time to support thinking and decisions on an issue, supporting the wider public need for information and presenting the data and statistics in a way that aids understanding and prevents misinterpretation. For example, the Welsh Government Chief Statistician posted a blog on understanding COVID-19 infection rates in Wales on 11 January 2022. This post went beyond just getting the data out there, by also aiding user understanding of the data to avoid misinterpretation. 

Why is transparency important?

For me, transparency is a key part of what we do at the Office for Statistics Regulation (OSR). It’s a theme that runs throughout the Code of Practice for Statistics, from practice Q2.3 about transparency about the methods used, to V2.1 about ensuring free and equal access to regular and ad hoc official statistics, to principle T3 about orderly release, to name but a few.

Transparency is also a core part of ensuring data, statistics and wider analysis serve the public good. Only by seeing the numbers, and understanding where they came from, can we really understand what they mean and how to use them to best inform individual decisions and understanding of an issue – individual decisions such as where and when to buy a new house, which mortgage to choose, and what school to send your child to, or public understanding of COVID-19 infection rates or a new policy around climate change.

We highlighted the need for intelligent transparency as a key theme in our recent State of the Statistics System report and it continues to recur as a theme in our casework.

What can you do to support intelligent transparency?

In OSR, we continue to champion intelligent transparency and equal access to data, statistics and wider analysis. We: 

  • Are building our evidence base, highlighting good examples and understanding more about barriers to transparency to help support those working across government to implement intelligent transparency. Today we have published some FAQs about intelligent transparency to help support this work. 
  • Engage with analysts, policy-makers and the communications function across government, and interested parties outside of government to advocate intelligent transparency and develop networks committed to intelligent transparency. 

But we recognise that this is not something we can do alone. We need your help! 

So what can you do?

You can be an advocate for intelligent transparency across government and official data, statistics and wider analysis: 

  • As a user of this data, you can continue to question what you see and ask yourself: does it make sense? Do you know where it comes from? Is it being used appropriately?
  • If you are based in a department or a public body, you can champion intelligent transparency in your team, your department and your individual work. Build networks to promote our intelligent transparency guidance across all colleagues and senior leaders in your organisation. You can engage with users to understand what information they need to inform their work, which in turn informs the case for publishing it. And you can get in touch with your Head of Profession or OSR if you experience issues publishing statistics, data or wider analysis of significant public value or interest.

You can get in touch with us via regulation@statistics.gov.uk if you are interested in working with us on intelligent transparency. You can also keep up to date with our work via our newsletter.  

You can raise concerns with us via regulation@statistics.gov.uk – our FAQs about how to raise an issue set out what to expect if you raise a concern with us.    

Looking ahead

We will continue to champion intelligent transparency, and with your help we can make intelligent transparency the default for all government data, statistics and wider analysis.
Related Links:

Regulatory guidance for the transparent release and use of statistics

Intelligent Transparency FAQs

Exploring the value of statistics for the public

In OSR (Office for Statistics Regulation), we have a vision that statistics should serve the public good. This vision cannot be achieved without understanding how the public view and use statistics, and how they feel about the organisations that produce them. One piece of evidence that helps us know whether our vision is being fulfilled is the Public Confidence in Official Statistics (PCOS) survey.

The PCOS survey, which is conducted independently on behalf of the UK Statistics Authority, is designed to capture public attitudes towards official statistics. It explores trust in official statistics in Britain, including how these statistics are produced and used, and it offers useful insights into whether the public value official statistics.

When assessing the value of statistics in OSR, two of the key factors we consider are relevance to users and accessibility. The findings from PCOS 2021, which have been published this week, give much cause for celebration on these measures, while also raising important questions to explore further in our Research Programme on the Public Good of Statistics.

Do official statistics offer relevant insights?

PCOS 2021 shows that more people are using statistics from ONS (Office for National Statistics) now compared to the last publication (PCOS 2018). In the 2018 publication of the PCOS, 24% of respondents able to express a view said they had used ONS statistics, but this has now increased to 36%. This increase may be due to more people directly accessing statistics to answer questions they have about the COVID-19 pandemic. In our Research Programme, we are interested in knowing more about this pattern of results and also understanding why most people are not directly accessing ONS statistics.

Are official statistics accessible?

PCOS 2021 asked respondents if they think official statistics are easy to find, and if they think official statistics are easy to understand. These questions were designed to capture how accessible official statistics are perceived to be by members of the public. Most respondents able to express a view (64%) agreed they are easy to find. This is an important finding because statistics should be equally available to all, without barriers to access. Most respondents able to express a view (67%) also agreed that statistics were easy to understand, suggesting that two thirds of respondents feel they can understand the statistics they want to.

However, respondents who were aged 65 or older were least likely to agree with these two statements. Statistics serving the public good means the widest possible usage of statistics, so this is an important finding to explore further to ensure that older respondents are able to engage with statistics they are interested in. In our Research Programme, we will work to identify what barriers might be causing this effect and whether there are other groups who feel the same way too.

The value of statistics

Considering how the value of statistics can be upheld, respondents in PCOS 2021 were asked to what extent they agree with the statement “it is important for there to be a body such as the UK Statistics Authority to speak out against the misuse of statistics”. The majority (96%) of respondents able to express a view agreed with this statement, with a similar number (94%) agreeing that it is important to have a body who can ensure that official statistics are produced free from political interference. While we are cautious about putting too much weight on these two questions in the survey, these findings may at the very least indicate that the public value the independent production of statistics, as well as challenges to the misuse of statistics.

In conclusion, PCOS 2021 suggests that statistics are relevant and accessible to many members of the public, but there are still some who do not access statistics or consider them easy to find or easy to understand. While the findings of PCOS 2021 offer a wealth of important information, and demonstrate the value of official statistics, it is clear there are still a lot of questions to explore in our Research Programme. We will continue our work to understand what statistics serving the public good means in practice, guided by knowledge from PCOS 2021.

Acting against the misuse of statistics is an international challenge

Acting against the misuse of statistics is an international challenge

That was the message of the event on 14 March hosted by the UK Statistics Authority on the prevention of misuse of statistics, which formed part of a wider campaign to celebrate the 30th anniversary of the UN Fundamental Principles of Official Statistics.

My fellow panellists, Dominik Rozkrut and Steven Vale, and I discussed a range of topics, from addressing statistical literacy to regulation best practice, from memorable examples of misuse, to the cultural differences that affect public trust internationally. Although we all had different experiences and approaches, it was clear that there was a common passion for the truth and statistics that serve the public good.

The event had all the merits of online meetings that we’ve all become familiar with: lots of people able to join from a wide range of locations and lots of opportunities for people to engage using live chat functions.

Perhaps some people find it less intimidating to type a question into an app than to raise their hand in a crowded room; certainly there were lots of interesting questions asked, and it was clear that the issue of preventing misuse of statistics generated a lot of interest and passion from the audience as well as the panellists.

But the event also brought with it a new kind of frustration to me as a speaker: there were too many questions to answer in the time available and I felt bad that we couldn’t answer all the questions that people typed in.

So, in an attempt to rectify that, I’ve decided to use this blog to address the questions that were directly for me that I didn’t answer in real time, and those which touched on the key themes that came across during the Q&A.


“Who are the best allies in preventing misuse or building statistical literacy outside of stats offices? Are there any surprising allies?”

There are obvious allies for us, like Full Fact and the Royal Statistical Society.

I also like to give a shout out to the work of Sense about Science. Their work highlights that there is a huge amount of interest in evidence, data and statistics – and that a simple model of experts versus “the public” is far too simplistic.

There are a huge range of people who engage with evidence: teachers, community groups, people who represent patients, and a lot of others. These people, who want to find out the best evidence for their community, are fantastic allies.

And I’d also pick out a surprising ally: politicians. In our experience, politicians almost always are motivated to get it right, and not to misuse statistics, and they understand why we make the interventions we do. So perhaps they are the ally that would most surprise people who look at our work.

“How important is statistical literacy among the media and general public in helping prevent the misuse of statistics?”

I think that having a sort of critical thinking skill is important. People should feel confident in the statistics that are published, but also feel confident that they know where to find the answers to any questions they have about them.

But equally, we need statistical producers to be better in how they communicate things like uncertainty, in a way that is meaningful for the public.

So rather than putting the responsibility of understanding solely on the user, and just talking about statistical literacy, let’s also talk about producers’ understanding – or literacy if you will – about public communication.

“You have mentioned that sometimes the stats get misinterpreted because of the way they are presented – can you share some examples?”

My point here was that misinterpretation is a consequence of what producers of statistics do. One example we’ve seen frequently during the pandemic concerns data on the impact of vaccines. It’s been the case that sometimes people have picked out individual numbers produced by public health bodies and highlighted them to argue their case about vaccines. Producers need to be alive to this risk and be more willing to caveat or re-present data to avoid this kind of misinterpretation.

“What are your views on framing statistics for example 5% mortality rate vs 95% survival rate? Both are correct but could be interpreted very differently.”

I find it impossible to answer this question without context, sorry! I definitely wouldn’t say that, as an absolute rule, one is right and the other is wrong. It depends on the question the statistician is seeking to inform. I can’t be more specific than that in this instance.

However, to avoid possible misinterpretation, we always recommend that producers use simple presentation, with clear communication about what the numbers do and do not say.

“How do we balance freedom of expression with the need to prevent the abuse and misuse of statistics?”

We don’t ban or prohibit people from using statistics, so in that sense there’s no barrier to freedom of expression. But we do want to protect the appropriate interpretation of statistics – so our interventions are always focused on clarifying what the statistics do and don’t say, and asking others to recognise and respect this. It’s certainly not about constraining anyone’s ability to express themselves.

“What’s the most damaging example of misuse of statistics that you’ve come across in your career?”

Here I don’t want to give a single example but a type of misuse which really frustrates us. It’s when a single figure is used as a piece of number theatre, but the underlying dataset from which that figure is drawn is not available, so it’s not possible for the public to understand what sits behind the apparently impressive number. It happens a lot, and we are running a campaign, which we call Intelligent Transparency, to address it.

“Can you give us some more insight into how you steer clear of politics, media frenzies, and personalities?”

We always seek to make our intervention about clarifying the statistics, not about the arguments or policy debates that the statistics relate to. So we step in and say, “this is what the statistics actually say” and then we step out. And we don’t tour the news studios trying to get a big name for ourselves. It’s not our job to get media attention. We want the attention to be on what the statistics actually say.


I hope these answers are helpful, and add some context to the work we do to challenge the misuse of statistics. I also hope everyone reading this is going to follow the next series of events on the UN Fundamental Principles of Official Statistics.

The next round of events is moving on from preventing misuse, to focusing on the importance of using appropriate sources for statistics. Find out more about them on the UNECE website.

Transparency: How open communication helps statistics serve the public good

Over the past 18 months we’ve talked a lot about transparency. We’ve made public interventions such as our call for UK governments to provide more transparency around COVID data, and it’s been prominent in our vision for the future of analysis in government, including in our Statistical Leadership and State of Statistical System reports.

But what do we mean when we talk about transparency? Why do we care? And what can be done to support it?

What do we mean by transparency?

Transparency is about working in an open way. For us, transparency means being open about the data being used. Explaining what judgements have been made about data and methods, and why. Being clear about the strengths and limitations of data – including what they can tell us about the world, and what they can’t. It also means making sure data and associated explanations are easy to find and clearly presented. It is at the core of many of the practices outlined in the Code of Practice for Statistics.

Why does it matter?

The pandemic has increased the public appetite for data and drawn attention to the significance of data in decision making. Many of us will have become familiar with the phrase “data, not dates” – a phrase which the UK government used as it set out its road map for easing coronavirus restrictions. In a context in which so many have been asked to give up so much on the basis of data, it is especially important that the data are understood and trusted. Transparency is essential to this.

Transparency supports informed decisions. Appropriate use of data is only possible when data and associated limitations are understood. We all make daily decisions based on our understanding of the world around us. Many of these are informed by data from governments, perhaps trying to understand the risk of visiting a relative or judging when to get fuel.

We also need this understanding to hold government to account. Clearly presented data on key issues can help experts and the public understand government actions. For example, is the UK taking appropriate action to tackle climate change? How effectively are governments managing supply chains?

Transparency gives us a shared understanding of evidence which supports decisions. It allows us to focus on addressing challenges and improving society, rather than arguing about the provenance of data and what it means. It supports trust in governments and the decisions they make. It allows us to make better individual and collective decisions. Ultimately, it ensures that statistics can serve the public good.

What is government doing?

We have seen many impressive examples of governments across the UK publishing increasingly large volumes of near real time data in accessible ways. One of the most prominent is the coronavirus dashboard and its equivalents in other parts of the UK, such as the Northern Ireland COVID-19 Dashboard.

It has become routine for data to be published alongside daily Downing Street briefings, and through its additional data and information workbook the Scottish Government has put in place an approach which enables it to release data quickly when necessary. We have also seen examples of clear explanations of data and the implications of different choices, such as the Chief Statistician’s update on the share of people vaccinated in Wales.

However, this good practice is not universal. Transparency regularly features in our casework. We have written public letters on a range of topics including Levelling Up, fuel stocks, hospital admissions and travel lists. We want to see a universal commitment to transparency from all governments in the UK. This should apply to data quoted publicly or used to justify important government decisions. Where data are not already published, mechanisms need to be in place to make sure data can be published quickly.

The Ministerial Code supports this ambition by requiring UK Government ministers to be mindful of the Code of Practice for Statistics – a requirement that is also reflected in the Scottish and Welsh Ministerial Codes and the Northern Ireland Guidance for Ministers. In response to a recent Public Administration and Constitutional Affairs Committee report the UK Government itself said:

“The Government is committed to transparency and will endeavour to publish all statistics and underlying data when referenced publicly, in line with the Code of Practice for Official Statistics.”

What is OSR doing?

We want to see statistics serve the public good, with transparency supporting informed decisions and enabling people to hold government to account. Over the coming months, we will:

  • build our evidence base, highlighting good examples and understanding more about barriers to transparency.
  • continue to intervene on specific cases where we deem it necessary, guided by the UK Statistics Authority’s interventions policy.
  • work with external organisations and officials in governments to support solutions and make the case for transparency.

What can you do?

We’re under no illusion: OSR can’t resolve this on our own. Whether you are an organisation or an individual, we need your help.

You can question the data you see. Does it make sense? Do you know where it comes from? Is it being used appropriately?

You can raise concerns with us via regulation@statistics.gov.uk – our FAQs set out what to expect if you raise a concern with us. We’d also love to hear from other organisations with an interest in transparency.

And you can keep up to date with our work via our newsletter.
Transparency is fundamental to trust – the government must learn from mistakes during the pandemic

As part of our work on statistical leadership, we are hosting a series of guest blogs. This blog is from Will Moy, Chief Executive of Full Fact.

 

This time last year, the UK was about to head into what would be the first of three national lockdowns. Already, the news was dominated by the coronavirus pandemic, shining a spotlight on data that shows no sign of fading any time soon.

In the past year, in-depth coverage and debate around the merits of statistical releases have become commonplace, while the nation has watched numerous public briefings filled with data and statistics from politicians and scientists.

As an organisation dedicated to fighting bad information, Full Fact knows just how valuable good information, used responsibly and communicated well, is to public life and public debate. We support the OSR’s vision that statistics should serve the public good.

It has, therefore, been heartening to see the new prominence given to discussions about the importance of data, statistics and evidence during the pandemic, whether among members of the public, in newspaper columns or in public statements from the government.

At the same time, the government needs to work to maintain public confidence in both its decisions and the data it is using to make them, which is a key part of ensuring it can tackle the crisis: this shapes what people are willing to do and how long they are willing to do it for.

We have all been asked to trust that the government is acting in our best interests, as our political leaders point to data and scientific evidence as justification for decisions that have had real and significant impacts on the way we live our lives.

In doing so, the need for a clear commitment to transparency, accountability and honesty in public life could not have been stronger. However, this bar was not always met.

Ministers or officials too often referred to data or statistics in statements to the press, the public or Parliament without publishing the full data so the public and intermediaries could check for themselves. A failure to be transparent about the data and evidence informing government decisions risks undermining public confidence in fundamental parts of the government’s response.

As fact checkers, we rely on publicly available information to scrutinise such statements. We can and do question departments but time is of the essence when trying to challenge false or misleading claims. By and large, a fact check is at its most valuable when it is published soon after the claim has been made, reducing the time any inaccuracy has to spread, and coming when the media or public are still interested in that topic or debate.

For instance, we were unable to assess claims about the turnaround times of tests made by Prime Minister Boris Johnson on 3 June 2020 until the necessary data was published a month later, on 2 July, and raised this issue with the OSR. Full Fact also found it impossible to assess a government target set out in July 2020 – to “test 150,000 at-risk people without symptoms per day by September” – because the right data wasn’t available at the start of October.

We discuss these examples, and more, in our Full Fact Report 2021, which is based on our experience fact checking the pandemic. In it we make recommendations for action from the government, including that when government departments, ministers or officials refer to data or information when making statements to the public, the media or Parliament, the full data must be made publicly available.

Throughout the pandemic Full Fact has been pleased to work alongside the OSR to help secure publication of missing or unpublished data, as well as challenging misuse of information – and both organisations are committed to continuing this work.

But timely publication of information should not require our interventions. The government must make a consistent commitment to publishing as soon as possible, and before being asked.

It is undeniable that there are still many changes – some cultural, some technical, and many long-term – that need to be made within the statistical system. This will require efforts from statisticians at all levels and across all departments, and the OSR’s report into Statistical Leadership proposes a comprehensive set of recommendations towards this.

Crucially, the report highlights the pivotal role of committed leaders inside and outside the Government Statistical Service in ensuring government and society get the most trustworthy, high-quality and high-value statistics. The report is in itself important and welcome leadership from the OSR: the recommendations should be promoted and acted on, with progress monitored and evaluated.

Full Fact strongly agrees that efforts from statistical leaders, and the wider system, must be equalled by political leaders. It is of critical importance that they lead by example, whether that is ensuring that statistical thinking is seen as essential within the civil service, or respecting the principles set out in the Code of Practice for Statistics.

Demonstrating respect for these responsibilities is essential if the public is to have confidence in the production and use of statistics in public life. The government must learn lessons from the pandemic and prove it is worthy of the public’s trust.