Going beyond consultation to creative conversation about the Code of Practice

In this blog, Penny Babb, Head of Policy and Standards at the Office for Statistics Regulation, discusses her experience of refreshing the Code of Practice for Statistics.


 

As we’ve recently released the third edition of the Code of Practice for Statistics, I am keen to reflect on the experiences that have brought us to this point. I would also like to acknowledge the support of the Royal Statistical Society, and the many statistics producers and stakeholders who have contributed their ideas and views about the Code – thank you! Your contributions have challenged our thinking and enriched Code 3.0.

The long and winding road

In October 2023, we kicked off a review of the Code of Practice, asking a simple question: is it time for a refresh? Following a wide range of engagement activities, by February 2024 the answer was clear: Code 2.1 had served us well – it was solid, trusted and respected. But in the light of the shifting data landscape, technological advances and the challenges of misinformation and disinformation, it was time to think about how the Code could evolve. So, we prepared a draft Code 3.0 and invited feedback through a formal consultation, which ran from October 2024 to February 2025. And here we are in November 2025, after further refining the draft Code to address the feedback raised with us, with the latest edition.

Ongoing dialogue

Looking back to two years ago, when the Code 3.0 project kicked off, I’m amazed at how far we have come. We have had so many fruitful opportunities to hear from stakeholders across the statistical system and wider communities to inform our thinking. Our understanding of the Code has evolved through our regulatory work and through hearing about others’ experiences and perspectives. In fact, some changes we’ve made to the Code are the product of engagement we’ve undertaken since the second edition was published in February 2018.

One example of how our thinking has been shaped by those using the Code is the development of the Code Principles. These largely stem from the work of the ONS sustainable development goal (SDG) team, which in 2019 needed a way to test non-official data sources for use in monitoring some indicators. That work led us to develop a set of universal principles relevant to any analyst. In turn, Code 3.0 builds on these ideas, unpacking Trustworthiness, Quality and Value (TQV) into 10 principles that any analyst can apply.

Active engagement

Engagement is not a purpose in itself but a means to establish purpose. Active engagement was core to the development of Code 3.0. ‘You said / we did’ is an approach often used in response to consultations. We used it ourselves after the Code consultation to summarise what we heard and to give a feel for how we planned to act. But as a way of reporting engagement, it misses the nuance and depth of the exchange of insight.

By focusing on engagement through listening and responding, we established an interactive and iterative process rather than a one-off sharing of ideas. We saw feedback not just as a series of edits to be made but as a rich set of insights to be carefully considered by task and finish groups within OSR.

Open to check and challenge

An important element of establishing any dialogue is the exchange of understanding and respect. Listening is critical for this to occur, as is a degree of empathy. But for the dialogue to be successful, it requires all parties to be open to hearing from each other. Central to that is seeing yourself as accountable for your decisions and actions and being courageous enough to invite criticism as you determine how best to meet your responsibilities.

We have embedded an accountability framework within our Code 3.0 package, developed with input from David Caplan, who has been an important stakeholder throughout our Code development. David worked for 20 years in the GSS, was director of research and analytics at the Audit Commission and, we are happy to hear, is the incoming Honorary Officer for Public Statistics at the RSS. At a Code/RSS event, David shared his thinking about an accountability model based on his experiences at the Audit Commission, which inspired us to take our framework forward.

Our accountability framework highlights that you must be willing to hear critical feedback: to give an account, be held to account and make good. Accountability relies on you first seeing yourself as accountable to others – it is an attitude. Once you have accepted this frame of mind, you can then find ways to make yourself accountable.

I love the simplicity of the accountability framework, although I recognise that following its principles is perhaps easier said than done. Unless you are willing to hear what others have to say, to reflect on it and then to set out what you are doing in response – whether you can or can’t, agree or disagree – you will be working in a bubble, cut off from the ways you may need to evolve and grow. This kind of approach isn’t new in OSR – we always aim to work to the standards we demand of others. In fact, thinking about and enhancing our own TQV will feature more prominently in our upcoming three-year OSR strategy.

Call to action (and listening)

So, as you check out Code 3.0, see how it can focus your thinking on what matters in your own context. Make yourself accountable to your colleagues and stakeholders – invite and listen to feedback, be open to check and challenge, and be frank about your decisions and the information that’s informed them.



Related articles

What is it about TQV?

OSR launches a refreshed Code of Practice for Statistics: Embedding Trustworthiness, Quality and Value (TQV)

How official statistics producers need to embrace vulnerability in a time of crisis: ISI World Statistics Conference 2025 keynote speech

Director General Ed Humpherson recently gave the keynote speech at the ISI World Statistics Conference on how official statistics producers need to embrace vulnerability in a time of crisis. Watch the full keynote speech, or read the transcript below.

“Those of you who have been sitting here for a while and looking at the stage might have noticed that I’ve brought something with me – or a pair of things, in fact. You might think of them perhaps as a prop for my talk, or perhaps as a gimmick. What I’d really like you to think of them as, though, is a symbol, a token, a representation of openness and vulnerability. Or even think of them as a gesture by me as your speaker this morning to be open enough to leave a pair of my shoes on the table.

Because that’s what I’m going to talk about today. I’m going to talk about openness and vulnerability and how central they are to you, to us, and our identity as people who care about statistics. That’s my main point. I’m going to unpack it through various lenses – various ways of looking at the world – while talking about other things as well.

I’ve been struck, and I’m sure many of you have been as well over the course of this week, by a kind of anxiety in amongst all the great positivity about developments that statisticians around the world are putting into place. An anxiety that we live in times which are difficult for people who work in official statistics. I won’t enumerate all of the examples that might lead us to that conclusion, but you will have heard people talk about the threat of misinformation, political manipulation and political interference, and quality challenges.

And it has struck me that people here from an official statistics background might themselves be feeling vulnerable in the face of this poly-crisis, this multiple series of crises. What I want to persuade you of is that actually we should not let any crisis go to waste, and here is an opportunity for people who produce official statistics to demonstrate our secret sauce. Our secret sauce is openness and vulnerability. That’s the point of this talk.

A picture of a pair of shoes outside a house

But I’m going to begin somewhere else. I’m going to begin on the streets of the town I live in, Kingston upon Thames in South London. About four weeks ago, I was walking along near my house and I saw these shoes on a little wall outside somebody’s house, looking, I thought, quite forlorn – a rather sad pair of shoes on their own. My first thought was that somebody had lost them, and they had been placed there in case the parent came back to pick them up. But then, as I looked around, I realised there were other examples in nearby streets of people leaving things outside their house for other people to collect.

So here are three examples. There are people who say: I’ve got some stuff in my house, I don’t need it, but somebody might. So I don’t know if you can see, but there’s a box here that says ‘free’ – a very casually presented box, but it’s got some stuff in it that people might like. And here, a little basket of scarves. Somebody’s obviously getting rid of their scarves; somebody might like them. It’s a phenomenon of free exchange. And one of my thoughts about this has been that, a few years ago, I don’t think I saw this very much. I don’t think I saw people putting things out on the walls outside their house for other people to take.

You also see this phenomenon on the internet – free-exchange websites where people put things for other people to take. I thought that was interesting. But I put that thought away and went back to my day job. And then I started to realise there was something connected to our work, which I’ll come on to.

But what I want to say today is this: I will start with the evolving nature of trust. Then I want to talk about official statistics serving the public good, and then my organisation’s role, and then I want to talk about the crises and how we’ve responded, and I want to close by really driving home this point about vulnerability.


The evolving nature of trust

The evolving nature of trust. I’ve been reading a lot about trust recently. The literature has revolved around a paper written in 1995 by an American academic called Mayer. In Mayer’s model of trust, he says: for person A to trust organisation B, they need to perceive that organisation B has competence, integrity and benevolence. So that’s the standard. It’s called the CIB model: competence, integrity, benevolence. What’s interesting is how understanding of trust has evolved.

More recently, sociologists and psychologists who’ve looked into this say that over time, of those three, benevolence is becoming more important. That instead of trusting somebody because of their competence, or their integrity, you most need to trust that they are benevolent towards you. That they share your interests and they act in your interests.

The second element of this evolution is the observation that trust is shifting from being something which people look upwards to institutions to provide – the state, the church, a charity perhaps – to something which is more relational, more horizontal, more equal. It’s to do with interpersonal relations of shared equality, not of hierarchy. So these are two ways of thinking about trust. One is a shift away from just competence and integrity towards benevolence being more important. And the second is this shift from institutional and vertical to horizontal and relational.

The ‘shoes outside the house’ is an illustration of that, because I’m pretty certain that 15 years ago, the same people would’ve given their unwanted things to a church or to a charity to sell. They’d have gone upwards to an institution, whereas now they’re bypassing, disintermediating the institution. I also think it’s quite a vulnerable thing to do. There’s some vulnerability about putting something personal of yours out for everybody to see. So that’s the evolving nature of trust that’s going to come back in this talk in a few minutes.


Official statistics and the public good

Hold that thought while I just say a little bit about official statistics and the public good. And I have to make an apology here. The longer I’ve been here this week and heard other people talk – I’ve heard Walter Radermacher say this, I’ve heard Steve McFeely say this – the more I think there’s something about the term ‘official statistics’ that isn’t quite right: maybe it sounds too austere, too distant, too remote. And maybe we should talk about public statistics. In the UK, the Royal Statistical Society has done a very nice body of work promoting the idea of public statistics, and I hope that in future years, if I were to give this talk again, that’s the term I’d use. I’m using ‘official statistics’ now because that’s the term we commonly use – but I acknowledge its limitations.

Anyway, official statistics should serve the public good. This is the definition we use in the UK. I won’t read it out, but it emphasises that they are public assets.

Statistics serve the public good as public assets that provide insight, which allows them to be used widely for informing understanding and shaping action.

To fully realise the potential of statistics, the organisations producing them should place serving the public good at the heart of their work and be conscious of their responsibilities to society.

They are for the public. They’re about the public, and that is what enables them to be used widely for informing understanding. And moreover, my organisation – I’ll explain in a second what we do – believes that to fully realise the potential, organisations, NSOs, official statistics producers must place the public good at the heart of their work. That is essential.

In my view, many official statistics producers convey their authority as if they’re a central bank. This is a photograph of the Bank of England. It’s a very powerful, robust facade. It emphasises its national nature – there’s a flag flying on top, it’s got strong pillars. Everything about it conveys solidity, reliability and confidence. It’s all competence and integrity. I think official statistics producers often convey their role, their authority, to their societies in this way. They say: we have very strong methods, we have very strong data ethics, we have tremendous frameworks for statistical disclosure control. We have competence, we have integrity. What I worry about is, if I’m right that the nature of trust is shifting, then competence and integrity aren’t enough.

The strong facade of the Bank of England – the strong facade of an NSO – does not convey warmth or benevolence. It does not convey relationship. And I think that could be a growing challenge. Lots of the things which have happened in different ways around statistics might have something to do with this, as well as the more obvious causes. So I think benevolence and openness are really very important.


The Office for Statistics Regulation

So, what’s my organisation? We’re rather unusual in the global official statistics community. Not unique – as I’ve learned very well this week from Martine Durand at the French Statistics Authority, which does very similar work to us – but we’re certainly unusual.

We are a separate body, responsible for setting the UK’s Code of Practice for Statistics and for reviewing compliance with that Code. And when we review compliance, I mean we really review it, because if we find statistics that are not compliant, we remove their accreditation as official statistics, which is quite a strong intervention. We also act as a catalyst for systemic improvements. We do reports on things like data sharing or cross-UK comparability, so that a user can compare statistics in Scotland with those in England, Wales and Northern Ireland.

Perhaps what we are best known for is that we are willing to stand up for the appropriate use of statistics in the public domain. What that means is that we will sometimes make public statements about the ways in which statistics may be misinterpreted or misleading for the public. I think that’s relatively unusual in the world of official statistics. But if you believe everything I’ve said already, you can see how important it is to serve the public in this way.

So that’s what we do. Everything we do is built around three lenses on statistics which we call TQV: Trustworthiness, Quality and Value. Trustworthiness is about the organisation that does the producing, and I suppose this is closest to the competence and integrity part of trust. It’s about the organisation’s practices, its commitments, the ways in which it manages itself. Quality is about the data, how data are collected, how then they’re converted into aggregate statistics, the strengths and limitations, the uncertainties that surround those statistics. Value is about what the statistics mean for users in society.

So we always use this framework. Everything we look at will have a T and a Q and a V component. We think looking at any one of those in isolation will tend to only give a partial picture. So that’s TQV.

For today’s purposes, what’s so interesting about TQV is that it’s really all about openness. Trustworthiness is all about being open about your errors. If you find an error as a statistics producer, you don’t hide it away and think it’s embarrassing. You notify your users promptly, openly, clearly, because that is you doing your job. You always recognise the limitations and imperfections of the statistics that you produce – and not just in a very technical way, presenting your confidence intervals, for example, but in a way that helps a user know what they should or should not infer from the statistics.

Openness to uncertainty: understanding and conveying that statistics are always an imperfect measure of what they’re trying to capture, and being honest and clear about that uncertainty. Openness to users: always having an open door to understand what users want and how they use statistics. And finally – and I think this is often underplayed – openness to dialogue. People will challenge the statistics that statistics bodies produce. They will see a flaw in them. They will say: hang on, I’ve delved into your underlying data, or I’ve looked at your metadata, and something isn’t right here. Welcome that – it is absolutely fantastic. It is your best friend in terms of quality control. Be open to dialogue, debate, different voices and different perspectives.

And if you’re open in all of those ways, you will be demonstrating vulnerability. You’ll be opening yourself up. You will metaphorically be putting your shoes on the table, or putting your shoes on the street. That’s the secret sauce. That’s what TQV is all about.


Three challenges that have faced UK official statistics

So, to the three challenges I want to talk about, which I think illustrate these themes in practice. The first is the way in which statistics have become weaponised in political discourse. By weaponised, I mean that instead of statistics providing insight to a user, they get used in political debate as a way of hammering home a simple and often misleading message – the same number repeated over and over again, almost until it loses all its meaning. That’s the weaponisation of statistics.

Second, we know that we live in a world where users demand real-time or as close to real-time data as they can. That’s a difficult demand for statistics producers to meet.

Third, and very specifically in the UK, I’m going to talk about the challenges which have faced and are facing economic statistics, which Roeland alluded to earlier.

So, weaponisation. I’ve got a picture here of the most famous weaponised statistic in recent British political history, to do with the Brexit referendum: a statistic pasted on the side of a bus. My organisation, OSR, got involved in highlighting the ways in which that might be misleading.

For a long time, as a result of that experience, I thought our job was to highlight something which was wrong and say it was wrong. It was only subsequently that I realised there was a much bigger and more important story around this whole thing, captured in the phrase ‘that’s not my GDP’. For those of you who don’t know it, this is the story – and it’s a real story. During the debates around whether Britain should vote to leave the European Union, an economist was presenting to a town hall meeting in the north of England, and he said: well, one of the things about Brexit is you might want to think about the impact on GDP. ‘GDP has been rising, maybe we don’t want to endanger that.’ And a lady in the audience put her hand up and said, ‘that’s your bloody GDP, it’s not mine’. What she meant was: that’s a big abstract number. I don’t eat GDP. I don’t clothe my children in GDP. My children’s shoes are not GDP. I want to know about something which is meaningful to me.

I’ve always thought that is such a great way of illustrating how statistics, when they’re weaponised, can become remote from the people they’re about. If people say those are not my numbers, I think we’re in trouble. So, over the succession of episodes I’ve listed there – starting with the UK’s contribution to the European Union, then claims about educational performance, then the COVID pandemic, particularly debates around vaccines, and more recently debates about migration and in our 2024 general election – I’ve shifted my view away from thinking it’s our job to call out what might be called ‘lying scoundrels’. Frankly, I don’t think that anymore. I think it’s our job to represent the people who might not see the numbers as relevant to them. That’s the way to respond to the weaponisation of statistics.


Excess demand for real-time data

This little screenshot I’ve got here is of a fantastic tool that the UK Department of Health produced: a COVID dashboard, updated daily, which gave a real-time picture of infections, hospitalisations, deaths and, when vaccines came, vaccine rollout. You could visit it and click down to an incredibly low geographical area. It was incredibly popular – at its peak, it had about 70 million hits a day, which is an astonishing number for a single government website on a single day.

We responded to this. You might expect that a statistics regulator would suck its teeth about all of this dirty data going out – you know, where’s the quality assurance on that? But in fact we really supported it. We did some rapid reviews. We allowed them to release it not at the standard time of 9:30am, but at 4pm, because that was when the daily government press statement came out – we wanted to make sure the information was available to everybody at the same time. We continually focused on the transparency of government evidence. And we said: there’s a Code of Practice, which we obviously love, but the most important thing is the principles of the Code. As long as the people producing this kind of dashboard followed the principles, perhaps the detail of compliance matters less.

The final kind of crisis is the one which is, I guess, upon us now: the quality of UK economic statistics. There are many ways of telling this story. I’m going to focus on survey response rates, which I guess have been the most proximate public reason that people have become concerned about economic statistics. I have two charts here. The upper one shows that the UK has always been towards the bottom end of response rates for Labour Force Survey statistics. The UK is the heavy black line; the other lines are other European countries. The lower chart shows a different survey, the Living Costs and Food Survey – the time period is too small for you to see, but essentially in the period since the pandemic – with a really significant drop-off in response rates, rendering it increasingly unrepresentative. That’s the problem: the lack of representativeness.

Our response to this process of declining quality is a different kind of openness – maybe a painful kind. It’s the openness which says: these statistics are no longer as reliable as they were. As a result, we have removed the official statistics designation from several statistics produced by the Office for National Statistics; I’ve listed them there. We also did a thematic review earlier this year looking at economic statistics in full, which said: there are really significant problems, and the first step is for ONS to fully recognise them as problems and then set out plans to respond to them. It’s no good just saying, don’t worry, we’ve got a plan, nothing to see here. Open up and say there are real problems. And the great news is that ONS has done that – that’s my bottom box here. There’s an excellent recovery plan now, which fully recognises the issues and shows the openness which has been the theme of my whole talk.

So what have I learned from these challenges? In a sense, what’s my vulnerability that I can share with you? The first is that I made a mistake earlier on in this job. I thought we were campaigners to defend the truth. I no longer think that’s the right way to think about statistics, because it puts statistics on a pedestal they can’t possibly sustain. What we are is advocates of a kind of transparency in the use of statistics that enables an intelligent user to understand what they mean. That’s what we do. When we make an intervention, when we write to a minister, when we make a statement in the media, that’s what we’re doing: intelligent transparency. It’s not a fact-checking, truth-correction role. I’ve learned that.

The second thing I’ve learned is that in times of high demand, an organisation like mine can be a real pain in the ass. We can say: you need to do things in the right way and in the right order. I think it’s really essential that we demonstrate flexibility and support producers in their responses. And the third thing: I talked about the succession of problems with economic statistics. I honestly think we could have highlighted the problems more clearly. For everything that has happened, we had picked it out as a problem – I’m just not sure we did it quite loudly enough. That is one of my reflections, and one of the things I say in the mirror in the morning is that maybe I could have done a better job there.

The key point, though, through all of this – what I’m trying to get across – is that the technocratic distance of ‘we are the official statistics body, we have the best statistics’ won’t help any of you who are official statistics producers. What will help is openness and vulnerability.


Openness and vulnerability in practice

So I just want to use my final section to say what I think that means in practice. The first thing it means is that you should be on the front line, out there, engaging with users. There are actually some great examples of what the ONS in the UK is doing to encourage citizens to respond – to see if response rates can be increased by direct and appealing engagement. The two screenshots I’ve got show two approaches. ‘It starts with you’ is the tagline. I think it’s rather nice, and although I’m not sure the evidence is fully in, I understand the early signs are that it really does help. So the lesson is: be on the front line – go out there and engage. Whilst doing that, recognise that, although I’ve used the word ‘public’, of course that’s a simplification. There’s not one public; there are multiple publics. There are different people in different places, in different communities, who have different interests.

And just to illustrate that, my organisation produced a piece of work earlier this year called ‘Statistics in Personal Decision Making’, where we went out to a survey panel and also some focus groups and asked individuals how they use statistics in their daily life. It was completely fascinating. The answer is sometimes not at all, and sometimes much more than you’d expect. That kind of insight – that there’s a public out there which is not really very well served by our current notion of users as experts – is really important. So: recognise there is not a single public.

And then three things to avoid. Avoid distance. If you’re in an official statistics producer and you ever find yourself thinking, well, we’re conveying ourselves as this authoritative ‘best producer of statistics there is – you can trust us because we have all of this competence’, you’re probably going down the wrong path. Associated with that, don’t ever fall into the trap of thinking the statistics you produce are perfect, that they’re a perfect single point. I always get really worried when ONS publishes a statistic, some expert user says ‘I think it’s flawed here’, and the response is ‘this is the best available estimate’ or ‘this is using the best available methods’. I always think that’s so dismissive. Don’t find yourself saying that. It’s a mistake.

And the final one. I always wanted to be the last person in the world to have an AI component to my talks because I don’t know anything about it, basically. But then I realised I can’t really do that – we’re in the world where everybody has to talk about AI. So this is my little nod to there being an AI thing out there.

I think there’s a tendency in the big data and AI world to regard data as a block of stuff that you shovel in. I love this image of a man shovelling coal into a steam engine – it summarises the way AI is often conceptualised: statistics are just another thing you shovel in, and then some magic happens inside the furnace. I think we should be really wary of that, because of course we all know that the richness comes from the insight, from the metadata, from the limitations as much as from the strengths.

So that’s my attempt to say something about AI! I don’t think it’s very convincing, but I thought I better put it in because everybody talks about AI in their presentations these days.


Conclusion

So here is my conclusion.

If trust is shifting from institutional to relational,

and the key to trust is now benevolence,

then official statistics producers need to avoid technocratic distance

and instead they must demonstrate openness and vulnerability.

That’s my argument. And just to put it another way, visually: if you find yourself thinking like a central bank, all pillars and solidity: don’t. Don’t think like that. Instead, think of this: the shoes in the street.

Thank you.”

What is it about TQV?

Helen, our Head of Development and Impact at the Office for Statistics Regulation (OSR), reflects on why the principles of the Code have endured and flourished, even as the social and technical landscape has changed around them.


OSR has recently refreshed its Code of Practice for Statistics, which sets the standards for the production and onward communication of official statistics across government.

Code 3.0, the new Code, has evolved in several important ways since the previous edition, reflecting substantial changes in the data landscape. These include an increased demand for statistics from users, and developments in the ways that statistics are produced and disseminated. However, the core principles of the Code – Trustworthiness, Quality and Value (TQV) – remain, providing the framework for a Code that ensures statistics serve the public good.

The principles of TQV are simple:

Trustworthiness is about building confidence in the people and organisations that produce statistics and data.

Quality is about using data and methods that produce assured statistics.

Value is about ensuring statistics support society’s needs for information.

Since the TQV framework was introduced in Code 2.0, it has become firmly embedded across the statistical system. This includes within OSR – forgive me if I’m wrong, but I’m not sure any other government analytical standards have made it onto a team-leaver’s gift hoodie!

 

Hoodie with TQV logo

 

So, what is it about TQV that lends itself to this persistence (and devotion)? I have a three-part theory:

1. TQV is intuitive

Given common definitions and understanding of these concepts, it’s hard to imagine many people would consider trustworthiness, quality and value bad things to see in statistics – or indeed in most things in life! Committing to uphold the standards of Trustworthiness, Quality and Value when producing and communicating statistics should mean, for an end user, that the numbers informing their understanding and decisions are reliable tools for making sense of the world. Would anyone ask for the contrary?

2. TQV is universal

TQV provides an ethical framework that can support the publication of any kind of analysis or evidence. It is relevant for, and can be used by, anyone in any organisation who wants to ensure that the information they provide is relevant, informative and impactful. TQV does not prescribe a particular way of producing or presenting statistics; instead, the focus is on building public confidence and delivering public good, which allows for and encourages the adoption of new and innovative approaches. And TQV remains robust in the face of threats to the public good, helping organisations to demonstrate their commitment to public interest and to mitigate concerns about misinformation.

3. TQV is easy to remember!

This sounds frivolous, but it’s important. I’ve worked as a government statistician for 5 years and a regulator of statistics for 6, and I still don’t know every practice of the Code off by heart. There is probably a statistician out there who does, but I haven’t met them yet. However, I can remember TQV. My understanding of what these concepts mean and the outcomes they can support, even at a high level, has consistently enabled me to make informed decisions, and to act, advise and (I hope) inspire others to act, in ways consistent with serving the public good. I think others out there, in many different roles relating to data and statistics, could say the same.

So, there you have it – my theory on why TQV is so powerful. We’d love to hear your own views and experiences of using the refreshed Code. If you’d like to get in touch, please email regulation@statistics.gov.uk.

Trust in Statistics: Launching the Refreshed Code of Practice

OSR asked the CEO of the Royal Statistical Society (RSS), Dr Sarah Cumbers, to reflect on the refreshed Code of Practice in a guest blog. The RSS works closely with us to support OSR’s regulatory work, including partnering on the annual TQV (Trustworthiness, Quality and Value) award for voluntary application, which celebrates organisations adopting these principles in their daily practice for everyone to see.

The Code is the essential foundation for our statistical system, underpinning the public trust society relies on by clearly communicating what matters when working with data, and driving up standards.

The RSS has been closely involved in the Code’s review process. We were pleased to host two dedicated roundtables with the OSR, one in November 2023 and another in late 2024, to give our members an opportunity to share their views and engage with OSR on how the Code could evolve.

It’s genuinely encouraging to see how many of the issues raised in those sessions are reflected in the revised Code, including the need for stronger user engagement. This direct line from member input to policy change shows the strength and influence of the statistical community when we speak together. We also submitted a formal response to the consultation, underlining our call for users to be placed at the heart of decision-making.

The Code is much more than a set of standards on paper; it’s a guide to best practice that supports better decision-making across the UK.

Independence is a central theme. In a period when the relationship between statisticians and politicians is under scrutiny globally, the Code’s emphasis on protecting impartial professional judgement is crucial. The integrity of official statistics depends on statisticians being able to work freely and professionally, and the Code provides a vital safeguard for us in the UK.

There’s always more work to do to ensure statistics fully serve the public good, and the RSS is particularly keen to see further progress on user engagement. The revised Code’s call to put users at the centre of the system provides a valuable catalyst for this, and we are looking forward to working with OSR, ONS and the wider government statistical service to discuss and agree exactly what that should mean in practice across the statistical system.

The Code matters because, above all, trust in statistics matters. The refreshed Code is a great step forward, and the RSS looks forward to supporting its use and continuing this conversation with both the OSR and our members.


Related

Code of Practice for Statistics

Code of Practice for Statistics 3.0 – What has changed

Quality Data, Shared Purpose: World Statistics Day 2025 and the Refreshed Code of Practice

In our latest guest blog, Rochelle Tractenberg explores how ethical statistics and the refreshed Code of Practice can help build public trust this World Statistics Day…

Every five years, World Statistics Day celebrates the global contributions of statistics and data science to evidence-informed decisions, democratic accountability, human dignity and flourishing, and sustainable development.

The theme of World Statistics Day 2025 is “quality statistics and data for everyone”, and it coincides with the Office for Statistics Regulation (OSR)’s refreshed Code of Practice for Statistics. The timing and orientation of the refreshed Code both highlight its place in promoting ethical statistical practice to help achieve this goal for the people of the UK.

While high-quality and widely available statistics and data certainly exist and are to be celebrated, these don’t just happen; they require diligence, care and competence at all levels by professionals in statistics and data science. World Statistics Day 2025 presents an opportunity to consider how we can increase the visibility of this commitment and work, and their accessibility and utility for everyone.

An under-appreciated challenge to public trust in official statistics and the statistical profession is “drift” – changes in the properties of data over time. Drift can occur in a data source, in its associated meaning, and in its ability to accurately represent a concept. It can mean that data become less reliable or less consistent over time, undermining the value and trustworthiness of the data. As such, regularly reviewing statistics to see whether they meet relevant standards – and, critically, identifying when they do not – is crucial to promoting, and in some cases renewing, trust.
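To make the idea concrete, here is a minimal, purely illustrative sketch – my own toy example, not something prescribed by the Code – of one simple drift check: comparing the distribution of a single variable between a baseline collection period and a more recent one, assuming Python with NumPy and SciPy available.

```python
# Toy drift check: compare a variable's distribution across two periods.
# The data, windows and threshold here are invented for illustration.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
baseline = rng.normal(loc=50.0, scale=10.0, size=5000)  # e.g. an early collection
recent = rng.normal(loc=53.0, scale=12.0, size=5000)    # e.g. a later collection

# Two-sample Kolmogorov-Smirnov test: a small p-value suggests the two
# distributions differ, i.e. the variable may have drifted and any
# statistics built on it deserve a closer review.
statistic, p_value = ks_2samp(baseline, recent)
if p_value < 0.01:
    print(f"Possible drift (KS statistic={statistic:.3f}, p={p_value:.2g})")
else:
    print("No strong evidence of drift in this variable")
```

Of course, drift in meaning, or in what a source represents, cannot be caught by a distributional test alone – it requires the kind of ongoing, human review that the Code encourages.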

I was excited to hear about the refreshed Code of Practice for Statistics – the guiding framework for all official statistics in the UK – which will be live on the Code website from 3 November 2025.

OSR has updated the Code to include broader support for anyone working with or communicating statistics, and to reflect technological advances. The Code’s core principles – Trustworthiness, Quality and Value – have not changed, but its guidelines have been made clearer and more relevant. All official statistics in the UK must meet the requirements of the Code to ensure that they serve the public good. Those that do so are granted accredited official statistics status, indicating that the statistics, and the underlying data, are of high quality. The Code is also useful for those working with data and statistics who want to voluntarily apply it as a practical framework to increase public confidence in statistical work. It can help anyone build public trust in statistics and data science.

Engagement with the Code is worthwhile at the start of, and regularly throughout, data collection and analysis, for anyone who wishes to more actively and transparently step onto the path towards ethical statistical practice. To see my full reflections on building trust in statistics and data science through ethical practices, and how the Code can contribute, please see my recent article. For additional perspectives on the importance of World Statistics Day, please see the International Statistics Institute’s statement.


Rochelle E. Tractenberg is a tenured professor at Georgetown University (Washington, DC). Her applied ethics work focuses on strengthening trustworthiness in statistics and data science across research and policy settings. A biostatistician since 1997, she serves on the UK National Statistician’s Data Ethics Advisory Committee (NSDEC), the ISI Advisory Board on Ethics, and the Association for Computing Machinery Committee on Professional Ethics. She has written two books on ethical practice and has contributed to standards for statistics, data science and mathematics, as well as the forthcoming UN guide, Ethics in Official Statistics: Foundations and practices to foster public trust.


Related:

Quality statistics and data for everyone: Renewing trust through ethical practice

Why migration statistics still matter

In our latest blog, Statistics Regulator Gillian Fairbairn discusses the complexities of tracking migration and the importance of migration statistics…

What do Canada geese, wildebeest and humpback whales all have in common?

They migrate. Every year, Africa’s wildebeest herds travel across the continent in search of greener pastures, humpback whales move from one end of the planet to the other between feeding and breeding grounds, and Canada geese fly south to warmer climates for the winter months. For these animals, migration reflects a move towards resources and away from threats.

We also see migration of people. Whilst most people will live close to where they were raised, some will move across the world, seeking out employment, education or personal connections, or simply exploring the world we live in. Others move away from difficult circumstances, political instability, war or a lack of resources.

All of this moving around doesn’t just affect the individual migrants. It also affects the areas being migrated to and from. Increases in population through immigration may increase demand for local services, for example healthcare or education, as well as increasing local economic activity. Conversely, decreases in population through emigration may reduce demand for local services and the staff available to run them, or have economic impacts such as changing the local skills mix. As a result, official statistics on migration attract a lot of attention and are often covered by the media and discussed by government officials. The regulation of these statistics is therefore a priority for the Office for Statistics Regulation (OSR).

Regulating migration statistics

Back in 2022, OSR published its review of migration statistics produced by the Office for National Statistics (ONS). At the time, ONS, the producer of long-term international migration estimates, was in the early stages of its plans to change the data sources for those statistics. Whilst ONS had plans to move towards estimating migration using administrative data, the COVID-19 pandemic brought a temporary stop to the International Passenger Survey (IPS), which was then the main data source for migration estimates. This increased the urgency of moving to a different data source to meet a pressing user need for high-quality migration estimates. Our report highlighted ONS’s credible and robust approach to developing migration estimates, but it also noted that ONS should improve how it engages with users and ensure clear and coherent communication of its plans for migration statistics.

Since we published our report, ONS has made significant progress towards meeting our recommendations, particularly with its user engagement. However, it still faces challenges in the production of migration estimates, particularly around revisions and communicating uncertainty. In December 2023, we published a follow-up to our earlier review, which recognised ONS’s progress and closed most of the recommendations set out in our initial review.

ONS revises its estimates after publication, both where assumptions are updated with actual travel data and to reflect methodological developments. Some of these revisions have been significant and, in line with our expectations, ONS is working to understand and explain them to users. ONS is also working to address concerns with measuring the migration of British nationals, which currently relies on the IPS. The IPS arrivals survey was terminated last year, so ONS is exploring alternative data sources for measuring these migrants.

Casework on migration statistics

Alongside ONS’s long-term international migration estimates, the Home Office produces a wide range of statistics describing the operation of the UK’s immigration system, including statistics on people arriving in the UK through irregular migration, asylum and visa applications. We support and engage with the Home Office regularly, particularly through our casework function, where we respond to queries or concerns about the production or use of statistics raised by organisations and members of the public.

A theme in our casework relating to Home Office statistics has been transparency. For example, figures were sometimes being used in the public domain by government officials ahead of publication, which falls short of our expectations under the Code of Practice for Statistics: we expect statistics to be released in an open and transparent manner that promotes public confidence. Following a constructive exchange of letters between our Director General and the Permanent Secretary at the Home Office, we have seen improvements in practice, with statistics communicated clearly and transparently and interim official statistics publications used where appropriate.

Migration is an important topic at the centre of much public debate, and it is important that statistics are used to get under the surface of complex policy questions. But sometimes we see instances where Home Office statistics are used incorrectly or unclearly in public debate, which can lead to misunderstanding or misrepresentation of the data. In addition to regulating the production of statistics, we use our voice to stand up for the appropriate use of statistics. So, for example, when concerns were raised with us that ministers had used incorrect asylum backlog statistics, we investigated and responded publicly to stand up for the correct use of statistics in the public domain.

Final thoughts

Combined, ONS and the Home Office are providing users with a wealth of information on who is moving in and out of the UK and why. Whilst OSR recognises the good work carried out by these official statistics producers, it is OSR’s role to ensure that these official statistics are of sufficient trustworthiness, quality and value and that they are used appropriately to inform public debate. In addition to our publications, our statistics regulators engage regularly with statistics producers to provide ongoing support and guidance, often before any issues arise. This direct engagement enables us to have a positive impact on official statistics.

Tracking the migration of any species has its challenges. Imagine trying to attach a tracking device to a humpback whale which can cover thousands of kilometres of remote and inaccessible habitats. Perhaps the expectation is that measuring human migration should be more straightforward. However, complex visa systems, entry routes and unexpected political events can create significant challenges that might encourage even the most knowledgeable migration statisticians to go fishing instead!

Advancing data linkage: key takeaways from the Administrative Data Research UK conference

Helen Miller-Bakewell, OSR’s Head of Development and Impact, reflects on the Administrative Data Research UK (ADR UK) 2025 Conference, and the takeaway messages for her and the Office for Statistics Regulation (OSR).

Recently I attended the ADR UK conference in Wales, From records to research: Harnessing administrative data to enhance lives.

It was an impressive event, with over 250 presentations across four themes, which spanned insights from research using linked administrative data; technical and social aspects of data access and linkage; and ethics and governance.

It was hugely stimulating, and, as I tried to order my thoughts on the train home, four key takeaways emerged for me:

1. Linked data are a powerful tool to help evaluate impact and inform positive action

There were so many presentations about research findings from the analysis of linked administrative data that have obvious relevance to current and future government policies. I personally heard presentations on topics as varied as the impact of Scout and Guide attendance on cognitive ability trajectories in childhood and adult health; patterns and predictors in A-level science choices in Wales; and the effects of Daylight Saving Time clock changes on health outcomes in England. But there were a huge number of other presentations, some related to health and education, as well as others relating to crime, economics and the environment.

It was exciting to see the step being made from ‘interesting analysis’ to ‘analysis with tangible impact’. Practical demonstration of the impact that data can have will doubtless remain important to secure future buy-in (and funding!) for the continued use of linked administrative data. As others noted at the conference, the current cross-cutting Government Missions offer researchers a clear opportunity to demonstrate this practical impact.

2. Data are not just numbers

Across the conference, there was widespread recognition that data are not just numbers. Rather, the numbers in data records represent individual people, sometimes at challenging points in their lives. It was encouraging and inspiring to hear about the variety of ways in which those working in the administrative data space are engaging with members of the public and involving them in decisions – at all stages of the research process, and in discussions around data access and governance.

As we become ever more ambitious about what is possible with data, the words of the Rt Hon Mark Drakeford MS, who gave an excellent keynote talk, hit home: “Taking the public with us is not something we can take for granted”. Public engagement on topics relating to data and statistics will become increasingly important.

3. We’ve come so far – but there’s still a long way to go

Those presenting at the conference recognised and celebrated the huge amount that’s been achieved so far in terms of enabling secure administrative data research. But several talks and workshops also highlighted new challenges ahead – such as being able to link data held in different secure data environments for single research projects; using secure data to train AI models; and the extraction and onward use of these trained models from secure research environments.

These ambitions raise new ethical and technical challenges – in some cases highlighting existing challenges that still need to be solved, such as reducing the time it takes to agree even ‘simple’ data access requests and conduct output reviews.

4. We still need to work on how to link data

Among the presentations I went to, few talked about the challenges associated with actually linking data. In Q&A sessions, however, these challenges – and the limitations they place on the scope of research and its findings – did sometimes come out.

Performing high-quality data linkage can still be very challenging. So, it’s important to keep going with things like the agreement and adoption of data standards and metadata, as well as the development of linkage methodologies. I heard an interesting presentation from the Office for National Statistics on its Reference Data Management Framework, which highlighted for me how much work is still going on to try to make linkage easier and better.
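To illustrate why linkage is hard – a toy sketch of my own, not an OSR or ONS method – consider how small recording inconsistencies defeat naive exact matching, even after keys have been standardised:

```python
# Purely illustrative: why exact record linkage is fragile.
# The same person can appear with small inconsistencies across
# administrative datasets, so keys must be standardised first –
# and often scored (probabilistically) rather than matched exactly.
from dataclasses import dataclass

@dataclass
class Record:
    name: str
    dob: str  # date of birth, as recorded

def linkage_key(r: Record) -> tuple[str, str]:
    """A crude standardised key: whitespace-trimmed, case-folded name plus DOB."""
    return (" ".join(r.name.split()).casefold(), r.dob.strip())

dataset_a = [Record("Jane  Smith", "1980-04-01")]
dataset_b = [
    Record("jane smith", "1980-04-01"),  # links once the key is standardised
    Record("Jane Smyth", "1980-04-01"),  # a misspelling: exact matching fails,
]                                        # motivating fuzzy/probabilistic methods

keys_a = {linkage_key(r) for r in dataset_a}
for r in dataset_b:
    status = "linked" if linkage_key(r) in keys_a else "no exact link"
    print(f"{r.name!r}: {status}")
```

Real linkage work layers far more on top of this – blocking, match weights, clerical review – which is exactly why shared data standards and metadata matter so much.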


Mark Drakeford delivers an opening address at the ADR UK conference 2025

What does this mean for OSR?

I left the conference interested and enthused, wondering how OSR can best provide support in all these areas. Two key things came to mind:

  1. The refreshed Code of Practice for Statistics, due for release later this autumn, will include several practices that relate directly to data sharing and linkage. Together these practices aim to make official statistics based on linked data more common, and to make the data used in official statistics more suitable for, and more available for, linking by others. They touch on process, technical and social factors. Including these practices in the Code will give OSR a firm basis to catalyse data sharing and linkage across the UK statistical system.
  2. We will continue to work with existing partners, such as ADR UK and PEDRI, which unites different UK organisations working with data and statistics, to improve how we all work with the public. And we’ll remain open to new partnerships in this space – especially where these have the potential to help make government data more widely available to accredited researchers beyond government.

Alongside these activities, we’ll continue to use our platform to champion greater data linkage, in a secure way, in research and statistics. Because, ultimately, this is an increasingly evidenced way to help data live up to their potential to serve the public good.

We want to hear from you

OSR is always delighted to hear about and champion work that demonstrates or enables effective data sharing, access and linkage. If you have a case study or would like to discuss our work in this area, please get in touch: regulation@statistics.gov.uk.


Related:

A reason to be optimistic: sharing and linking data on road traffic collisions – February 2025

How government can make more data available for research and statistics in 2025 – January 2025

Data sharing and linkage for the public good: breaking down barriers – September 2024

Efficiency in Action: Improving Productivity in Public Services

In our latest blog, OSR Director General Ed Humpherson discusses measuring public services productivity in a more sophisticated way.

Efficiency reviews are a recurring feature of public administration in the UK. HM Treasury runs spending review processes that typically require departments to consider how they might become more efficient. Governments have also periodically commissioned reviews from senior business people, tasking them with finding efficiency savings. Examples include the 2004 Gershon review and the 2010 Green review.  

These reviews usually consider efficiency in a rounded way – recognising that efficiency can emerge either as a reduction in costs to deliver the same level of output, or as the delivery of a greater level of output for a given cost.

But the reviews are often remembered for their focus purely on cost cutting. This is partly because of brutal fiscal arithmetic: when money is scarce, the Treasury needs to impose cost reductions. It is also because the headlines that emerge from the reviews focus on the opportunities to cut costs. When then-Prime Minister David Cameron welcomed the Green review in 2010, he said the report ‘goes into how much money we have wasted…over the last decade’. Similarly, the Gershon review adopted a very direct focus on cost cutting through its headline: “Releasing Resources to the front line”.

Perhaps cost cutting makes for a simpler message than the more complex “efficiency can be more outputs for the same cost” story. 

But in the long run, focusing on cost provides little insight into whether public services are becoming more or less productive. On this, the macro picture appears to be disappointing, with evidence pointing to slow growth in public sector productivity. For example, this piece by EY highlights the difference between private and public sector productivity. 

To judge productivity properly, we need measures which relate the resources used to what’s delivered – or, to put it in economic terms, measures which relate inputs to outputs.

That’s what some new work by the Office for National Statistics (ONS) focuses on – how to measure public services productivity in a more sophisticated way. 

The traditional way to count the contribution of public sector output in many of the world’s economies is an “inputs equal outputs” approach – that is, what’s produced should be valued at the cost of production. If society puts £30 billion into education, then society gets £30 billion of value back. The trouble with this approach is that it fails to account for any increases or decreases in public sector productivity – whether the resources being put in are being put to better or worse use – and this is a topic of clear public interest: are public servants delivering increasing value over time?
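To see the limitation in stylised form (a sketch of the accounting convention, not the ONS’s published methodology): if measured output is defined to equal the cost of inputs, then

\[
O_t \equiv I_t \quad \Longrightarrow \quad P_t = \frac{O_t}{I_t} = 1 \quad \text{for every period } t,
\]

where \(O_t\) is the value of output, \(I_t\) the cost of inputs and \(P_t\) measured productivity. Productivity is fixed at one by construction, so no genuine improvement or deterioration can ever show up in the figures.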

For this reason, since the 2000s, the ONS has sought to measure the public sector’s contribution by better valuing the outputs produced, both in terms of the number of outputs delivered and how their quality has changed over time: so, for example, if the NHS delivers more procedures over time, has the average patient outcome also improved as a result of these procedures?
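In stylised terms – an illustrative sketch rather than the exact ONS methodology – a quality-adjusted output measure looks something like:

\[
\text{Output index}_t = \sum_i w_i \, q_{i,t} \, a_{i,t},
\qquad
\text{Productivity}_t = \frac{\text{Output index}_t}{\text{Input index}_t},
\]

where \(q_{i,t}\) is the volume of activity \(i\) in period \(t\) (the number of procedures, say), \(a_{i,t}\) is a quality adjustment (average patient outcomes, say) and \(w_i\) is a cost weight. Productivity growth then reflects both doing more and doing it better, relative to the resources used.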

This quality dimension is really important: looking at education, ministers rarely give interviews about the number of students sitting exams, but they regularly focus on whether those students earn better results. Delivering public services is rarely simply about the number of outputs: quality dominates reporting and political debate.  

This approach sets the ONS apart from many other national statistical institutes, and the direct measurement of public sector activity is one factor in why the UK’s GDP levels fell further than other countries’ during the COVID-19 pandemic – see the OECD/ONS report: International comparisons of the measurement of non-market output during the COVID-19 pandemic.

For many countries, at least in the quarterly estimates, the value of output is equal to the input; if teachers are being paid, then that is the value of output. But for the UK, the ONS was more realistic: if schools were closed with some provision being made online, then teachers were generally producing less education, even though they were likely still being paid their usual salary. Overall, COVID-19 restrictions meant education output fell, and the ONS approach reflected this. 

The ONS has now built on this approach through its public sector productivity review. This review extends the approach of valuing outputs from health and education to other sectors of government activity, including welfare administration, tax collection and public safety.  

The ONS figures show a fascinating picture – one where, for example, NHS productivity has fallen since the pandemic in the light of more challenging workloads, but welfare administration has improved its productivity, reflecting the introduction of universal credit as a single benefit to replace multiple benefits. This is a classic productivity-improving change: broadly the same output (the value of benefits paid) requires fewer inputs to administer.
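As a worked illustration with purely hypothetical numbers, using the notation above: if the value of benefits paid (the output) is unchanged while the administrative inputs needed to pay them fall by a fifth, measured productivity rises by a quarter:

\[
\frac{P_{t+1}}{P_t} = \frac{O_{t+1}/O_t}{I_{t+1}/I_t} = \frac{1.00}{0.80} = 1.25.
\]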

Measuring quality improvements for public sector outputs is inherently tricky. It is hard enough for statisticians to measure quality improvements for products that are bought and sold in markets, like mobile phones and computers, and there is a long tradition of methodological challenges in doing so. These challenges are all the greater for public sector outputs where, typically, there are no traded markets, and where prices cannot reveal the way consumers value the product. 
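For market goods, one standard tool in that methodological tradition is hedonic regression, which prices a product’s characteristics so that quality change can be separated from pure price change (a stylised sketch; the blog does not name a specific method):

\[
\ln p_j = \beta_0 + \sum_k \beta_k x_{jk} + \varepsilon_j,
\]

where \(p_j\) is the price of model \(j\) and \(x_{jk}\) its measurable characteristics (memory, screen size and so on). Public sector outputs have no market price \(p_j\) to put on the left-hand side, which is exactly why the challenge is greater there.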

Putting that inherent difficulty to one side, though, for me, there are three benefits to the ONS approach: 

  • First, it is good to see ONS being a world leader in methodological developments.  
  • Second, the ONS productivity figures tell a more plausible story than a simple “inputs equal outputs” approach. The decline in NHS productivity seems to confirm what many other indicators tell us – a point made very well by the Institute for Government’s Gemma Tetlow in a recent appearance at the Treasury Select Committee (Q34 in this transcript).  
  • Third, and most importantly, this ONS work could pave the way for a more rounded approach to public sector productivity which gets away from the crude arithmetic of focusing on cutting expenditure. 

Let me unpack this third point further. What might a “more rounded approach” involve? The ONS figures can provide a solid base of measurement that enables government, Parliament and others, such as the National Audit Office, to track progress through time. By allowing a through-time focus, this could help avoid the periodic upheaval of the major reviews that have punctuated the recent past of public administration.

And this approach might also, by its consistent nature, move away from regarding transformation as a special project, something separate from the day-to-day delivery. It might instead lead to a view of increasing efficiency and changing services as part of an ongoing and iterative process of improvement. This echoes recent evidence I presented at the Public Administration and Constitutional Affairs Select Committee hearings on the UK Statistics Authority. I was talking about the approach to transforming statistical production in the ONS and the risk that core activities can get neglected within a high-profile drive for transformative change. I explained the risk as follows: 

‘Transformation is a bit special and gets all the attention; the core is seen as less exciting. … the strong message is, “Do not have separate, shiny projects and programmes off to one side; align continuous improvement to the core and do it iteratively”.’ (Q153 in this transcript).

What is a fair diagnosis of the challenges the ONS has faced may also provide insight at the macro level of government as a whole. Perhaps the best approaches to improvement and efficiency are iterative and embedded within ongoing delivery.

From Trend to Tool: Elevating Dashboards with Trustworthiness, Quality and Value

The world of statistics is not immune to the rise and fall of fashionable trends. Every few years a shiny new toy emerges that will solve all of our problems…or so we think.  

One example of such a trend is dashboards. Brought to public fame during the COVID-19 pandemic, the dashboard aims to provide information in a clear and easy-to-use format. The success of the UKHSA COVID-19 dashboard in conveying data to the public during the pandemic demonstrated how effective this method can be for data dissemination.

Dashboards also have other benefits, such as acting as good early warning indicators that show data changing as part of a wider picture. I like to imagine this much like the control panel in the cockpit of a plane where each button and light serves a specific function, allowing the pilots to address issues and ensure a safe and efficient flight. 

However, it is important to remember that dashboards also have some drawbacks. For example, the format often makes it harder to communicate uncertainty and can result in complex data being oversimplified. 

Whilst the initial hype around dashboards has diminished somewhat, especially now that there is a surge in interest in AI, it is clear that they will continue to be used across the statistical system. Over the past few years OSR has provided advice to official statistics producers on dashboards. We have now developed this further and produced our own dashboard guidance, which aims to ensure that dashboards are produced to the highest possible standard.

How to get the most out of dashboards 

A dashboard can be a great addition to a statistical release alongside a bulletin or article, in order to give users bespoke access to data and to help them visualise trends.   

User needs should be put at the forefront of any decision about whether a dashboard is needed and for how long it will be updated and maintained. Our ‘dashboard questions’ target the most prominent issues and challenges with dashboards and are intended to help statistics producers to consider the value of a dashboard for any statistical output. 

It is important that statisticians and analysts work closely with dashboard developers and data visualisation teams to ensure that the final product is produced to a high standard.  

Visual space can be limited within dashboards, and so there is a balance to consider between the level of information that should be included upfront and not overwhelming the user. To help with this, we have developed a three-layered communication framework that we encourage producers to adopt which is outlined in our guidance. 

Dashboards are also often used to bring together multiple sources of data in one place. A dashboard can be a useful communication tool where there is a clear user need for it as a product, so understanding how the statistics are used, and by whom, is important.

The importance of TQV 

Many of you will be familiar with the principles of the Code of Practice for Statistics: Trustworthiness, Quality and Value (TQV). We consider that these principles are universal and can be applied to almost any circumstance, including dashboards.   

Dashboards should be trustworthy (T) and present high-quality statistics (Q) that are valuable to users (V). Our new guidance, in line with TQV, is intended to support statistics producers in their development and use of dashboards. It doesn’t have all of the answers, nor cover all scenarios that producers may encounter when a dashboard is requested – but it does set out key principles that producers should adopt when making a public-facing official statistics dashboard.  

We want to encourage producers to think about the Code and its pillars of Trustworthiness, Quality and Value when they weigh up the advantages, disadvantages and risks of using a dashboard to disseminate data. This, combined with data visualisation guidance from the Government Analysis Function, should give producers greater clarity on best practice.

 

We would like to thank everyone who provided their views and feedback that helped to shape this work. We remain open to feedback on this guidance, so if you would like to discuss it, please get in touch via regulation@statistics.gov.uk.


Related:

Regulatory guidance: Dashboards

OSR’s emerging strategic themes for 2026-29

This blog sets out how our discussions with stakeholders, the recent focus on the performance of the Office for National Statistics (ONS) and the resulting evidence sessions at the Public Administration and Constitutional Affairs Select Committee (PACAC) have led us to reflect on our regulatory approach.

The core of our work at OSR is assessing whether individual statistical outputs comply with the Code of Practice for Statistics. ONS, which produces some of the most important statistics in the UK, has been experiencing a period of significant challenge. A key element of our role, as the statistics regulator, is providing support and challenge to ONS as it works to recover its economic statistics.

However, ONS is just one producer of the official statistics that we assess. We are responsible for providing assurance across the whole UK statistical system, which includes around 800 accredited official statistics. Only 15 per cent of these are produced by ONS. Our State of the Statistical System report, which we published in July, highlighted our conclusion that the system as a whole remains robust.

We have had conversations with a wide range of stakeholders about our role and our approach as we develop our new strategy, and PACAC has added an important and highly influential voice to our thinking. We have also discussed these emerging ideas with the UK Statistics Authority Board’s Regulation Committee, which provides strategic guidance and oversight to OSR.

The key elements of our proposed strategy, which will run from 2026 to 2029, are that:

  • we should continue to act as a rigorous regulator
  • we should take a systemic perspective and identify opportunities for system-wide improvement
  • we need to enhance our work to support the integrity of evidence in public debate

The Code of Practice at the heart of our work

The core of our regulatory model is setting clear standards for official statistics through the Code of Practice for Statistics, and forming judgements as to whether individual sets of statistics comply with the Code. We are currently updating the Code, with plans to publish a new version later in 2025. The new version will set clearer expectations on what statistics producers must do; will contain a better-defined statement of the framework of TQV (Trustworthiness, Quality and Value); and, for the first time, will include standards for the public use of statistics, data and wider analysis (what we call “intelligent transparency”).

We will use this new Code to underpin our judgements on whether statistics merit the status of accredited official statistics (AOS). These judgements are then reviewed and approved by the Regulation Committee, which provides crucial input into regulatory decision-making.

We take a balanced stance in forming our judgements. We don’t see our core purpose as being to criticise or undermine producers; in fact, we often highlight positive developments in statistics. Doing so can be just as effective as exposing weaknesses – because the endorsement empowers teams to continue to improve and innovate, and because it provides an evidence base of good practice from which others can draw.

We have assessed the designation of a wide range of statistics, including, for example, ONS’s statistics from the Crime Survey for England and Wales, the Census statistics (in separate reports covering Northern Ireland, Scotland, and England and Wales) and, most recently, Northern Ireland’s public transport statistics.

But equally, we are willing to identify when statistics do not meet the standards of the Code of Practice. We have done this for several key statistics, for example migration statistics (2019), employment statistics (2023), the Wealth and Assets Survey (2025) and, most recently, National Records of Scotland’s healthy life expectancy statistics.

Rigorous regulator

So what lessons are there for us to learn from stakeholder feedback, including the PACAC sessions?

Our first reflection is that we have made sound and appropriate decisions. These include withdrawing the status of accredited official statistics where appropriate.

As a result, we want to continue to act as a rigorous regulator, with the Code of Practice as our guiding light. By “rigorous”, we do not mean being harsh or critical as a performative stance. We mean that we form judgements through a clear and thorough consideration of the evidence, and that we will articulate our judgement on compliance with the Code in a clear and accessible way. The importance of clarity is something the Regulation Committee has emphasised to us in its oversight of OSR. So being a rigorous regulator does not mean that we will retreat from highlighting effective work. This recognition remains an important lever for us.

But we need to do more to make sure that the rigour of our judgements is clear in our publications. It also means that the requirements we set in our reports should be specific and actionable. We have already made significant changes here. After we commissioned a review of our work from Professor Patrick Sturgis, we tightened the specificity of our requirements and their follow-up. For example, in our systemic review of ONS’s economic statistics published in April 2025, we required immediate action from ONS: the publication of a survey recovery plan and an overarching plan for economic statistics. ONS published both on 26 June 2025, alongside the Devereux review. And our report Admin-based Population Estimates for England and Wales, published in July 2024, also required an action plan from ONS within three months, which ONS duly published in October 2024.

This all points to a need for OSR to continue to be a rigorous regulator, by:

  • putting our judgements at the heart of our publications, press statements and other communications
  • setting clear requirements and showing clearly how we follow up on them
  • making it easier to see a running list of our outstanding recommendations

System catalyst

We don’t just look at individual sets of statistics. We look at the whole system of statistics, produced by people in UK Government departments and in the administrations and agencies of Scotland, Northern Ireland and Wales.

Our coverage of the system is reflected in our regulatory work and our State of the Statistical System (SoSS) reports. The reports in 2021 and 2022 recognised – and in the 2022 report, celebrated – the statistical system’s response to the COVID-19 pandemic. By 2023 we were sounding a more concerned note – highlighting that resources were increasingly under pressure, that transformations of statistics should not come at the expense of quality, and that there was a growing risk from declining survey response rates. By July 2024, our SoSS report was describing a system under strain, pointing out that the decline in response rates to household surveys was becoming critical and that there was a need to consider whether economic statistics were meeting user needs.

Our 2025 report recognises the quality challenges facing ONS, alongside other producers encountering challenges with survey responses. But this report also emphasises that the system as a whole remains robust.

The 2025 report also describes opportunities to enhance the production, communication and ultimately the value of official statistics, through a continued focus on improvement. This improvement focus includes exploring data linkage and the potential use of AI in official statistics.

In terms of our systemic impact, one of our most notable achievements has been to embed the recognition of TQV across the UK Government analytical system. This framework encourages statisticians and analysts to think not just about the data they are collecting, but also about giving assurance on their approach to producing statistics and about how users will want to use them.

These concepts are not just a luxury. They are the underpinnings of public confidence in statistics, and they can offer a compass for statistics producers as they adapt their statistical estate to new demands.

We therefore want to continue to support the statistical system in the UK, both by highlighting key risks and identifying opportunities to adapt, innovate and improve.

Champion of evidence integrity

It is often remarked that we live in an age of misinformation and data abundance, in which public confidence in official sources of evidence may have eroded significantly.

OSR plays a role in addressing this environment of declining trust. The Code of Practice is itself a bulwark against the challenge of poor information. The Code’s new standards for the use of statistics (formally, the Standards for the Public Use of Statistics, Data and Wider Analysis) set a higher bar for how government makes information available publicly. We also have a casework function: anyone can write to us voicing their concerns about statistics being potentially misrepresented or misused, and we will seek to clarify the appropriate interpretation of the statistics. And alongside our regulatory work, we conduct research to deepen our understanding of how statistics serve the public good.

But we are not the only organisation that addresses these risks, and official information is only part of the overall information ecosystem. So, we need to work with a range of other organisations to help support the integrity of evidence, such as the Royal Statistical Society, Administrative Data Research UK and Sense about Science. And through our voluntary application approach, we support organisations that apply the principles of TQV beyond official statistics – contributing to evidence integrity more broadly.

Looking ahead

We see three strategic imperatives for OSR arising out of the current circumstances:

  • demonstrating that we are a rigorous regulator in our judgements and our communication of those judgements
  • being a catalyst for systemic improvement across the statistics system
  • continuing to support the integrity of evidence in public debate (through our work on intelligent transparency and our casework)

We will start to refine our strategy based on these three drivers, and propose a fully worked-up strategy to the Regulation Committee in autumn 2025.

As we flesh this strategy out, we would very much welcome feedback. Do these sound like the right themes? Have we identified the right lessons? What might we be missing? If you have any thoughts, please do get in touch with us via email at regulation@statistics.gov.uk.