Skills for Care’s journey to becoming producers of an accredited official statistic

In our latest guest blog, Natalie Fleming, Analysis team leader in Workforce Intelligence at Skills for Care, talks about her experience and learning from the process of achieving accredited official statistics status. Natalie joined Skills for Care's Workforce Intelligence team as a data analyst in January 2022 after working in the NHS. Skills for Care is an official statistics producer with one accredited official statistic and plans for more in future.

Recently, Skills for Care underwent an assessment by the Office for Statistics Regulation (OSR) that resulted in the accreditation of its workforce statistics. I've personally learnt a lot during this assessment process; it's changed the way we work as a team and as a wider organisation.

Skills for Care manages the Adult Social Care Workforce Data Set (ASC-WDS) and produces reports using its data on behalf of the Department of Health and Social Care. We are in an unusual position: unlike many other official statistics producers, such as NHS England or the Office for National Statistics (ONS), we are an independent organisation rather than a government body. Our users include colleagues in government, local authorities, the Care Quality Commission (CQC), academics and care providers. Given the importance of the ASC-WDS data, it is vital that we demonstrate our findings are robust and reliable so that our users can have the highest levels of confidence.

Skills for Care’s journey to becoming an official statistics producer began in 2018, when the ‘Personal Social Services: Staff of Social Services Departments’ report began a transition of ownership from NHS Digital (now part of NHS England) to Skills for Care. In 2021, it was renamed ‘The workforce employed by adult social services departments in England’. You can find the latest report from February 2026 on our website. This report kick-started the official statistics assessment process, which has ultimately shaped the way the team collects, analyses and creates all of our outputs.

The first step was working with OSR colleagues in 2022 to formally acknowledge our voluntary adoption of elements of the Code of Practice for Statistics, namely the three core principles of Trustworthiness, Quality and Value. Then in 2023, Skills for Care was added to The Official Statistics Order 2023, making us a producer of official statistics – hooray! The third and final step came during 2024 and 2025: the assessment process to determine whether ‘The workforce employed by adult social services departments in England’ could become an accredited official statistic. The report finally became an accredited official statistic in February 2025 – a huge achievement both for the team and for the wider organisation.

A key area of improvement for my Workforce Intelligence team was accessibility – in every sense of the word. First, we’ve adopted a new written report template which is better suited to screen readers and has helpful features for users with accessibility needs. Second, we have worked over the last few years to improve the designs of our data visualisations. This means users can access a vast amount of data in a clear visual way. This improvement is highly valued by our users, who often prefer the visualisations to written analysis or spreadsheets. Third, we’ve published a full methodology paper describing how we create our workforce estimates on our methodology webpage. This replaces our old page, which was much more light-touch, and now offers the full details of the methodology. This new document won’t interest all our users, but for those detail-orientated readers it gives a much greater understanding and insight into our processes.

Lastly, in October 2025 Skills for Care launched our new and improved Workforce Intelligence website. The previous version was several years old and was originally designed for a much smaller portfolio of reports. As the product owner of the website redevelopment project, I oversaw user research with many users from different backgrounds across the sector. From these sessions, new user-led and then user-tested designs were developed. One aspect of the new site I’m particularly proud of is the improved accessibility features. The wording has been simplified and streamlined on every page, and the navigation menu altered to enhance the visibility of all parts of the website. We’ve received feedback that this has made a really positive difference for our users.

Since gaining official statistics producer status, other teams across Skills for Care have also improved their ways of working and processes when using our data. My team works collaboratively with colleagues in the Policy, Comms and Marketing teams to ensure findings are reported accurately and consistently when promoting new publications. This continues throughout the year via bulletins, newsletters and on social media.

Skills for Care has fully embraced being an official statistics producer. The organisation does a lot more than just data analysis, but the lessons learnt from our user research sessions and the documentation prompted by the OSR assessment process are driving other changes. Our official statistics producer status is helping to drive our internal data strategy, and more conversations are taking place about the use of data in different teams. The Marketing team is also looking to upgrade Skills for Care’s other websites, drawing on user research findings and examples from the Workforce Intelligence website. This will ultimately mean all users and partners of Skills for Care have an improved experience, and it’s lovely to see our hard work being championed by other teams across the organisation.

Beyond public letters: the hidden impact of casework

In this blog, Elise Rohan, Head of Casework, shines a light on the often unseen but highly impactful work of OSR’s casework function. Casework is the part of OSR’s role that investigates concerns raised about statistics from members of the public, Parliamentarians, journalists, academics and others. As she prepares to step away from the role, Elise reflects on recent private interventions and the lessons they’ve taught her about the importance of collaborative relationships.

 

One of the things I’ve learned in this role is that the most meaningful casework impact rarely comes from headlines generated by public letters – it comes from the behind-the-scenes conversations with producers. Our impact is greatest when we work together proactively rather than reactively.

When people think of OSR, they often picture public letters. But the reality is that most of our interventions happen behind closed doors. We are proportionate in our interventions and recognise that public letters are not always the best route to change.

Take our recent work with the Department of Health and Social Care (DHSC), where we were asked to look at its communications comparing the dangers of sunbeds with smoking. While smoking and sunbeds are both classified by the International Agency for Research on Cancer as Group 1 carcinogens, there isn’t evidence to suggest they pose equal levels of risk. Following our engagement, DHSC was quick to update its press release and social media posts to make this distinction clear and has committed to no longer using this comparison.

We also recently reviewed statements made by different ministers about the condition of flood defences. Verifying these claims required data from several different sources across the Department for Environment, Food and Rural Affairs (Defra) and the Environment Agency, meaning the claims weren’t sufficiently transparent and accessible to the public. In response to our engagement, the Permanent Secretary for Defra wrote to OSR agreeing to improve the accessibility of this information and committing to incorporate and consolidate asset condition statistics as part of its work to improve the existing official statistics on flood defence expenditure.

Our hidden interventions also extend beyond government. Earlier this year, the Greater London Authority (GLA) published analysis of recent trends in tourism data following our engagement with them about claims made by the Mayor of London on American tourism in London, where the basis for the claims was unclear.

In each case, whether we intervene publicly or privately, we stand up for statistics.

As I prepare to pass on the Head of Casework baton, I remain grateful for the open and constructive engagement I have received from producers and organisations I have worked with over the years. These individual cases and interactions have enabled us to bring transparency of statistics into focus and ensure statistics can play their role in democratic debate.

Casework will keep evolving, but I am confident that OSR will continue to work in partnership with others to ensure steps are taken to prevent the misuse of statistics and uphold the vision that statistics serve the public good.

Don’t Ask for Trust in Statistics. Earn It.

In our latest guest blog, leading British statistician David Spiegelhalter explores why trustworthiness – not trust – should be at the heart of statistical communication. Drawing on the influence of Baroness Onora O’Neill and reflecting on the updated Code of Practice for Statistics 3.0, he argues for intelligent transparency, honest communication, and a commitment to helping the public genuinely understand evidence. He also shares why it’s not enough for statistics to be trustworthy – they must be engaging too.

I give many talks to all sorts of audiences, from health professionals to business executives to attendees at book festivals. And, perhaps surprisingly, the Code of Practice for Statistics features in almost all of them (I do exclude school students from my propaganda).

This all comes from my obsession with the ideas of Baroness Onora O’Neill. She is a top philosopher, specialising in Kant, and she presented the Reith Lectures on A Question of Trust in 2002 – I still value the excellent book of her lectures. She was brilliant at distilling years of thought into short and clear statements, and one of these has had a huge influence on me, both professionally and personally.

In this age of misinformation and scepticism of authority, a repeated question is ‘how can we improve trust in science/institutions/public health etc?’. To which O’Neill replies, that’s the wrong question. Rather than trying to manipulate people into trusting us, we should be earning that trust by demonstrating trustworthiness. This is such a simple idea, presumably based on Kant’s idea of duty ethics (although I’ve never read any Kant), which places the responsibility firmly on the authority.

When I introduce this idea in a talk, many people in the audience take pictures of the slide, so I know it must be good. I then go on to show the Trustworthiness – Quality – Value (TQV) framework of the Code of Practice, showing Trustworthiness as the first pillar, although emphasising how important the Q and V are too. I feel I am channelling Baroness O’Neill.

I have recently had to update my slides with the new Code of Practice for Statistics 3.0. This rightly keeps to the basic core principles of TQV, which continue to form the basis for standards for official statistics. But I have been delighted to see the introduction of Standards for the Public Use of Statistics, Data and Wider Analysis. These focus on the way that statistics are communicated and used in public life, and are rooted in the idea of ‘intelligent transparency’ – incidentally another term introduced by Onora O’Neill. This includes equality of access and independence, but also enhancing understanding, which is my main interest.

Back in the pandemic in 2020, a group of us became very frustrated at the amount of frankly untrustworthy numbers being bandied around, by both politicians and commentators, so we tried to list what we thought were the vital components of trustworthy communication of evidence. Nature published our rant as a commentary, with our five points being essentially:

  1. Inform and not persuade
  2. Balance (but not false balance)
  3. Acknowledge uncertainty
  4. Be upfront about the quality of the evidence
  5. Pre-empt misunderstandings

These later got incorporated into the Government Communication Service RESIST 2 Counter-Disinformation toolkit.

The Code of Practice 3.0 contains the essence of these principles for trustworthy communication, for example saying:

  • Do present and use data and statistics objectively, being impartial and professional
  • Do clearly describe the quality of data and statistics, including uncertainty and bias in estimates and impacts on appropriate interpretation and use
  • Do not use statistics, data or wider analysis in a misleading way. This includes not cherry-picking figures, taking figures out of context or placing undue certainty on them.
  • Take proactive steps to prevent or minimise the risk of misinterpretation or misuse.

I feel particularly strongly about the last point. It’s not enough to suggest what the statistics mean; it is also vital to say what they do not. This could be thought of as pre-empting misunderstanding, but it could also pre-bunk misinformation – getting in there early before false claims start circulating.

There is one final issue that is not in the Code. When communicating, I believe that there is little point in being trustworthy if you are dull. While the information should not be trying to persuade people to think or do anything, I do feel that it is fine to try and persuade audiences to be interested – to engage in the evidence so that they can be better informed.

So I have a small suggestion for Code of Practice 4.0: don’t be dull.



Mental health statistics across the UK: What we know so far and what’s next

Sarah, OSR’s Regulator for our Health domain, looks at mental health statistics across the UK, with OSR’s findings so far and what’s next for statistics across England, Wales and Northern Ireland.

Understanding the mental health of people across the United Kingdom is essential for ensuring that government policies and health resources are targeted effectively. Back in 2019, the Office for Statistics Regulation (OSR) began a major piece of work looking at mental health statistics across the UK. This was initially prompted by increasing public interest in mental health. Much has happened since we started this work, including most noticeably the COVID-19 pandemic. More than ever now, high-quality data and statistics on mental health are vital for understanding people’s experiences, identifying gaps in services and planning improvements.

Since we started this work, we have published reviews of mental health statistics in both England and Northern Ireland, as well as a follow-up for England and blogs from producers in both England and Northern Ireland. Most recently, in January this year, we published our review of mental health statistics in Wales.

Our findings so far

Understanding mental health statistics across the UK can be confusing, as each nation collects and publishes information differently. This blog looks at the current state of mental health statistics in England, Northern Ireland and Wales, highlighting what’s done well and where improvement is needed, and what this means for users of the statistics. Our findings have highlighted a few core themes across the nations:

  • There are important gaps in the statistics – for example, there is not enough information about how many people have mental health conditions, what treatment they receive and what outcomes they experience. Users often struggle to answer even their core questions from the published data.
  • Data aren’t joined up – the data don’t follow a person’s journey through the mental health system, making it hard to understand what kind of support works best. There is a clear need for stronger data linkage across healthcare settings (primary, secondary, community) and for more data that follow individuals over the long term.
  • Statistics are hard to find and use – people often struggle to locate the data and statistics they need and to navigate different websites, and the strengths and limitations of published statistics, including what they can and can’t be used for, aren’t always clearly explained.

The state of mental health statistics across the three nations

England: The most developed system, but still with gaps

England has the largest and most-established mental health datasets of the three nations we have looked at. Data collection is mandatory: providers of NHS-funded care must submit data to national datasets including the Mental Health Services Dataset and NHS Talking Therapies. This allows for broad coverage of service activity and means a wide range of statistics are available on prevalence, service use and treatment.

NHS England has also developed a Mental Health Data Hub to address concerns around difficulties accessing and understanding the published statistics. The data hub provides links to a variety of statistical publications and dashboards from a number of organisations.

However, the data still don’t give a full picture of someone’s care journey, especially across different types of services. The quality of statistics can also be impacted by variation in adherence to data submission standards and differing data submission systems across providers. NHS England has done a lot of work to identify and start to address quality issues since our initial review.

Northern Ireland: Longstanding gaps, but major improvements underway

Historically, Northern Ireland has had a lack of reliable mental health data, and systems across the country have been inconsistent. This has limited the availability of official statistics and, in turn, user insight. Northern Ireland’s Mental Health Strategy (2021-2031) highlights its intention to develop a Mental Health Outcomes Framework. Such a framework ‘will help in the evaluation of what works and will ensure services are provided that deliver good outcomes for people while providing value for money.’

Northern Ireland is currently investing in a new electronic patient record system called Encompass. This system is being introduced to create a single digital care record for every citizen in Northern Ireland who receives health and social care. This should help standardise data in the future.

Wales: Data at a crossroads, with plans to improve but barriers to success

Our review of mental health statistics in Wales found that mental health data there are limited, with gaps on topics such as prevalence, outcomes and inequalities. Different health boards also use different systems, meaning that the data they submit are often inconsistent across health boards. There is no legal requirement for health boards to submit data to a standard dataset.

Statisticians in the Welsh Government have developed a mental health dashboard that brings existing data together into one place, but both statisticians and users emphasise the need for more person-level, outcomes-focused data.

To address these issues, the Welsh Government has proposed a national core mental health dataset that would link data across different services to track people throughout their care journey. However, progress on this dataset has been slow due to funding issues, IT limitations and the lack of legal requirement for all health boards to take part.

What’s next for OSR’s mental health work?

We have not yet examined mental health statistics in Scotland. We first plan to carry out a scoping exercise to determine what statistics already exist, their quality and what value they provide for users. Our next steps will be shaped by the findings of this exercise.

We will also continue to monitor progress in England, Wales and Northern Ireland, and highlight the need for improvements such as increased availability of data, better data linkage, clearer standards, staff support and improved access to data.

Celebrating Women in Stats and the refreshed Code of Practice: Reflections from Denise Lievesley

Denise Lievesley, CBE, is a leading social statistician whose work has shaped statistical systems in the UK and internationally.

As the former Director of Statistics at UNESCO, she founded the Institute for Statistics. She is also a past leader of the Royal Statistical Society and the International Statistical Institute, and her influence on the field is profound. Most recently, her experience was called upon to lead a review of the UK Statistics Authority.

This International Women’s Day is a reminder of the vital role that women like Denise play in ensuring that statistics accurately reflect the diverse experiences of everyone in society to strengthen public trust in data.

Denise Lievesley on Code of Practice 3.0

Launched in November 2025, the refreshed Code of Practice for Statistics 3.0 centres on three core principles: Trustworthiness, Quality and Value. Together, these keep users at the heart of statistical production and help build public confidence in official statistics.

OSR spoke to Denise about why the Code matters:

“The Code is really important in underpinning the importance of statistics that are trustworthy, and that’s essential for our democracy.” For Denise, the Code is also a practical tool: “I use it in an advocacy role because I’m trying to ensure that people understand why statistics are critical, the relationship between trust and trustworthiness, and what we do when something goes wrong.”

Code 3.0 introduces new Standards for the Public Use of Statistics, Data and Wider Analysis, setting expectations for how public bodies communicate statistics with transparency, integrity and accuracy.

Denise comments on this development: “I value the new version because it highlights the importance of users and recognises that trust in statistics is a shared responsibility across the wider community.”

As we mark International Women’s Day, it feels especially fitting to celebrate leaders like Denise, whose advocacy continues to promote the refreshed Code to strengthen our statistical system, and enrich our democracy through having credible evidence.

Denise welcomes the opportunity to support Women’s day: “Over the course of my career I have been very happy to see an increasing impact of the technical and professional contributions of women and to celebrate how diversity of statisticians (in terms of their education, backgrounds, circumstances etc) has improved the quality of the questions we can explore with our data. It is important that statisticians are free to address issues even if they are uncomfortable or challenge received wisdom.”

“Through my various roles in the UN and more recently in academia as well as my connections with official statistics agencies I have been so fortunate to help young people to thrive professionally by inspiring them to embrace their individual paths while drawing on the advice and experience of others, though I recognise that juggling priorities is harder these days.”

Denise concluded by reflecting proudly on her niece who: “In her thirties, is a full professor of theoretical physics. I am in awe of how she and her husband manage to nurture both their two small children, their research and the education of their students.”

Renewed momentum? The Statistics Assembly one year on

One year ago today, the UK’s inaugural Statistics Assembly took place in London. The word ‘world-leading’ can be overused. But it is not out of place to describe an exercise in user engagement on an unprecedented scale.

The Assembly was an inspiring event. It had several hundred attendees, with many more online. It was a crucial step towards the UK statistical system becoming more open to its users. This spirit was exemplified by the speakers: they came from across the UK and internationally, and none of the main speakers was from the Office for National Statistics (ONS).

The statistics system can often be criticised for a lack of transparency in how it sets priorities. Users can sometimes feel that consultation is more about producers broadcasting what they are doing rather than listening. In this context, the openness of the Assembly felt significant, perhaps almost revolutionary.

One year on, what can we say about the way that the Assembly has influenced statistics in the UK?

In short, I am worried that there needs to be a refreshed injection of momentum into this process.

The Assembly: a step change in user engagement

The Assembly was one of the key recommendations made by the review of the Statistics Authority undertaken by Denise Lievesley in March 2024. Denise heard the concerns from users of statistics about a lack of meaningful engagement. She saw the Assembly as a mechanism for addressing this weakness.

The UK Statistics Authority (UKSA), in partnership with the Royal Statistical Society, organised the Assembly swiftly and effectively. It took place 10 months after the Lievesley review was published. It was energising, creative and open. And there was a clear follow-up: in March, the National Statistician’s Expert User Advisory Council, chaired by Professor David Hand, distilled the extensive material generated during the Assembly into a clear report setting out four priorities:

  • Reinvigorate sustained and effective user engagement.
  • Ensure user needs for more-granular statistics are met (including small areas, urban/rural, sub-groups of society, under-represented groups and so on).
  • Commit to a significant scaling up in the use of administrative data.
  • Recognise the needs for UK-wide statistics and advocate for, and support, harmonised data where desirable.

So by March 2025, things looked good. The event had happened, it was a success, and a set of priorities had been identified.

A summer of change – and some progress

And then events took a different turn, with the publication of two reviews looking at long-standing ONS problems. First, in April 2025, we in the Office for Statistics Regulation (OSR) published a review of the ONS’s economic statistics, which highlighted deep-seated quality concerns. Then, in June, the Devereux review highlighted weaknesses in the ONS’s leadership, prioritisation and delivery. Since then there have been significant changes in the leadership of the ONS.

Over the last few months, under this new leadership, the ONS has been focusing on delivering an ambitious recovery plan, responding to OSR’s recommendation that it publish quarterly progress updates. It has done so with a commendable focus on openness, and with enhanced engagement with users of economic statistics.

The ONS is of course just one part of the UK statistical system, and it should not fall to the ONS alone to take forward the Assembly’s priorities. And there have been some important steps, by both the ONS and other statistics producers, to implement the Assembly’s recommendations, as set out on the UKSA website in December. Examples of progress include the publication of detailed information on the use of administrative data in the last census; the publication of a new approach to data sources; and a commitment to establish an online ‘trust centre’.

At OSR we have sought to progress the Assembly’s priorities. In particular, the new version of the Code of Practice for Statistics is more user-centric. In our day-to-day work, we challenge and support statistics producers to do better on user engagement. We have published a public involvement and engagement toolkit, which encourages producers to take a much wider lens on who they focus on when they undertake user engagement activities. Our push on intelligent transparency requires government departments to be open and proactive in making data available publicly, and our review of cross-UK comparability in June 2025 made systemic recommendations in line with the Assembly’s comparability theme.

The need to maintain momentum

There are lots of demands on the statistics system. The Assembly is just one of those, and it’s clear that all statistics producers are facing significant resource constraints.

But it’s hard to say that, one year on, the progress on the four priorities has been significant. This is recognised in the December update on the Authority website, which says that “We have not progressed development of the refreshed Authority user engagement strategy as quickly as we would have liked.”

But the Assembly’s priorities remain a powerful anchor for engagement with users of statistics, for two reasons:

Firstly, a lot of people committed time and effort to making the Assembly a success, in the expectation that it represented a substantial reset in how user engagement is thought about and delivered by UK statistics. It is important to realise the benefits of this commitment.

Secondly, there is a question of who statistics are for. In the course of 2025, there were two alternative versions of an answer to that question. The Assembly proposed the answer that statistics are for a broad, vibrant, engaged community of users. The Devereux review implied that the users who really count are the key institutions of the state drawing on economic statistics: HM Treasury, the Bank of England, the Office for Budget Responsibility – at least, for now.

Of course, both answers are correct. Statistics serve the institutions of the state and also a much broader range of users across society. An effective system holds these two sets of users in broad balance – recognising that statistics are for decision makers, but that ‘decision maker’ covers a very wide range of organisations and individuals in society.

Momentum regained?

The key point of this blog is simple. The Assembly represented a breakthrough in the way in which the statistics system opened itself up to its users. However, it should not be regarded as a one-off, but as an ongoing process whose momentum must be maintained.

Moreover, the process for appointing a new National Statistician is underway. The new National Statistician, leading across the entire UK statistical system, can inject renewed vigour into taking forward the Assembly’s recommendations.

One year on, my view is this:

Is there a risk of a loss of momentum? Yes.

Can it be regained? Absolutely.

Going beyond consultation to creative conversation about the Code of Practice

In this blog, Penny Babb, Head of Policy and Standards at the Office for Statistics Regulation, discusses her experience of refreshing the Code of Practice for Statistics.


 

As we’ve recently released the third edition of the Code of Practice for Statistics, I am keen to reflect on the experiences that have brought us to this point. I would also like to acknowledge the support of the Royal Statistical Society, and the many statistics producers and stakeholders who have contributed their ideas and views about the Code – thank you! Your contributions have challenged our thinking and enriched Code 3.0.

The long and winding road

In October 2023, we kicked off a review of the Code of Practice and asked the question – is it time to refresh the Code of Practice? Following a wide range of engagement activities, by February 2024, the answer was clear: Code 2.1 had served us well – it was solid, trusted and respected. But in the light of the shifting data landscape, technological advancements and mis/disinformation challenges, it was time to think about how the Code could evolve to meet these challenges. So, we prepared a draft Code 3.0 and invited feedback through a formal consultation, which ran from October 2024 to February 2025. And here we are now, in November 2025, with the latest edition, after further refining the draft Code to address the feedback we received.

Ongoing dialogue

Looking back to two years ago, when the Code 3.0 project kicked off, I’m amazed at how far we have come. We have had so many fruitful opportunities to hear from stakeholders across the statistical system and wider communities to inform our thinking. Our understanding about the Code has evolved through our regulatory work and through hearing about others’ experience and perspectives. In fact, some changes we’ve made to the Code are the product of engagement we’ve undertaken since the second edition was published in February 2018.

One example of how our thinking has been shaped by those using the Code is the development of the Code Principles. These largely stem from the work of the ONS sustainable development goal (SDG) team, which in 2019 needed a way to test non-official data sources for use in monitoring some indicators. Their work led us to develop a set of universal principles relevant to any analyst. In turn, Code 3.0 builds on these ideas to unpack Trustworthiness, Quality and Value (TQV) in 10 principles which can be applied by any analyst.

Active engagement

Engagement is not a purpose but a means to establish purpose. Active engagement was core to the development of Code 3.0. ‘You said / we did’ is an approach that is often used in response to consultations. We used it ourselves after the Code consultation to summarise what we heard and to give a feel for how we were planning to then act. But as a way of reporting engagement, it misses out the nuance and depth of the exchange of insight.

By focusing on engagement through listening and responding, we have been able to establish an interactive and iterative process rather than a one-off sharing of ideas. We saw feedback not just as a series of edits to be made but as many insightful points to be carefully considered within task and finish groups in OSR.

Open to check and challenge

An important element of establishing any dialogue is the exchange of understanding and respect. Listening is critical for this to occur, as is a degree of empathy. But for the dialogue to be successful, it requires all parties to be open to hearing from each other. Central to that is seeing yourself as accountable for your decisions and actions and being courageous enough to invite criticism as you determine how best to meet your responsibilities.

We have embedded an accountability framework within our Code 3.0 package, developed with input from David Caplan, who has been an important stakeholder throughout our Code development. David has worked in the GSS for 20 years, was director of research and analytics at the Audit Commission and, we are happy to hear, is the incoming Honorary Officer for Public Statistics at the RSS. At a Code/RSS event, David shared his thinking about an accountability model based on his previous experiences at the Audit Commission. This inspired us to take our framework forward.

Our accountability framework highlights that you must be willing to hear critical feedback: to give an account, be held to account and make good. Accountability relies on you first seeing yourself as accountable to others – it is an attitude. Once you have accepted this frame of mind, you can then find ways to make yourself accountable.

I love the simplicity of the accountability framework, although I recognise that following its principles is perhaps easier said than done. Unless you are willing to hear what others have to say, to reflect on it and then to set out what you are doing in response – whether you can or can’t, agree or disagree – you will be working in a bubble, isolated from hearing about ways you may need to evolve and grow. This kind of approach isn’t something new in OSR – we always aim to work to the standards we demand of others. In fact, thinking about and enhancing our own TQV will feature more prominently in our upcoming new three-year OSR strategy.

Call to action (and listening)

So, as you check out Code 3.0, see how it can focus your thinking on what matters in your own context. Make yourself accountable to your colleagues and stakeholders – invite and listen to feedback, be open to check and challenge, and be frank about your decisions and the information that’s informed them.



Related articles

What is it about TQV?

OSR launches a refreshed Code of Practice for Statistics: Embedding Trustworthiness, Quality and Value (TQV)

How official statistics producers need to embrace vulnerability in a time of crisis: ISI World Statistics Conference 2025 keynote speech

Director General, Ed Humpherson, recently gave the keynote speech at the ISI World Statistics Conference about how official statistics producers need to embrace vulnerability in a time of crisis. Watch the full keynote speech, or read the transcript below.

“Those of you who have been sitting here for a while and looking at the stage might have noticed that I’ve brought something with me – or a pair of things with me, in fact. You might think of them perhaps as a prop for my talk, or perhaps as a gimmick. What I’d really like you to think of them as, though, is a symbol, a token, a representation of openness and vulnerability. Or even think of them as a gesture by me, as your speaker this morning, to be open enough to leave a pair of my shoes on the table.

Because that’s what I’m going to talk about today. I’m going to talk about openness and vulnerability and how central they are to you, to us, and our identity as people who care about statistics. That’s my main point. I’m going to unpack it through various lenses – various ways of looking at the world – while talking about other things as well.

I’ve been struck – and I’m sure many of you have been as well over the course of this week – by a kind of anxiety in amongst all the great positivity about developments that statisticians around the world are putting into place. An anxiety that we live in times which are difficult for people who work in official statistics. I won’t enumerate all of the examples that might lead us to that conclusion, but you will have heard people talk about the threat of misinformation, political manipulation and political interference, and quality challenges.

And it has struck me that people here from an official statistics background might themselves be feeling vulnerable in the face of this poly-crisis, this multiple series of crises. What I want to persuade you of is that actually we should not let any crisis go to waste, and here is an opportunity for people who produce official statistics to demonstrate our secret sauce. Our secret sauce is openness and vulnerability. That’s the point of this talk.

A picture of a pair of shoes outside a house

But I’m going to begin somewhere else. I’m going to begin on the streets of the town I live in, Kingston upon Thames in South London. About four weeks ago, I was walking along near my house and I saw these shoes on a little wall outside somebody’s house, looking, I thought, quite forlorn – in fact, a rather sad pair of shoes on their own. And my first thought was that somebody had lost them, and they had been placed there in case the parent came back to pick them up. But then I realised, as I looked around, that there were other examples in nearby streets of people leaving things out outside their house. And what they’re doing is leaving things outside their house for other people to collect.

So here are three examples: there are people who say, I’ve got some stuff in my house, I don’t need it, but somebody might. I don’t know if you can see, but there’s a box here that says ‘free’ – it’s a very casually presented box, but it’s got some stuff in it people might like. And here, a little basket of scarves. Somebody’s obviously getting rid of their scarves. Somebody might like them. It’s a phenomenon of free exchange. And one of my thoughts about this has been that, a few years ago, I don’t think I saw this very much. I don’t think I saw people putting things out on the walls outside their house for other people to take.

You also see this phenomenon on the internet – free-exchange websites where people put things for other people to take. I thought that was interesting. But I put that thought away and went back to my day job. And then I started to realise there was something connected to our work, which I’ll come on to.

But what I want to say today is this: I will start with the evolving nature of trust. Then I want to talk about official statistics serving the public good, and then my organisation’s role, and then I want to talk about the crises and how we’ve responded, and I want to close by really driving home this point about vulnerability.


The evolving nature of trust

The evolving nature of trust. I’ve been reading a lot about trust recently. The literature has revolved around a paper written in 1995 by an American academic called Mayer. In Mayer’s model of trust, he says: for person A to trust organisation B, they need to perceive that organisation B has competence, integrity and benevolence. So that’s the standard. It’s called the CIB model: competence, integrity, benevolence. What’s interesting is how understanding of trust has evolved.

More recently, sociologists and psychologists who’ve looked into this say that over time, of those three, benevolence is becoming more important. That instead of trusting somebody because of their competence, or their integrity, you most need to trust that they are benevolent towards you. That they share your interests and they act in your interests.

The second element of this evolution is the observation that trust is shifting from being something which people look upwards to institutions to provide – the state, the church, a charity perhaps – into something which is more relational, more horizontal, more equal. It’s to do with interpersonal relations of shared equality, not of hierarchy. So these are two ways of thinking about trust. One is a shift away from just competence and integrity towards benevolence being more important. And the second is this shift from institutional and vertical to horizontal and relational.

The ‘shoes outside the house’ is an illustration of that, because I’m pretty certain that 15 years ago, the same people would’ve given their unwanted things to a church or to a charity to sell. They’d have gone upwards to an institution, whereas now they’re bypassing, disintermediating the institution. I also think it’s quite a vulnerable thing to do. There’s some vulnerability about putting something personal of yours out for everybody to see. So that’s the evolving nature of trust that’s going to come back in this talk in a few minutes.


Official statistics and the public good

Hold that thought while I just say a little bit about official statistics and the public good. And I have to make an apology here. The more I’ve been here this week and heard other people talk – I’ve heard Walter Radermacher say this, I’ve heard Steve McFeely say this – the more I think that maybe there’s something about the term ‘official statistics’ that isn’t quite right: that maybe it sounds too austere and too distant, too remote, and maybe we should talk about public statistics. And in the UK, the Royal Statistical Society has done a very nice body of work promoting the idea of public statistics. I hope that in future years, if I were to do this talk again, that’s what I’d say. I’m using the term official statistics now because that’s a term that we commonly use. But just to say, I acknowledge the limitations of the term official statistics.

Anyway, official statistics should serve the public good. This is the definition we use in the UK. I won’t read it out, but it emphasises that they are public assets.

Statistics serve the public good as public assets that provide insight, which allows them to be used widely for informing understanding and shaping action.

To fully realise the potential of statistics, the organisations producing them should place serving the public good at the heart of their work and be conscious of their responsibilities to society.

They are for the public. They’re about the public, and that is what enables them to be used widely for informing understanding. And moreover, my organisation – and I’ll explain in a second what we do – believes that to fully realise the potential, organisations, NSOs, official statistics producers must place the public good at the heart of their work. That is essential.

In my view, many official statistics producers convey their authority as if they’re a central bank. This is a photograph of the Bank of England. It’s a very powerful, robust facade. It emphasises its national nature, there’s a flag flying on top, it’s got strong pillars. Everything about it is conveying solidity and reliability, and confidence. It’s all competence and integrity. I think official statistics producers often convey to their societies their role, their authority, in this way. They say: We have very strong methods, we have very strong data ethics, we have tremendous frameworks for statistical disclosure control. We have competence, we have integrity. What I worry about is, if I’m right that the nature of trust is shifting, then competence and integrity aren’t enough.

The very strong facade of the Bank of England, the strong facade of an NSO, does not convey warmth or benevolence. It does not convey relationship. And I think that could be a growing challenge. Lots of the things which have happened in different ways around statistics might have something to do with this, as well as the more obvious causes. So I think benevolence and openness are really very important.


The Office for Statistics Regulation

So, what’s my organisation? We’re rather unusual in the global official statistics community. Not unique – as I’ve learned very well this week from Martine Durand at the French Statistics Authority, which does very similar work to us – but we’re certainly unusual.

We are a separate body, responsible for setting the UK’s Code of Practice and for reviewing compliance with that Code of Practice. And when we review compliance, I mean we really review it, because if we find statistics that are not compliant, we remove their accreditation as official statistics, which is quite a strong intervention. We act as a catalyst for systemic improvements. We do reports on things like data sharing or on cross-UK comparability, so that a user could compare statistics in Scotland with those in England and Wales and Northern Ireland.

Perhaps what we are most known for is that we are willing to stand up for the use of statistics – the appropriate use of statistics in the public domain. What that means is that we will sometimes make public statements about the ways in which statistics may be being misinterpreted or may be misleading for the public. I think that’s relatively unusual in the world of official statistics. But if you believe everything that I’ve said already, you can see how important it is to serve the public in this way.

So that’s what we do. Everything we do is built around three lenses on statistics which we call TQV: Trustworthiness, Quality and Value. Trustworthiness is about the organisation that does the producing, and I suppose this is closest to the competence and integrity part of trust. It’s about the organisation’s practices, its commitments, the ways in which it manages itself. Quality is about the data, how data are collected, how then they’re converted into aggregate statistics, the strengths and limitations, the uncertainties that surround those statistics. Value is about what the statistics mean for users in society.

So we always use this framework. Everything we look at will have a T and a Q and a V component. We think looking at any one of those in isolation will tend to only give a partial picture. So that’s TQV.

For today’s purposes, what’s so interesting about TQV is it’s really all about openness. Trustworthiness is all about being open to recognise your errors. If you find an error as a statistics producer, you don’t hide it away and think that’s embarrassing. You notify your users promptly, openly, clearly, because that is you doing your job. You always recognise the limitations and imperfections of the statistics that you produce. And you should not do that just in a very technical way, presenting your confidence intervals, for example. You do that in a way that helps a user know what they should or should not infer from the statistics.

Openness to uncertainty: understanding and conveying that statistics are always an imperfect measure of what they’re trying to capture. Being honest and clear about that uncertainty.

Openness to users: always having an open door to understand what users want and how they use statistics.

And finally – and I think this is often underplayed – openness to dialogue. People will challenge the statistics that are produced by statistics bodies. They will see a flaw in them. They will say, hang on, I’ve delved into your underlying data, or I’ve looked at your metadata, and something isn’t right here. Welcome that – that is absolutely fantastic. That is your best friend in terms of quality control. Be open to dialogue, debate, different voices and different perspectives.

And if you’re open in all of those ways, you will be demonstrating vulnerability. You’ll be opening yourself up. You will metaphorically be putting your shoes on the table, or putting your shoes on the street. That’s the secret sauce. That’s what TQV is all about.


Three challenges that have faced UK official statistics

So the three challenges that I want to talk about, which I think illustrate these themes in practice. The first is the way in which statistics have become weaponised in political discourse. By weaponised, I mean, instead of statistics providing an insight to a user, they get used in political debate as a way of repeating the same number over and over again, almost until it loses all its meaning as a way of sort of hammering home a simple and often misleading message. That’s the weaponisation of statistics.

Second, we know that we live in a world where users demand real-time or as close to real-time data as they can. That’s a difficult demand for statistics producers to meet.

Third, and very specifically in the UK, I’m going to talk about the challenges which have faced and are facing economic statistics, which Roeland alluded to earlier.

So, weaponisation. I’ve got a picture here of the most famous weaponised statistic in recent British political history. This is to do with the Brexit referendum. There was a statistic pasted on the side of a bus. My organisation, OSR, we got involved in highlighting the ways in which that might be misleading.

For a long time as a result of that experience, I thought that our job was to highlight something which was wrong and say it was wrong, and it was only subsequently that I realised there was a much bigger and more important story around this whole thing, which is in the phrase ‘that’s not my GDP’. For those of you who don’t know it, this is the story, it’s a real story. During the debates around whether Britain should vote to leave the European Union or not, an economist was presenting to a town hall meeting in the north of England and he said, well, one of the things about Brexit is you might want to think about the impact on GDP. And he said, ‘GDP has been rising, maybe we don’t want to endanger that’. And a lady in the audience put her hand up and said, ‘that’s your bloody GDP, it’s not mine’. And what is meant by that is, that’s kind of a big abstract number. You know, I don’t eat GDP. I don’t clothe my children in GDP. My children’s shoes are not GDP. I want to know about something which is meaningful to me.

I’ve always thought that is such a great way of illustrating the ways in which statistics, when they’re weaponised, can become remote from the people that they’re about. If people say those are not my numbers, I think we’re in trouble. So, over the succession of things I’ve listed there – starting with the UK’s contribution to the European Union, then claims about educational performance, then the COVID pandemic, particularly debates around vaccines, and more recently debates about migration and debates in our 2024 general election – I’ve shifted my view from thinking it’s our job to highlight what might be called ‘lying scoundrels’. Frankly, I don’t think that anymore. I think it’s our job to represent the people who might not see the numbers as relevant to them. That’s the way to respond to the weaponisation of statistics.


Excess demand for real-time data

This little screenshot I’ve got here is of a fantastic tool that the UK Department of Health produced: a real-time COVID dashboard, updated daily, which gave a real-time picture of infections, hospitalisations, deaths and, when vaccines came, vaccine rollout. You could visit it, and you could click down to an incredibly low geographical area. It was incredibly popular. At its peak, it had about 70 million hits a day, which is an astonishing number for a single government website on a single day.

We responded to this. You might expect that a statistics regulator would suck its teeth about all of this dirty data going out – you know, where’s the quality assurance on that? But in fact we really supported it. We did some rapid reviews. We allowed them to release it not at the standard time of 9:30 but at 4:00 PM, because that was the time when the daily government press statement came out; we wanted to make sure that the information was available to everybody at the same time. We continually focused on the transparency of government evidence. And we said, you know, there’s a code of practice, which we obviously love, but the most important things are the principles of the Code. As long as the people producing this kind of dashboard followed the principles, perhaps the detail of compliance matters less.

The final kind of crisis is the one which is, I guess, upon us now: the quality of UK economic statistics. And there are many ways of telling this story. I’m going to focus in on survey response rates, which I guess have been the most proximate public reason that people have become concerned about economic statistics. I have two charts here. The upper one shows that the UK has always been towards the bottom end of response rates for Labour Force Survey statistics. The UK is the heavy black line; the other lines are other European countries. The lower chart is a different survey, the Living Costs and Food Survey, showing – because it’s too small, you won’t be able to see the time period, but really in the period after the pandemic – a really significant drop-off in the response rates to that survey, rendering it increasingly unrepresentative. That’s the problem: the lack of representativeness.

Our response to this process of declining quality is a different kind of openness. Maybe it’s a painful kind of openness. It’s the openness which says, these statistics are no longer as reliable as they were. As a result, we have removed the official statistics designation from several statistics produced by the Office for National Statistics. I’ve listed them there. We also did a thematic review looking at economic statistics in full earlier this year, which said: there are really significant problems, and the first step for ONS to get its head around these problems is for ONS to fully recognise them as problems and then set out plans to respond to them. It’s no good just to say, don’t worry, we’ve got a plan, nothing to see here. Open up and say there are real problems. And the great news is the ONS has done that. That’s my bottom box here. There’s an excellent recovery plan now, which fully recognises the issues, and is showing the openness which has been the theme of my whole talk.

So what have I learned from these challenges? In a sense, what’s my vulnerability that I can share with you? The first is that I made a mistake earlier on in doing this job. I thought we were campaigners to defend the truth. I no longer think that. I don’t think that’s the right way to think about statistics, because it puts statistics on a pedestal they can’t possibly ever sustain. What we are is advocates of a kind of transparency in the use of statistics that enables an intelligent user to understand what they mean. That’s what we do. When we make an intervention, when we write to a minister, when we make a statement in the media, that’s what we’re doing. Intelligent transparency. It’s not a fact-checking, truth-correction role. I’ve learned that.

The second thing I’ve learned is that in times of high demand, an organisation like mine can be a real pain in the ass. We can say, you know, you need to do things in the right way and in the right order. And I think it’s really essential that we demonstrate flexibility, that we support producers in their responses. And then the third thing is, I talked about the succession of problems with economic statistics. I honestly think we could have highlighted the problems more clearly. For everything that has happened, we did pick it out as a problem; I’m just not sure we did it quite loudly enough. And that is one of my reflections, one of the things I say in the mirror in the morning: maybe I could have done a better job there.

The key point, though, through all of this – what I’m trying to get across – is that the technocratic distance of ‘we are the official statistics body, we have the best statistics’ won’t help any of you who are official statistics producers. What will help is openness and vulnerability.


Openness and vulnerability in practice

So I just want to use my final section to say what I think that means in practice. The first thing I think it means is that you should be on the front line, out there, engaging with users. There are some great examples of what the ONS in the UK is doing to encourage citizens to respond, to see if response rates can be increased by direct and appealing engagement. The two screenshots I’ve got are of two approaches. ‘It starts with you’ is the tagline. I think it’s rather nice and, although I’m not sure whether the evidence is fully in, I understand that the early signs are that it really does help. So the lesson is: be on the front line, go out there and engage. Whilst doing that, recognise that although I’ve used the word ‘public’, of course that’s a really silly thing to say. There’s not a public; there are multiple publics. There are different people in different places, in different communities, who have different interests.

And just to illustrate that, my organisation produced a piece of work earlier this year called ‘Statistics in Personal Decision Making’, where we went out to a survey panel and also some focus groups, and asked individuals how they use statistics in their daily life. And it was completely fascinating. The answer is sometimes not at all, and sometimes much more than you’d expect. But that kind of insight – saying there’s a public out there which is not really very well served by our current notion of users as experts – is really important. So: recognising not a single public.

And then three things to avoid. Avoid distance. If you’re in an official statistics producer and you ever think, well, we’re conveying ourselves as this authoritative ‘best producer of statistics there is – you can trust us because we have all of this competence’, you’re probably going down the wrong path. Associated with that, don’t ever fall into the trap of thinking the statistics you produce are perfect, that they’re a perfect single point. I always get really worried when ONS publishes a statistic, some expert user says ‘I think it’s flawed here’, and the ONS response is ‘this is the best available estimate’ or ‘this is using the best available methods’. I always think that’s so dismissive. Don’t find yourself saying that. It’s a mistake.

And the final one. I always wanted to be the last person in the world to have an AI component to my talks because I don’t know anything about it, basically. But then I realised I can’t really do that – we’re in the world where everybody has to talk about AI. So this is my little nod to there being an AI thing out there.

I think there’s a tendency in the data and AI world to regard data as a block of stuff that you shovel in. And I love this image of a man shovelling coal into a steam engine – it summarises the way AI is conceptualised: statistics become just another thing that you shovel in, and then some magic happens inside the furnace. I think we should be really wary of that, because of course we all know that the richness comes from the insight, from the metadata, from the limitations as much as from the strengths.

So that’s my attempt to say something about AI! I don’t think it’s very convincing, but I thought I better put it in because everybody talks about AI in their presentations these days.


Conclusion

So here is my conclusion.

If trust is shifting from institutional to relational,

and the key to trust is now benevolence,

then official statistics producers need to avoid technocratic distance

and instead they must demonstrate openness and vulnerability.

That’s my argument. And just to put it another way, visually: if you find yourself thinking like a central bank, all pillars and solidity: don’t. Don’t think like that. Instead, think of this: the shoes in the street.

Thank you.”

What is it about TQV?

Helen, our Head of Development and Impact at the Office for Statistics Regulation (OSR), reflects on why the principles of the Code have endured and flourished, even as the social and technical landscape has changed around them.


OSR has recently refreshed its Code of Practice for Statistics, which sets the standards for the production and onward communication of official statistics across government.

Code 3.0, the new Code, has evolved in several important ways since the previous edition, reflecting substantial changes in the data landscape. These include an increased demand for statistics from users, and developments in the ways that statistics are produced and disseminated. However, the core principles of the Code – Trustworthiness, Quality and Value (TQV) – remain, providing the framework for a Code that ensures statistics serve the public good.

The principles of TQV are simple:

Trustworthiness is about building confidence in the people and organisations that produce statistics and data.

Quality is about using data and methods that produce assured statistics.

Value is about ensuring statistics support society’s needs for information.

Since the TQV framework was introduced in Code 2.0, it has become firmly embedded across the statistical system. This includes within OSR – forgive me if I’m wrong, but I’m not sure any other government analytical standards have made it onto a team-leaver’s gift hoodie!

Hoodie with TQV logo

So, what is it about TQV that lends itself to this persistence (and devotion)? I have a three-part theory:

1. TQV is intuitive

Given common definitions and understanding of these concepts, it’s hard to imagine that many people would consider trustworthiness, quality and value bad things to see in statistics, or indeed to see in most things in life! Committing to uphold the standards of Trustworthiness, Quality and Value when producing and communicating statistics should mean, for an end user, that the numbers they’re using to inform their understanding and decisions are reliable tools to understand the world. Would anyone ask for the contrary?

2. TQV is universal

TQV provides an ethical framework that can support the publication of any kind of analysis or evidence. It is relevant for, and can be used by, anyone in any organisation who wants to ensure that the information they provide is relevant, informative and impactful. TQV does not prescribe a particular way of producing or presenting statistics; instead, the focus is on building public confidence and delivering public good, which allows for and encourages the adoption of new and innovative approaches. And TQV remains robust in the face of threats to the public good, helping organisations to demonstrate their commitment to public interest and to mitigate concerns about misinformation.

3. TQV is easy to remember!

This sounds frivolous, but it’s important. I’ve worked as a government statistician for 5 years and a regulator of statistics for 6, and I still don’t know every practice of the Code off by heart. There is probably a statistician out there who does, but I haven’t met them yet. However, I can remember TQV. My understanding of what these concepts mean and the outcomes they can support, even at a high level, has consistently enabled me to make informed decisions, to act, and to advise and (I hope) inspire others to act in ways that serve the public good. I think others out there, in many different roles relating to data and statistics, could say the same.

So, there you have it – my theory on why TQV is so powerful. We’d love to hear your own views and experiences of using the refreshed Code. If you’d like to get in touch, please email regulation@statistics.gov.uk.

Trust in Statistics: Launching the Refreshed Code of Practice

OSR asked the CEO of the Royal Statistical Society (RSS), Dr Sarah Cumbers, to reflect on the refreshed Code of Practice in a guest blog. The RSS works closely with OSR to support its regulatory work, including partnering with us on the annual TQV (Trustworthiness, Quality and Value) award for voluntary application, which celebrates organisations adopting these principles in their daily practice for everyone to see.

The Code is the essential foundation for our statistical system, underpinning the public trust society relies on by clearly communicating what matters when working with data, and driving up standards.

The RSS has been closely involved in the Code’s review process. We were pleased to host two dedicated roundtables with the OSR, one in November 2023 and another in late 2024, to give our members an opportunity to share their views and engage with OSR on how the Code could evolve.

It’s genuinely encouraging to see how many of the issues raised in those sessions are reflected in the revised Code, including the need for stronger user engagement. This direct line from member input to policy change shows the strength and influence of the statistical community when we speak together. We also submitted a formal response to the consultation, underlining our call for users to be placed at the heart of decision-making.

The Code is much more than a set of standards on paper; it’s a guide to best practice that supports better decision-making across the UK.

Independence is a central theme. In a period when the relationship between statisticians and politicians is under scrutiny globally, the Code’s emphasis on protecting impartial professional judgement is crucial. The integrity of official statistics depends on statisticians being able to work freely and professionally, and the Code provides a vital safeguard for us in the UK.

There’s always more work to do to ensure statistics fully serve the public good, and the RSS is particularly keen to see further progress on user engagement. The revised Code’s call to put users at the centre of the system provides a valuable catalyst for this, and we are looking forward to working with OSR, ONS and the wider government statistical service to discuss and agree exactly what that should mean in practice across the statistical system.

The Code matters because, above all, trust in statistics matters. The refreshed Code is a great step forward, and the RSS looks forward to supporting its use and continuing this conversation with both the OSR and our members.

