Embracing Challenge for Change

Hear about the Office for Environmental Protection’s journey of applying Trustworthiness, Quality and Value (TQV) from Darren Watson, their Principal Environmental Analyst, in this guest blog on embracing the Code of Practice for Statistics.

How would a new organisation understand, and most importantly communicate, how – or indeed whether – the government is improving the natural environment? How does it make sense not only of the state of the environment but also the many policies and strategies across government that affect it?

These are the challenges that the Office for Environmental Protection (OEP) has had to confront since its creation a little over three years ago.

One way to examine progress is through statistics – in our case, using data such as the area of woodland or the number of water pollution incidents to present trends in key environmental indicators.

Deciding what to measure and what to present in a concise, understandable way that provides value to stakeholders, and ultimately contributes to improving the environment, is tricky. The difficulty lies in the range of policies and strategies in place, government targets and commitments, the numerous sources of pollution and their wide-ranging impacts, and the priorities and concerns of our stakeholders. Just listing some of our stakeholders – parliamentarians and policy makers, policy influencers, the media and the public – has been enough to make my head spin.

The challenge of measuring progress was evident in our first annual report on government progress in improving the environment, which we laid before Parliament in January 2023. At that stage we presented 32 indicators to measure the environment in England.

The work doesn’t stop there, however. Our progress reports need to evolve in response to the environment and policy landscapes, so we can never stand still. Our team therefore challenges itself to continually improve our assessment, to provide greater value for our users and the environment, and to respond to stakeholder feedback.

So, while we use others’ statistics (from bodies like the Environment Agency), rather than produce them ourselves, we are committed to applying the same high standards to our analyses.

As such, we decided to voluntarily adopt the Code of Practice for Statistics when developing our second progress report. The Code sets the standards to which producers of accredited official statistics (previously called ‘National Statistics’) are held. It is, in short, the best way to hold ourselves and our assessment to account, through striving to meet its three pillars: Trustworthiness, Quality and Value (TQV).

The most visible innovation in our application of TQV was our first Methodological Statement, published alongside our second progress report. At 95 pages, it was not lacking detail. But producing a report is the easy part; it is then how it is used, and the value it provides, that is most important.

So, a particularly proud moment for the team that produced the Methodological Statement came when our chair used it at the House of Lords Environment and Climate Change Committee to demonstrate the robustness of our recommendations. A tick in the box for trustworthiness and transparency.

But there is another example of a more fundamental use of TQV – one that shows its true value. And this goes back to our indicators.

Because the environment, the factors affecting it, our stakeholder needs and government are dynamic, our indicators must be too. They must adapt, and this is where using TQV – particularly the Q and the V – is key.

Our assessment process does not stop with the publication of our latest progress report every January. Following publication, we take stock of our progress and review and consult on our assessments. We also review those indicators, which is where the lenses of Quality and Value give us an ideal framework through which to challenge ourselves and be challenged by our stakeholders.

For Value, we ask ourselves:

Are our indicators still relevant to users? Do they, as a whole and individually, provide clarity and insight without making our assessment inaccessible? And do they improve and enhance our understanding, and our users’ understanding, of the environment?

For Quality, we consider:

Are we still using the most-suitable data sources? Have the data producers changed anything or stopped updating their statistics? Are our methods still sound and reflective of best practice? And how can we improve our data handling and quality assurance?

This is quite a list. But practically, the QV challenge has enabled the evolution of our indicators, with 23 new indicators included and 16 amendments made to existing indicators through our second and third reports. This work is driven by our vision to better understand those numerous factors that determine progress in improving the environment and to provide greater value and quality for our users. We hope this effort is demonstrated through the increasing awareness and influence of our assessments and their recommendations.

Our commitment to TQV and continuous improvement goes beyond this work. We are using TQV to examine how we assess trends and whether more statistically robust methods are available. We are also building an improved data system to increase the accuracy, quality and speed of our trend assessments, whilst supporting our ambitions to present data more effectively, including using maps.

So, to conclude: the OEP aims to be a trusted voice, and we are committed to being an evidence-led organisation that holds government and other public authorities to account. What better way to support this than to adopt TQV and show the users of our reports the effort we take to meet those aims?

Oh, and please take a look at our reports – we really do value feedback.

Beyond GDP: Redefining Economic Progress

Gross domestic product (GDP) is invariably a focus of debate about the state of the economy and whether this country, or any other, is making progress.

Yet it has its critics. Some complain that the focus on GDP as the single measure of progress distorts our priorities. They argue that GDP blinds us to many other important ways in which society should flourish.

There are indeed good grounds for thinking GDP is an incomplete measure of societal progress. Case in point: it has up to now omitted the depletion of the natural world, though this will be addressed in the upcoming updates to the international standards for the compilation of GDP – the new System of National Accounts WS.6 covers depletion.

Moreover, GDP does not capture lots of types of worthwhile activity that are not paid for (caring for a relative, for example) but does capture activities that are not worthwhile (like trading in illicit drugs).

Over 2024, we saw the maturing of several endeavours to address this issue. They fall into two camps. The first involves using the framework of GDP (or national accounts, to be more precise) to create a more comprehensive measure of growth and wealth. The second looks to develop adjacent measures that focus more directly on well-being and prosperity.

Let’s begin with approaches that focus on GDP.

One solution to this problem is to enrich the idea of GDP – to measure more things within the concept of the economy, like income, capital and growth. For example, GDP could measure the value of work done in the house and could incorporate the depletion of the natural world. A recent speech by the UK Statistics Authority Chair, Sir Robert Chote, highlights international work to widen what is captured as capital in economic measurement, and in particular to include natural capital.

A good example of an attempt to enrich GDP is the ONS’s recent inclusive income release. It supplements the standard GDP measures of output with measures of unpaid work, the costs of depleting the natural world and some elements of cultural wealth. The ONS summarises it well:

“Inclusive income estimates provide a broader measure of the economic welfare of the UK population. They reflect the economic value of both paid activity, included in gross domestic product (GDP), and unpaid activity, which includes ecosystem services and unpaid household services. The result is measures of economic progress that include activity and assets beyond those currently included in GDP.”

It’s an interesting endeavour, and provides some notable insights. For example, unlike GDP per person, inclusive income per person has not yet returned to its pandemic peak. In short, I applaud the ONS’s ambition in taking on this difficult and methodologically challenging work – though it has initiated a lot of debate within OSR, which my colleague Jonathan Price will highlight in a subsequent blog.
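
For readers who like to see the arithmetic, here is a deliberately toy sketch of the idea in Python. The additive structure and every figure in it are invented for illustration only; the ONS’s actual inclusive income methodology is far more sophisticated.

```python
# Toy illustration only (not ONS methodology): inclusive income extends
# GDP by valuing unpaid activity and netting off the cost of depleting
# natural assets. All figures below are invented.

gdp = 2_500.0                       # market output, GBP bn (invented)
unpaid_household_services = 350.0   # e.g. childcare, cooking (invented)
ecosystem_services = 40.0           # nature's unpaid contribution (invented)
natural_capital_depletion = 25.0    # cost of running down natural assets (invented)

inclusive_income = (
    gdp
    + unpaid_household_services
    + ecosystem_services
    - natural_capital_depletion
)

print(f"GDP:              £{gdp:,.0f}bn")
print(f"Inclusive income: £{inclusive_income:,.0f}bn")
```

The point is only that, once unpaid activity and depletion enter the measure, it can move differently from headline GDP.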

The second approach suggests that we should keep our focus on GDP more or less as it is (subject to the usual improvements and use of better data sources, as well as the communication of uncertainty; see our GDP review on this). And instead of extending the framework, it proposes supplementing it with meaningful alternative measures, including personal well-being measures, which focus on the things that GDP does not capture well. The ONS in fact provides an important foundation for these alternatives with its Measures of National Wellbeing.

A great example of the personal well-being approach is provided by Pro Bono Economics (PBE)’s recent report on the state of well-being in the UK, which estimates the number of people in the UK with low well-being. The report highlights what can only be described as a crisis of low well-being in the UK. (And full disclosure: I am a trustee of PBE).

The PBE report is not the only work that focuses on non-GDP measures of well-being:

  • Along similar lines, the BeeWell project has proposed measuring children’s well-being, and has been implemented in Manchester, Hampshire and the Isle of Wight.
  • Carnegie UK’s Life in the UK provides a comprehensive overview of well-being. It extends the analysis from personal well-being to broader societal well-being, including perceptions of democratic health.
  • Complementing this UK-level perspective, the Global Prosperity Institute’s work is also noteworthy. It is more granular and micro, considering the prosperity of small areas using a citizen research approach. Its application to areas of East London is rich in insights into the experience of redevelopment.

What these various outputs show is that the “Beyond GDP” space is maturing. The ONS is doing some thoughtful, innovative things to extend the framework of national accounts. And a plethora of independent approaches are emerging.

So I begin this year optimistically.

Could 2025 be the year Beyond GDP moves from being a slogan to a reality for policymakers, Parliament, media, and, most importantly, citizens?

Lessons in communicating uncertainty from the Infected Blood Inquiry: What to say when statistics don’t have the answers

In this guest blog, Professor Sir David Spiegelhalter, Emeritus Professor of Statistics at the University of Cambridge, reflects on his experiences in the Infected Blood Inquiry and the importance of transparency around statistical uncertainty.

In my latest book, The Art of Uncertainty, I discuss the UK Infected Blood Inquiry as a case study in communicating statistical uncertainty. In the 1970s and 1980s, tens of thousands of people who received contaminated blood products contracted diseases including HIV/AIDS and hepatitis. Many died as a result. This crisis, with its catastrophic consequences, was referred to as ‘the worst treatment disaster in the history of our NHS’.

The Infected Blood Inquiry was set up in 2018 after much campaigning by victims and their families. I was involved in the Statistics Expert Group established as part of the Inquiry.

Building a model for complex calculations

Our group was tasked with answering a number of questions surrounding the events, such as how many people had been infected with hepatitis C through contaminated blood transfusions.

Some conclusions were relatively easily reached. We could be reasonably confident in the data and its verification – for example, that around 1,250 people with bleeding disorders were diagnosed with HIV from 1979 onwards.

Other figures proved much more difficult to estimate, such as the number of people receiving ordinary blood transfusions who were infected with hepatitis C before testing became available. We needed a more sophisticated approach that did not involve counting specific (anonymous) individuals but looked at the process as a whole. Consequently, we built a complex statistical model to derive various estimates. However, because data were lacking for some parts of the model, expert judgement was at times needed to fill the gaps, so we had to account for multiple sources of uncertainty.
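
To give a flavour of what such a model involves – and this is a simplified, hypothetical sketch, not the Expert Group’s actual model – here is how several uncertain inputs, including ones informed only by expert judgement, can be propagated through a calculation by simulation:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # simulation draws

# Hypothetical inputs, each expressed as a distribution rather than a
# point estimate. Where data are thin, expert judgement can be encoded
# as a wide prior. All numbers here are invented for illustration.
transfused = rng.normal(1_000_000, 100_000, n)  # people receiving transfusions
p_infected = rng.beta(2, 300, n)                # chance a recipient was infected
p_chronic = rng.beta(60, 40, n)                 # chance infection became chronic

chronic_infections = transfused * p_infected * p_chronic

# The spread of the simulated results carries the combined uncertainty
# from all three inputs.
lo, mid, hi = np.percentile(chronic_infections, [5, 50, 95])
print(f"Median estimate: {mid:,.0f} (90% interval: {lo:,.0f} to {hi:,.0f})")
```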

Using this model, we were able to produce numbers that went some way to answering the questions we were charged with. However, some figures came with very large uncertainty due to the inherent complexity involved in their calculation, so we could not be reliably sure of their accuracy.

A scale for communicating uncertainty

To prevent people from placing undue trust in our findings, we wanted to convey the considerable caution with which our analysis should be treated. For this, we found the scale used in scientific advice during the COVID-19 pandemic to be a helpful model, in which confidence is expressed in terms of low through to high.

This scale was liberating; it allowed us to clearly convey our level of confidence in a way that accurately reflected the reality of the numbers. So, we could say that we only had moderate confidence that the available data could answer some of the questions we had been asked. And for others – for example, how many people had been infected with hepatitis B – we refused to provide any numbers, on account of having low confidence in being able to answer the question.
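
As a purely hypothetical sketch of how such a scale translates into reporting decisions (the levels follow the style of COVID-19 scientific advice, but the actions attached to them are illustrative, not the Expert Group’s formal rule):

```python
# Hypothetical mapping from confidence level to reporting action.
CONFIDENCE_ACTIONS = {
    "high": "publish the estimate with standard caveats",
    "moderate": "publish the estimate, prominently flagging its limitations",
    "low": "decline to give a number; explain why the data cannot answer the question",
}

def reporting_decision(question: str, confidence: str) -> str:
    """Return the reporting action for a question at a given confidence level."""
    return f"{question}: {CONFIDENCE_ACTIONS[confidence]}"

print(reporting_decision("hepatitis C infections via transfusion", "moderate"))
print(reporting_decision("hepatitis B infections", "low"))
```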

Lessons for the statistical community about communicating uncertainty

It can be difficult to admit to substantial uncertainty in data when dealing with a tragedy such as this. In the case of the Infected Blood Inquiry, this lack of certainty meant that the victims and their families could not get precise answers to various questions for which they deserved some kind of closure.

It is also undeniably important, however, that those producing statistics are open about how confident they are in their numbers, so that people understand when statistics can reliably answer their questions, and when they cannot. Indeed, being transparent about any uncertainty in published data is one of the principles that the Office for Statistics Regulation (OSR) promotes in its intelligent transparency campaign and its championing of analytical leadership to support public understanding of, and confidence in, the use of numbers by government.

Intelligent transparency demands that statistical claims and statements are based on data to which everyone has equal access, are clearly and transparently defined, and for which there is appropriate acknowledgement of any uncertainties and relevant context. This concept helps us understand how to communicate our findings when we are asked to answer questions regardless of the quality of available evidence. And it acknowledges that publishing numbers without appropriate context, clarifications and warnings is counterproductive to providing real public value.

So, when it comes to communicating statistics to the public, honesty – or transparency, as we call it here – really is the best policy. I am delighted to see OSR placing more emphasis on intelligent transparency, and how statistics are communicated more generally, in its proposals for a refreshed Code of Practice. Ed Humpherson has also written an excellent blog on why communicating uncertainty is a constant challenge for statisticians.

Culture, psychological safety, and the impact on quality

As part of our series of blogs examining topics addressed in the proposed refreshed Code of Practice for Statistics, Dr James Tucker, Deputy Director for Health, International and Partnerships at the Office for National Statistics, shares his thoughts about psychological safety.

We’ve all heard the phrase “we need a culture change” at some point in our careers. But what exactly does the culture of an organisation entail, and how does it influence the quality of our work? Without a deep understanding of what constitutes culture, it’s easy for it to become just another buzzword.

My interest in this topic grew significantly when I established the Government Data Quality Hub, which aims to improve data quality across government. While having the right standards, guidance, and methods is crucial, these need to be coupled with the right organisational environment to facilitate quality improvement.

This exploration led me to the concept of “psychological safety”. Psychological safety refers to an environment where everyone feels respected, can share their views, and has positive intentions towards one another. In the context of improving the quality of our statistics, this means people feel comfortable raising issues, challenging the status quo, and proposing new ideas without fear of being perceived as negative or intrusive.

Have you ever had an idea that you believed could lead to significant improvements but hesitated to share it due to fear of negative consequences if it didn’t work out? Unfortunately, this is a common scenario and is an example of a cognitive bias called loss aversion. This bias can lead to various unwanted outcomes, such as unshared ideas and overlooked errors. In summary, silence enables mistakes and prevents improvements.

Creating a psychologically safe environment is a collective responsibility, but it is particularly important for leaders to take the initiative. This goes beyond simply saying “please speak up” or “my door is always open”. It involves actively inviting contributions, creating various forums and groups for the exchange of ideas, and ensuring people have opportunities to get to know one another. Governance plays a critical role here; people need to know where to raise issues, how to escalate them, and who is responsible for what.

I strongly believe that psychological safety is fundamental to developing a quality culture in all organisations. I am always delighted to discuss this topic further, so please feel free to get in touch with me.

The Office for Statistics Regulation (OSR), in its proposal for a refreshed Code of Practice for Statistics, guides statistical leaders to “encourage a quality culture that promotes good practice”, including by “provid[ing] a safe environment and support[ing] staff in raising quality concerns”.

We believe that when producer teams feel safe to share new ideas and raise concerns, and are supported in these efforts, the public can have greater confidence in the quality of the statistics delivered.

OSR has itself actively sought and embraced feedback from a range of people in developing its proposal. It is through these conversations that we hope to further strengthen the Code and increase its value not just for the public but for all those who use it.

In the spirit of welcoming challenge and being open to innovation, we continue to invite all views on the proposed changes to the Code in our consultation.

Quality under challenge: Regulating statistics and data from the Labour Force Survey

In our latest blog, Head of Assessment, Siobhan, discusses the challenges of regulating statistics and data from the Labour Force Survey.

Many concerns have been raised about the quality of the Labour Force Survey (LFS) produced by the Office for National Statistics (ONS), the statistics produced from it and the challenges ONS faces in delivering the online Transformed Labour Force Survey (TLFS) that will replace it.  

So, what are we, as the Office for Statistics Regulation (OSR), doing about it? Our key interventions include: 

  • removing the LFS’s status as accredited official statistics to signal quality concerns to users 
  • where there are quality concerns, removing the accreditation of statistics based on LFS data and associated Annual Population Survey (APS) data. To date, we have removed the accreditation from 14 statistical outputs 
  • setting requirements for ONS to improve its communication and engagement, and to consider the lessons that can be learnt from the LFS 
  • reviewing ONS’s work to develop an online replacement for the LFS, the TLFS 

What is the Labour Force Survey?

The LFS is the main household survey providing labour market information in the UK. It is ONS’s largest regular household survey, outside of the census, and has been running since the 1970s.  

Statistics calculated using the LFS (and the related Annual Population Survey, also produced by ONS) are vital to understanding the world we live in. These statistics are used to estimate the number of people employed, unemployed and economically inactive across the UK, in different parts of the UK, and for different groups, such as young people, older people and those with disabilities. These statistics also inform key economic decisions, such as interest rates set by the Bank of England and government tax and spending. 

De-accrediting statistics derived from the Labour Force Survey 

OSR’s primary regulatory role is to independently review statistics to ensure that they comply with the standards of Trustworthiness, Quality and Value in the Code of Practice for Statistics. When statistics meet these standards, we give them ‘accredited official statistics’ status; when they fall short, we can remove that accreditation. We may decide on this course of action for several reasons: concerns about the quality of the data sources used to produce the statistics, evidence that user needs are not being met, or substantial changes to data sources and methods that require us to review whether the data remain fit for their intended use.

As we highlighted in our 2020 assessment, the response rate for the LFS has been steadily declining, and we called on ONS to address this issue and share any relevant information with users. ONS continued to develop its plans for the TLFS, which aims to address the shortcomings of the LFS. 

The long-standing response rate challenges facing the LFS were exacerbated by the COVID-19 pandemic. These issues became acute when the sample boost introduced to support operations during the pandemic was removed in July 2023. Following this, ONS had to suspend publication of its estimates of UK employment, unemployment and economic inactivity based on LFS data.

We removed the accreditation from LFS-based estimates and datasets in November 2023. We have also removed the accreditation from other outputs based on data from the APS. The APS is based on responses to wave 1 and wave 5 of the LFS plus a boost sample. 

Monitoring ONS’s work to improve the LFS

When ONS reintroduced LFS-based labour market statistics in February 2024, we carried out a short review of these statistics. In August 2024, we carried out a follow-up review to check the progress made against the requirements set out in our initial report.  

We identified four outstanding requirements, which focus on: 

  • communicating updates on both the LFS and TLFS in one place that users can easily access 
  • improved communication around the uncertainty in the data and what this means for the use of these data 
  • the publication of more-detailed information about the principles and quality criteria that ONS will consider in making further LFS improvements and the transition to the TLFS 
  • the publication of more-detailed information about ONS’s plans for improving the LFS from now until the transition to the TLFS and for transitioning to the TLFS  

We will continue to closely monitor ONS’s work to improve the LFS and plan to report on progress against these requirements next year. 

Reviewing ONS’s work to transform the LFS 

Over the last few years, ONS has been developing a different version of the LFS using an online-first multimode approach (that is, online first, followed by telephone, with face-to-face interviewers used to encourage households to participate).

Recognising the significance of these statistics for government and economic decision-making, we have carried out regulatory work throughout the survey’s development rather than waiting to review the final statistical outputs. The aim of this work is to share early regulatory insights to help ONS ensure the new survey meets the standards of Trustworthiness, Quality and Value, in line with the Code of Practice for Statistics.

We have carried out our review in phases, with each focused on the most relevant elements of the Code. The aim is to assess the statistics produced from the survey against all parts of the Code once the transformed survey is fully implemented.  

The first phase (which started in April 2022) focused on the design and development work ONS had planned before transitioning to the new survey approach. We published our initial findings in November 2022. In July 2023, we published an updated letter and progress report following phase two of our review. We are now entering the third phase of this work, which focuses on ONS’s engagement with users, its communication about its planned work and how it is assessing the quality and readiness of its transformed survey. 

We plan to publish the outcome of phase three of our review of ONS’s LFS transformation in early 2025. 

What’s next? 

A key theme throughout our regulatory work on the LFS and TLFS has been the need for improved communication and clarity, specifically around plans, sources of uncertainty and quality criteria for the transformed LFS. We also called on ONS to identify what lessons can be learnt from the LFS to more effectively and transparently manage and pre-empt quality issues in the future. So, we were pleased to see the comprehensive communication of ONS’s plans and activities in its letter to the Treasury Select Committee and in the update it published on 3 December. We also welcome its engagement through the stakeholder panel and the external methodological input it has sought from Professor Ray Chambers and Professor James Brown.  

We recognise there is still more to do. We will continue urging ONS to provide regular, clear, and comprehensive updates for users, as well as to seek challenge and input from key users and experts to ensure the future production of statistics that meet user needs. 

We are also working to understand to what extent response issues are impacting other household surveys used across the statistics landscape. We have asked ONS to consider whether the issues, concerns, and lessons it has learnt from the LFS apply to its other surveys. 

We will also carry out our own lessons learnt review, focusing on how we can best apply the Code of Practice as a tool to support transformation activities.

The Power of Conversation

In our latest blog, Head of OSR Ed Humpherson discusses our consultation on a revised Code of Practice, which is open for comments until 14 February 2025. Read more about the consultation and how to have your say here.

I have been asking myself why it is only now that I am writing a blog on our consultation on a revised Code of Practice, several weeks after its launch.

The consultation is big news for OSR and for the UK statistical system: the Code is our foundational set of principles, our conceptual framework, our guiding light. And it’s not as if we are proposing some mere tidying-up measures, the sort of pruning and weeding that a good gardener does to maintain their garden.[1] We are proposing some significant landscaping changes – particularly to the structure and presentation of the Code.

Perhaps the answer comes down to my observation that most endeavours in the world of statistical regulation depend on, and are enriched by, conversation. OSR’s best work – our most impactful reports and interventions – is effective because of our engagement and interaction with users of statistics, both expert and not, and with the people who produce the statistics.

To give two examples: first, our annual state of the statistics system report is not dreamt up by us in a meeting room; it builds on a whole series of conversations across the statistical system, both with users and producers. Second, our assessments of individual statistics draw heavily on engagement with users; take a look at our recent assessment of ONS’s Price Index of Private Rents to see this in action.

Launching a consultation is not an end point in itself. It is an invitation to other people to share their thoughts, reflections and criticisms.

Moreover, the part of the work I particularly enjoy is not the sense of achievement on the day of publication. It’s hearing all the subsequent reactions and comments, and joining in the discussions that ensue.

That’s why I was so happy last week to participate in a joint event between OSR and the Royal Statistical Society (RSS) to discuss the new proposed Code. We heard a range of interesting and thought-provoking reactions, such as those from Paul Allin, Honorary Officer of the RSS for National Statistics, on the importance of recognising the public role of statistics; from Ken Roy, an independent researcher and former head of profession in a government department, who highlighted that the Code is the glue that holds together the large and complex UK statistics system; and from Deana Leadbeter, Chair of the RSS Health Statistics User Group, who welcomed the ambition of a more digestible Code for a wider audience. And we had some excellent questions from the audience on topics ranging from the limits to trustworthiness (from a colleague in the Hungarian national statistical institute) to the importance of simplicity.

These productive conversations are why I’m looking forward to the debates and dialogues around the new Code in the coming months – including those with the Market Research Society and the Forum of Statistics User Groups.

I want to hear people’s reactions to the new Code. And I want to hear a wide range of other thoughts – not just about the things that we want to highlight, but about the things we have missed.

This emphasis on engagement and conversation is not only a core value for OSR. It’s also central to the Code of Practice itself. The new Code that we are proposing sets even clearer and firmer requirements for statistics producers in how they should engage with their users and transparently communicate how they have produced their statistics, and what their statistics do (and don’t) mean.

So, back to the question at hand: why didn’t I write this blog until now? It’s this: for me, the day the consultation is published is not always the best day to publish a blog. Instead, it can be better to wait until we’ve started to hear some views. Or, to put it simply: communication shouldn’t be about broadcasting a fixed view. Instead, it’s all about the power of conversation.

Read more about the Code consultation and how to have your say here. 

[1] What is it with me and gardens? I used to do a presentation all about walled gardens – how official statistics can’t be a walled garden, pristine but closed off from the world. They need to be open and accessible. Now, as then, I reach for a garden metaphor. It can’t be that I use these gardening analogies because I myself am an adept and successful gardener. I mean, you should just look at my own garden to realise that.

The Power of Public Engagement in Shaping Our Work 

In our latest blog post, OSR Research Officer Nick discusses how public engagement shapes our work, specifically in relation to our research project about how individual members of the public use official statistics to make decisions that relate to their personal lives. This OSR research is delivered by the Policy Institute at King’s College London and the Behavioural Insights Team and will be published in January (as discussed in a previous blog, “How do we use statistics in everyday life?”).

Here at the Office for Statistics Regulation (OSR), we are increasingly recognising that involving the public in our work is crucial for ensuring statistics truly serve the public good. Because of that, we have begun exploring public engagement.  

Public engagement is important and impactful  

For us, public engagement is the process of involving and collaborating with those outside of government and statistics to share information, gather input and foster meaningful dialogue on issues that affect them. Public engagement is about more than just asking people for their opinions. It’s about making sure that the work we do is grounded in real experiences and needs. By involving the public, we can build on our own technical expertise to make sure our work is meaningful and useful to the very people we aim to serve. 

OSR has begun engaging the public on our research 

Recently, we spoke to a small group of people to find out what they thought of different aspects of our current research project exploring how people use statistics to make personal decisions. These discussions have shown us just how valuable public engagement can be in shaping our research to be more relevant and acceptable to the public. 

In our public engagement sessions, we invited members of the public to share their thoughts on different aspects of our research proposal. We met with them virtually multiple times over the course of the project, so that we could tell them how their advice had helped us and continue to seek their views. This included discussing how they felt about the life decisions we wanted to explore in our research, such as choosing a school for a child. We also asked for their feedback on survey questions we plan to use in the next stage of our research. While this was a light-touch approach to public engagement (a small group that met online for only an hour each time), we still got a lot from these discussions.

We found this approach very valuable 

Public engagement is a powerful tool that enriches our work and fosters a collaborative spirit between OSR and the public. While this is relatively new for OSR, our recent sessions have demonstrated the real value of this approach. From this experience of public engagement, we have three main reflections: 

  1. People wanted to be part of these discussions 
  2. Public contributors are full of useful suggestions 
  3. It is important to seek diverse perspectives 

People wanted to be part of these discussions  

Discussions about statistics and research have the potential to be dry. However, when we spoke to members of the public, they were enthusiastic about the research and appreciated the opportunity to contribute. For example, one of our public contributors said: 

“If someone said to me, you are going to be involved in some statistics research, it seems like whether I will be able to do that? As a lay member of the public, maybe I don’t know?… but it is easy… I would like to be involved in future opportunities.”

This quote shows how important it is to give people the opportunity to be involved in these types of discussions. 

Public contributors are full of useful suggestions 

Attendees provided valuable feedback on various ways to improve our research, which has already informed several key decisions in our project. For example, we originally planned to ask people whether or not they used official statistics in choosing which school to send their child to. However, public contributors pointed out that not everyone actually gets to choose a school for their child:

“You might be able to have a parent that will look and say this is the school I want my child to go to based on their needs, based on the statistics. But the actual chances of them getting that school is very slim. So there may be people that feel that, yes, I can look the statistics, but then I have an inability to choose.” 

Because of this advice, we changed the decision to be about which school to apply to, rather than which school to send a child to. This type of change helped make our research more relevant to members of the public.

It is important to seek diverse perspectives  

Engaging with a diverse group of attendees allowed us to gather a wide range of viewpoints. For example, we heard from people whose first language was not English on how the statistics in the research could best be presented in a way that they could understand:  

“[Statistics are] in the English and I have language barrier, so how can I like utilise that… In the bar diagram it’s ‘OK this bar is high and this bar is low’ so it’s easy to understand.” 

This perspective led to sharing visual aids with participants in the first stage of our research rather than solely presenting a traditional statistical bulletin made up of prose. Doing so made our research more accessible to a broader audience, and allowed a wider range of participants to engage meaningfully. 

We plan to use public engagement more going forward 

The success of these public engagement sessions has reinforced our commitment to involving the public in OSR’s work. As part of this commitment, we are now part of the Public Engagement in Data Research Initiative (PEDRI). In addition, future research projects at OSR will build on the feedback received in this project, and we hope to undertake public engagement in key projects beyond research as well. In doing so, we will be aligning with our view that serving the public good means treating official statistics as public assets; this involves allowing the public to understand and engage with our work. Through public engagement about our work at OSR, we can ensure it is more transparent, trustworthy and responsive to the needs of the public. 

If you would like to learn more or even become a public contributor yourself, please contact us at research.function@statistics.gov.uk. We look forward to hearing from you.

Just three words – the birth of a code

In our latest blog, Penny Babb, OSR’s Head of Policy and Standards, speaks about her experience with the Kenya National Bureau of Statistics in developing the Kenya Statistics Code of Practice.

I am a self-confessed code-geek. I love talking about the Code of Practice for Statistics! I may be a tad boring at parties, but I have a passion for the Code that is hard to miss (and I hope is infectious).  

What better, then, than to get together with a group of people, away (mostly) from the distractions of work, to talk about codes for a week! There truly was no escape for my audience as we met on the shores of Lake Naivasha in Kenya.

It was such a privilege to work with the Director General and statisticians of the Kenya National Bureau of Statistics (KNBS), guiding them through the preparation of their own code of practice. This collaboration was part of the strategic partnership between the UK Statistics Authority (UKSA) and KNBS. It wasn’t about the UK Code; rather, it was an opportunity to share our experiences and to support KNBS as they wrote their first code – finding the message that they wished to share with organisations and analysts across the statistical system to inspire good practice. KNBS is committed to ensuring official statistics remain a valuable resource for all stakeholders in the statistical system in Kenya and to demonstrating international best practice. It sees the code as being central to achieving these goals.

One of the lessons I took from being involved in developing the second edition of the UK Code was the power of explaining it simply, in just three words. And as we have seen in our own recent review of the Code, our three words – trustworthiness, quality and value – are extremely effective in helping people producing and using official statistics to understand what matters. My goal was to help my Kenyan colleagues find their own three (or more!) words. 

The dedication and skill of the team of statisticians and leaders at KNBS was evident and central to completing their task of writing their code. The Kenyan code had to be one that was meaningful and inspiring in their context – not just to staff in KNBS, the national statistical institute for Kenya, but for all other organisations in their statistical system.  

In two week-long workshops, Caroline Wangeci (the then ONS strategic adviser to KNBS) and I led sessions to develop the Kenyan code of practice. Both codes are grounded in the UN Fundamental Principles of Official Statistics, but each speaks to specific needs within our countries.

The commitment and dedication of the Kenyan team was inspiring to witness and humbling to support. Writing a code of practice to set standards for statistical production and dissemination is challenging. But doing so in concentrated bursts of effort, away from day-to-day responsibilities, gave the team focused time to think through the key values and messages needed to inspire those involved in producing official statistics. After spending a day diving deep into code matters, many of the participants then spent further hours catching up on their work, ensuring the smooth running of their statistical teams.

KNBS has published the Kenya Statistics Code of Practice (KeSCoP). It is a remarkable achievement in its clarity and mission. As the Code highlights,  

“KeSCoP is anchored on three pillars, namely, Quality, Trustworthiness and Progressiveness. Each pillar contains values and commitments that producers of statistics should commit to when producing and disseminating statistics.” [page 1, KeSCoP]

KeSCoP is drawn from the principles and requirements of the Kenya Statistical Quality Assurance Framework (KeSQAF). The Code helps the drive for continuous improvements in statistical practice.  

“Compliance with KeSCoP gives producers of official statistics and the users confidence that published statistics are of high quality, have public trust and are produced by institutions and people that are progressive.” [page 2, KeSCoP] 

It echoes our UK Code but in a truly Kenyan way – they have defined their concepts and practices in ways that speak to their application of the UN Fundamental Principles of Official Statistics.

Quality has values for:

relevance; timeliness and punctuality; accuracy; coverage/comprehensiveness; methodological soundness; professional skills and competencies; and data sources

Trustworthiness has values for:

impartiality and equal access; confidentiality; accountability and transparency; and integrity and credibility  

Progressiveness has values for:

innovation; continuous improvement; user focus; and adequate funding

I love the clarity of each value and commitment, providing a clear steer for any producer on what it looks like to deliver official statistics to the required standard, with associated indicators or measures of success.  

We have much that we can learn from KeSCoP as we refresh the UK Code and seek to ensure its clear communication and accessibility. 

To find out more about UKSA’s international development work, see ONS’s published strategic plan for 2022–2025.

The importance of separation: Ed Humpherson addresses questions raised by the Lievesley review

In our latest blog, Director General for OSR, Ed Humpherson, speaks about how OSR’s separation from ONS is communicated.

Since Professor Denise Lievesley published her review of the UK Statistics Authority, most attention has been on the Statistics Assembly. The Lievesley review considered the work of the UK Statistics Authority, and its first recommendation was that the Authority should hold a Statistics Assembly every three years. The Assembly should elicit and explore the priorities for the statistical system with a wide range of users and stakeholders. And the first Assembly will present a fantastic opportunity to have a wide-ranging conversation about statistics and enable a discussion about the strengths and limitations of the current statistical landscape in the United Kingdom. So, it’s not surprising that this recommendation has generated a lot of interest. 

The Lievesley review raised other important issues. Some of these issues relate to OSR. In particular, she highlighted that OSR needs to improve how it communicates the separateness of its role from ONS.  

Separation matters to us. Indeed, when we look at official statistics, we start by considering the trustworthiness of the governance processes used by the statistics producers – by which we mean the mechanisms in place to ensure and demonstrate that the statistics are free from the vested interests of the organisation that produces them, and that they represent the best professional judgement of the statisticians.

Similarly, it’s important that our decisions reflect our best professional judgement and that, in relation to ONS, we can make those judgements without giving any weight to ONS’s own organisational interests. 

We have several arrangements in place to secure our separation from ONS. But if people don’t appreciate or understand them, they are not working. And the Lievesley review made clear that we need to better explain the processes that assure this separation to our stakeholders and the public. That’s why today we are publishing a statement that sets out, in formal terms, the arrangements that underpin our separation from ONS. 

The key features of the statement are: 

  • a summary of the governance arrangements, under which we report our work to a separate Committee of the Authority Board made up only of non-executive members and me – that is, with no membership role for ONS staff 
  • a summary of the arrangements for setting strategy and budget, which involve me and my team making our proposals directly to this Committee and to the Board, with no decision-making role for ONS staff 
  • a confirmation of my own personal reporting lines, which mean that I do not report to the National Statistician in any way, but directly to the Authority’s Chair; and that I have regular meetings with the Chair without the National Statistician or any senior ONS staff attending 

Please read the statement if you’d like to learn more about these arrangements. 

But I’ll close with Denise’s own words. The most important features of any governance set-up are the budget and the performance management. And on this, she was clear:  

“The existence of a small number of misunderstandings by users also appear to perpetuate, such as that the Director General for Regulation is line managed by the National Statistician (he is not) or that the National Statistician controls the budget of the OSR (he does not). Nor does the National Statistician attend Regulation Committee meetings.”

I hope the statement we are publishing today helps provide some reassurance and address the issues of perception identified by Denise’s review.  

Embedding the habit of intelligent transparency

In our latest blog, Director General for OSR, Ed Humpherson, looks at the importance of intelligent transparency for Governments across the UK.

Intelligent transparency is one of the most important sets of principles that we uphold. When statistics and quantitative claims are used in public debate, they should enhance understanding of the topics being debated and not be used in a way that has the potential to mislead. To help those making use of data in public debate, we have set out our three underpinning principles of intelligent transparency. These principles demand that statistical claims and statements are based on data to which everyone has equal access, are clearly and transparently defined, and for which there is appropriate acknowledgement of any uncertainties and relevant context.

We have promoted intelligent transparency to the UK Government and the Governments of Scotland, Wales and Northern Ireland. And the Chair of the UK Statistics Authority, Sir Robert Chote, set it out clearly in his letter to political party leaders ahead of the 2024 general election. We have also made a number of interventions to support the principle of equal access.

Intelligent transparency in conference speeches

Equal access means that all statements involving statistics or data must be based on publicly available data – preferably the latest available official statistics. Claims should not be made based on data to which ministers have privileged access, as this prevents the claims from being scrutinised and undermines confidence in official statistics.

We recognise that conference speeches by Ministers in particular can be difficult, as noted in this blog we published in September. Ministers want to highlight policy successes to their party members, but they do not have the input of the civil servants who would normally ensure that their statements use statistics and data appropriately.

In this context, we were recently made aware of a statement made by the Prime Minister, Sir Keir Starmer, at the Labour Party Conference regarding immigration returns. The claim in question was that there has been “a 23 per cent increase in returns of people who have no right to be here, compared with last summer”. At the time the Prime Minister made this claim, there were no Home Office data or statistics available in the public domain for the relevant time period to support this statement.

The importance of publishing data used in the public domain

Following the statement made by the Prime Minister, we engaged with the Home Office, and we welcome the ad hoc statistical release it published, which provides the underlying data relating to this statement. In most cases, we would want to see the release of unpublished data very soon after the statement itself. But we understand that sometimes, as in this case, providing the underlying data in a usable format may take longer. We consider that the approach taken by the Home Office is in line with the principles of intelligent transparency. It is a good example of how to resolve a situation in which unpublished information has found its way into a speech.

Working with Government to ensure best practice

We are using this example as an opportunity to re-emphasise the importance of intelligent transparency to a range of audiences, including Heads of Profession for Statistics, the wider Government Analysis Function and the Government Communications Service. We have also discussed the issue with officials in Number 10, who have confirmed the importance they attach to the appropriate use of data. We are happy to provide support and advice on the principles of intelligent transparency to departments – for analysts, communication professionals, special advisers and any other colleagues who may benefit. To discuss this with us, please contact us via regulation@statistics.gov.uk.

For all Governments, it is important to avoid using unpublished information in public statements. We are always happy to work with officials and advisers to embed good habits of intelligent transparency as fully as possible.