The journey to improving income-based poverty statistics

The Office for Statistics Regulation (OSR) recognises the important role that organisations play, both within and outside of Government, in seeking to understand poverty through data and statistics. In our latest guest blog, Ainslie Woods, the Income and Earnings Coherence Lead at the Office for National Statistics (ONS), discusses how government statistics producers are working together to improve poverty statistics in response to our ‘income-based poverty’ review published in 2021.

For more information about our regulatory work please sign up for our newsletter.

With the rising cost-of-living in the UK, there is an ever-growing public and policy interest in the effect this is having on poverty levels. Poverty is a term commonly used by the media, politicians, policy makers and the public – but what does it really mean and how do we measure it?

My role, as the Income and Earnings Coherence Lead, is to work with official producers from the Department for Work and Pensions (DWP), the Office for National Statistics (ONS) and HM Revenue and Customs (HMRC) to improve the coherence and accessibility of our statistics. Following OSR’s 2021 income-based poverty statistics review and related blog, the trouble with measuring poverty, we’ve been working with statistical producers to help build the bigger picture on poverty in the UK – so what has been done?

What do we mean by poverty?

Poverty can be defined in terms of household disposable income, which can be used to identify those on low income, commonly referred to as income-based poverty statistics. The statistics are one of many factors used to inform key policy decisions such as the recent increase to the National Living Wage.

Poverty, as defined in terms of disposable household income, is commonly measured using two approaches:

  • people in relative low income (households with less than 60% of median income)
  • people in absolute low income (households with less than 60% of the median income in 2010/11, held constant in real terms).

These approaches can both be measured before housing costs and after housing costs. The statistics are published annually by DWP in its Households Below Average Income (HBAI) publication.
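As a rough illustration of how the two thresholds differ (this is a simplified sketch with made-up figures, not DWP's actual HBAI methodology, which involves equivalisation and housing-cost adjustments):

```python
from statistics import median

# Hypothetical household disposable incomes, £ per week
incomes_2023 = [210, 340, 425, 510, 640, 720, 880, 990]

# Relative low income: below 60% of the current median
threshold_relative = 0.6 * median(incomes_2023)

# Absolute low income: below 60% of the 2010/11 median,
# held constant in real terms (both figures here are invented)
median_2010_11 = 420
inflation_factor = 1.35  # illustrative price uplift since 2010/11
threshold_absolute = 0.6 * median_2010_11 * inflation_factor

in_relative_low_income = [x for x in incomes_2023 if x < threshold_relative]
in_absolute_low_income = [x for x in incomes_2023 if x < threshold_absolute]
```

The key difference is that the relative threshold moves with the current income distribution, while the absolute threshold is anchored to 2010/11 and only uprated for inflation.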

Although low income is an important aspect of poverty, there are other facets of poverty too. The HBAI publication also includes data on ‘material deprivation’, which provides an indication of people’s ability to access or afford a range of everyday goods and services.

Measures of persistent low income (used as a measure of persistent poverty) are available in DWP’s annual Income Dynamics publication. It is widely agreed that the impact of long-term poverty on individuals is worse than when poverty is experienced only for a short time, therefore these statistics provide important additional information to the HBAI release.

The ONS has also historically produced income-based poverty statistics. Prior to the UK’s exit from the EU, this was mainly through Eurostat. The ONS is exploring the user need for these statistics with a view to re-introducing poverty statistics, possibly within an annual financial wellbeing publication. User views on the future of ONS poverty statistics are welcomed as part of the transforming the ONS’s household financial statistics consultation which closes on 23 February 2023. ONS continues to engage with a wide range of stakeholders on the use of poverty statistics, including the Social Metrics Commission (SMC) following its September 2018 report ‘A New Measure of Poverty in the UK’.

Improvements to our statistics

In late 2020, representatives from ONS, DWP and HMRC came together to form our cross-Government Income and Earnings Coherence Steering Group which provides the overarching direction, insight, and leadership needed to deliver improvements. We want to ensure we are producing high quality data and analysis to inform the UK, improve lives and build the future.

In response to OSR’s review, we have implemented a range of improvements, including:

Longer-term work is also progressing well as we continue to review methods and work to maximise the use of administrative data, including:

  • DWP’s existing long-term work to develop integrated survey-administrative datasets (see section 2.5 of DWP’s statistical work programme).
  • ONS’s transformation of its household financial statistics (including household income, expenditure and wealth). By combining current surveys into a single survey, in conjunction with alternative data sources, it will be possible to deliver higher quality, more timely and in-depth analysis of households’ financial well-being. The transforming the ONS’s household financial statistics consultation (and associated blog) closes on 23 February 2023. A consultation response will be published in Spring 2023.
  • ONS’s research on the potential use of administrative data to produce social statistics for a range of population characteristics, including ethnicity and income. Experimental admin-based income statistics (ABIS) provide a useful early prototype that demonstrates how administrative data sources can be used to measure occupied address (household) income. ONS has also started to explore the potential for administrative data to produce a measure of income by ethnicity by combining two admin-based datasets. ONS is in the early stages of exploring this, but it has released an initial case study on producing admin-based income by ethnicity statistics (ABIES) for England.

As part of our follow-up engagement with OSR, I have been regularly liaising with Vicky Stone, OSR’s Labour Market and Welfare Lead and it is very encouraging to hear that OSR are pleased with the improvements made so far.

Where next?

As the rise in the cost-of-living continues, more and more emphasis will be placed on these statistics by Government and decision makers. It is my role, in conjunction with relevant producers, to ensure that collaborative cross-Government work continues, and I look forward to continuing to work with users and the OSR. While many of the statistics are UK wide, statistical producers across the UK will continue to work closely with the Devolved Governments of the UK to understand their needs and priorities. For more information on our planned work and progress please see our collaborative plan which was updated on 18 January 2023.

Analyse your potential: Internship and placement opportunities at OSR

There are many internship and student placement opportunities available throughout the Civil Service, and in the Office for Statistics Regulation (OSR) we try to give our interns a comprehensive, rounded experience to add to their skills and development, while ensuring they are treated as part of the team. In our latest blog, we talk to some recent placement students and interns about their experiences working at OSR.

Izzy – Government Statistical Service (GSS) Sandwich Year Placement, now working at HM Treasury

I did a sandwich year placement at OSR, between studying Philosophy, Politics and Economics at the University of Leeds. The world of statistics regulation was not one I had been previously acquainted with, so I was very much going in blind.

I think one of the most eye-opening things I learnt at OSR was just how easy it is for statistics to be misused and misinterpreted! I came to understand that there is so much more to a statistic than just the number in front of you, and how important it is to be critical in your interpretation of data and evidence. This was a really useful lesson to learn, and definitely something I’ve taken forward in my final year of university, and beyond.

My placement highlight was definitely playing a leading role in OSR’s systemic review of poverty statistics. I got to work with departments across Whitehall, as well as interview a range of stakeholders including leading charities, think tanks and researchers. As a university student, it was extremely cool to be speaking to people whose papers I had spent my degree reading! It was also fascinating to learn about the complexities of measuring poverty, and to make a meaningful contribution towards improving how this is done within a space I care a lot about.

Since finishing university, I’ve started working in HM Treasury as a Policy Advisor specialising in economic risks. In my short time there, I’ve already found my year at OSR to have come in incredibly handy. I’ve learnt to engage much more critically with statistics and data when I use them in my work, which, as a policy advisor, is a really vital skill. I’ve also managed to hold my own in quite a few heated debates on inflation measurement, which is definitely not something I could have said before…!

I think statistics are hugely important for allowing the public to hold government to account for the decisions it takes. They enable people to understand and assess the motivations behind policy decisions that have a direct impact on their lives – as well as how effective those decisions ultimately are. In that way, statistics are a really crucial link between politics and the public – which is why OSR has such an important role to play in making that process as transparent as possible.

Ewan – The Government Economic Service Sandwich Student Placement Scheme

I’m currently part-way through my placement year working at OSR, between my studies in Economics at the University of Bath. I’ve been working in the Economy and Business, Trade and International Development domains.

My highlight of this year had to be working on casework. No specific case comes to mind; rather, I compare it to being a statistics detective. Hunting around, trying to track down a claim’s sources, never loses its appeal, and it is a great feeling to find the smoking gun – the source or figure behind the claim. Lastly, the work feels like it has a direct impact: by regulating the use of statistics, OSR builds confidence in the statistics used in the public domain.

I wasn’t aware of OSR before my placement started, but I quickly understood how important it was that both the production and the use of statistics are kept to a high standard. It’s important to consider how statistics are used and the potential damage from their misuse.

The world is complicated, and people aren’t omniscient. Statistics are a way of depicting vast amounts of information in a clear understandable way. But statistics aren’t simply nice little descriptions of the world to be quoted at pubs. Statistics can make or break public policy proposals. The public should have as much information as possible to make informed decisions that best shape our future. This requires accurate and timely statistics which the public can trust. Furthermore, statistics are not just used for public policy as statistics are intertwined with our daily life. People use statistics to decide where their kids go to school or whether to stop smoking. Therefore, it is important that statistics are produced and presented to a high standard.

Following my placement, I will go back to university for my final year of my degree. From there, I plan to apply for the Government Economic Service fast track scheme.  I’ve really enjoyed my time at OSR – everyone has made me feel so welcome, and every day has felt different. The work varies quite substantially, and I rarely repeat a task. There is always something new to do and work rarely feels monotonous.

OSR has helped me develop as a person, both professionally and academically. I have no doubts that the skills I have obtained here will benefit me greatly in my academic and professional career. OSR has highlighted to me that I want to work within the public domain and have a positive impact on the lives of the general public.

Martin – The Summer Diversity Internship Programme (SDIP) (now the Summer Internship Programme (SIP))

Working at OSR was a great way to see first-hand how statistics are regulated. I gained an appreciation for how useful statistics are for users and the importance of their being trustworthy, high quality and of public value. I was involved in many projects whilst at OSR, and whilst the work was challenging, I found it incredibly rewarding.

At first, I was nervous about working in an office environment, especially one focussed on statistics, as I graduated in film and philosophy. However, I quickly found out that OSR also provides roles that are more analytical than number-heavy, which I was very happy about. On arrival at OSR I was assigned a great line-manager, Grace, who inducted me into the organisation and introduced me to the team. I was very lucky as Grace was glad to answer any questions I had. I was also assigned a buddy, Ewan, who helped me settle in and answered my questions too.

I completed SMART (Specific, Measurable, Achievable, Relevant, and Time-Bound) targets during my time at OSR. For example, I compiled a data evidence list in which I assessed and recorded whether sources contained specific information. This was an interesting task as it allowed me to see what information each source contained and built my experience in using office tools. I also constructed a survey on development and wellbeing within the department, which allowed me to gain an understanding of the thoughts and feelings of those working in OSR and assess how this could be improved. These tasks were useful as they provided me with transferable skills such as ICT use, project management and analytical skills.

Working at OSR also offers opportunities to work with other government departments. After asking my line-manager if I could do some work within the Ministry of Justice (MOJ), I was given an opportunity to work with Ben and Job from OSR’s crime and justice domain on a compliance check of MOJ’s statistics on Women and the Criminal Justice System and an assessment of Scottish prison population statistics. I found this work very interesting and am very grateful to have been given the opportunity to work on it.

Even though SDIP is online-based, I was able to visit the Newport office and both London offices, which gave me the opportunity to meet some of my colleagues in person and get a real feel for day-to-day life on the job.

Overall, I would say the experience was highly positive as I feel I have improved my skillset since starting at OSR. Everyone being kind and friendly has also made it an experience that I won’t forget.

OSR is always keen to hear people’s views on statistics and how they are used. To get in touch with us, or just to stay up to date with our work, follow us on Twitter and LinkedIn, and sign up for our monthly newsletter.

A day in the life of a Regulator

In our latest blog, Job de Roij, our lead regulator in the Crime and Security team, based in our Edinburgh office, demystifies the job of a statistics regulator.

Find out how to apply for our current Statistics Regulator vacancies below

Statistics regulator. It’s quite a strange-sounding job. If you asked a random person on the street what they think a statistics regulator does, they would probably say that it’s someone who regulates statistics. But if you asked them what this means, they would probably draw a blank. Fair enough – most people aren’t familiar with the idea of regulating statistics. 

So, what does a statistics regulator do? A wide range of things! There is no such thing as a typical week – what we do can vary from week to week – but there are some common activities. The journal below gives you a flavour of the kinds of things we do. I lead the Crime and Security team, which covers all statistics published by official bodies on crime, policing, justice and defence, so all the projects I mention are related to that area.   


In the morning, I checked our information dashboard, an internal tool developed by our Data and Methods team. It scrapes data from government statistics release calendars, news websites, Twitter, and Hansard, which helps us monitor the release and use of official statistics. I came across several interesting media articles and Twitter threads about hate crime data. I reviewed what they said and decided that there was nothing I needed to investigate further.

In the afternoon, I spoke to a couple of users – an academic and a think tank – of the Home Office’s police officer uplift statistics, which report on progress with the recruitment of an additional 20,000 police officers in England and Wales. We recently assessed these statistics against our Code of Practice for Statistics (the Code). Assessment is one of the main tools we use to review whether statistics are meeting the standards of trustworthiness, quality and value set out in the Code. Speaking to users is a really important part of assessments. It helps us understand what the statistics are used for and what users think of their trustworthiness, quality and value, to identify where improvements can be made.


In the morning, Kirsty, our Head of Private Office and Casework Manager, emailed me about a piece of casework I’ve been working on. Casework is what we call our work investigating issues raised with OSR. We have an interventions policy that sets out when and how we intervene in the use of statistics. We can raise casework ourselves internally, or members of the public can raise casework with us, which could be anything from something identified as misleading in a government report to something a Member of Parliament (MP) said on Twitter. For the case I worked on today, I investigated the concerns raised and prepared a briefing with a summary of the issues, analysis of the statements and statistics, and recommendations for what we should do next.


My morning started with two back-to-back team meetings: an Edinburgh site catch-up and an organisation-wide “Cascade” with colleagues in the Edinburgh, London and Newport sites (and those based elsewhere). It’s a chance to hear about what’s happening across OSR and to share any important updates. In today’s meeting I gave an update about the Crime and Security team’s work, heard about work in our eight other domains, and heard about next year’s business planning from our Senior Leadership Team.  

Every other Wednesday afternoon we have team learning sessions. These are mostly used to share knowledge across the team, but we also use them to discuss ways of improving how we work. Today was a ‘Code Case Learning’ session, where two colleagues presented some regulatory work that they’ve been involved in, covering a specific principle of the Code of Practice for Statistics. The idea is to help regulators develop their understanding of aspects of the Code and hone their judgement in applying it.


In the morning I met with the Deputy Head of Profession for Statistics and Head of Publications at the Ministry of Defence (MOD). We discussed new developments to their statistics and an idea I have for a short review of a set of MOD statistics. As regulators, we meet regularly with statisticians in different departments to find out what is happening with their statistics and how we can best support them to meet the standards of the Code. 

I spent the afternoon drafting a letter to the Department of Justice in Northern Ireland setting out the results of our compliance check of their prosecution and conviction statistics. Compliance checks are another tool we use to review statistics against the Code. They’re shorter and less detailed than assessments, which allows us to review a wide range of statistics every year.       


Friday is my non-working day. OSR and the Civil Service offer flexible working arrangements, which is great for maintaining a work-life balance. 

On Fridays, our Director General, Ed Humpherson, sends a weekly update to the whole team reflecting on the activities and events of the week. It’s great to hear about Ed’s external meetings and the impact our work is having. It’s a good reminder of the difference that regulators can make!

Would you like to work with us?

We currently have vacancies for three Statistics Regulators to join our team (apply before 11:55pm on Tuesday 24 January 2023).

We regularly run webinar support sessions for anyone who would like to know more about applying for a job with us. You can find details on our Eventbrite page. Our next session is on 17 January 2023.

What 2023 means for OSR

In our latest blog, Director General Ed Humpherson reflects upon the past year and sets out OSR’s priorities for the coming year. We are inviting feedback on these in order to develop our final 2023/24 Business Plan. If you have views on what the key issues in statistics are, please email 

As the days shorten and the end of the year looms, in OSR we start turning our attention to the next financial year, starting in April 2023: to what we will focus on, what we want to achieve, what our ambitions are.  

This is a reflective and deliberative process: we don’t finalise our business plan for the 2023-24 year until April itself. And it’s also a consultative process: we want to hear from stakeholders about where you think we should focus our attention. 

How we develop our priorities

There are a range of inputs into our thinking. We start from our five-year strategic plan, which sets out broadly where we want the statistical system to be by 2025, and how we will help the system get there. We form a judgement as to how close the system is to achieving this vision, and what we should focus on to help support it.

We also draw on our annual state of the statistical system report (most recently published in July): what it tells us about the positive things in statistics we want to nurture and the challenges we want to address. And we take stock of what we’ve achieved over the last 12 months: whether, in effect, we’ve successfully delivered last year’s priorities.

But there are two further aspects to our business plan that are crucial. First, we can’t do everything. We are a small organisation, and necessarily we have to make prioritisation choices about where we can most effectively support the public good of statistics. And second, we can’t plan for everything. We have to respond to those areas that are salient in public debate, where statistics are helping or hindering public understanding. And we can’t always predict very well what those issues might be. To take an obvious example, our planning for 2020-21, taking place around this time three years ago, did not anticipate that we’d spend most of 2020-21 looking at statistics and data related to the pandemic. 

To help us make these choices, and to be better at anticipating what might be just over the horizon, we would like the input, advice and creativity of our stakeholders: of people who care about statistics and data, and who want to help us be as effective as we can be. 

Our 23/24 priorities

Today I am pleased to share with you our draft priorities for 2023/24. These are deliberately high level. We have not chosen our menu, selected our ingredients, and already got the cake ready for the oven. We want to leave lots of room for people outside OSR to make suggestions and raise questions. 

Our plan for next year has four high level priorities, all focused on different aspects of supporting change and transformation: 

  • Support and challenge producers to innovate, collaborate and build resilience 
  • Champion the effective communication of statistics to support society’s key information needs 
  • Continue to widen our reach beyond official statistics and official statistics producers 
  • Increase our capability as a regulator 

The keen observers among you might note that these are an evolution of last year’s priorities, rather than a wholesale change. We have had a good year, for sure, but we will always strive to do more.

Please get in contact with us at to let us know your thoughts and questions about these priorities, or if you would like a wider discussion with us.

How to communicate uncertainty in statistics

Over the past year we have been hosting online seminars for analysts in government, covering a range of topics including how to communicate uncertainty in statistics. Following the publication of our insight report, Approaches to presenting uncertainty in the statistical system, Mark Pont explores the themes discussed in the report and answers some of the questions from the event.

To stay informed of future events and our work in this area, sign up for the OSR monthly newsletter.


Why should you communicate uncertainty in statistics?

Uncertainty in statistics is important to acknowledge, but it’s not always easy to communicate, and not always presented well (or at all!). There is some uncertainty in most statistics, and acknowledging it shouldn’t be viewed as negative. Openness helps build trust (as recent work by the ONS Economic Statistics Centre of Excellence found), and being clear about uncertainty helps users. If there were one overall improvement I’d suggest, it would simply be for those statistical outputs that don’t mention uncertainty at all to do so. This applies to the range of different outputs produced by the analytical community, including analysis provided internally within organisations, for example as part of policy development.

How to communicate uncertainty effectively

There are different ways of communicating uncertainty and the approach you take depends on the context – of the data, of the statistics, and most importantly an understanding of how the statistics might be used, and by whom.

We have found mixed practice in presenting uncertainty in statistics across government. We found some good examples, but also a range of cases where uncertainty isn’t mentioned at all, so the statistics appear to be precise facts. This is particularly so in detailed tables, which many users access directly to get their data, bypassing any information about uncertainty presented in statistical bulletins or separate quality documents. An important part of the statistician’s role is to enable all data to be used appropriately, and in particular not beyond the weight they can bear. This means that help for users around data tables needs to be readily available.

Our Approaches to presenting uncertainty in the statistical system publication highlighted some examples of good practice in communicating uncertainty effectively.

What are the issues with releasing detailed breakdowns of data?

One of the questions we most often consider is the tension between wanting to provide as much data as possible to users – particularly the kinds of breakdowns provided in detailed tables – and recognising that such breakdowns may be based on very small samples and therefore may not be very reliable. Even with appropriate contextual information about uncertainty, analysts should consider whether such detailed data are sufficiently good to support the uses to which they may be put.

There is always a balance to be struck. Our view is that using words like “estimate” and “around” is a simple way to help show users that the statistics aren’t perfect. As mentioned earlier, this humility from analysts helps breed trust, and is a positive thing. On the other hand, we would not want the valuable messages in the statistics to be overwhelmed by information about their quality.

Factors for success when communicating uncertainty

Good communication of uncertainty relies on several factors. At the forefront is that statisticians need to know their users: how the statistics might be used by others, the impact on decision making if users over-interpret the precision of any estimates, and how well equipped users are to understand uncertainty and apply it in their particular context. An effective approach to communicating uncertainty can then be determined.

Preventing deliberate misuse is a particular problem, however well uncertainty is communicated. But prominent advice on what the statistics do and do not say makes misuse more open to challenge. The Code of Practice for Statistics requires statistical producers to challenge inappropriate use. OSR may also be able to intervene, so do please engage with us if you need support.

The media play a vital role in onward communication and interpretation of statistics. There’s clearly a role for statisticians and their comms teams to work together to help the media understand and reflect uncertainty appropriately in their reporting. Comms teams can help bring new insights into the user experience. Ensuring that uncertainty (through the use of words like “estimate”) is communicated through media briefings as well as statistical bulletins, is also important.

Again, clear lines on what the statistics do and do not say are helpful and can be directly lifted into onward reporting. However, we’re still very conscious that small apparent differences in outcomes (for example, whether GDP growth is slightly positive or slightly negative) can lead to qualitatively different headlines. We plan to think more about what we can do in this space.

Some uncertainty can be quantified, for example through sampling errors or modelled estimates of error. But much uncertainty (for example non-sampling biases in surveys, or in estimates resulting from complex statistical methods) can be difficult or impossible to quantify accurately, or at all. In those situations, an approximation of the uncertainty may be sufficient to help users decide how much weight your estimates can bear.

It’s worth remembering that some users may struggle to understand confidence intervals or need additional guidance to help with their interpretation. Describing uncertainty in a narrative form or visually can help to ensure accessibility to a wide range of users. This also gives a great opportunity to bring together information about confidence intervals alongside factors such as your underlying confidence in the evidence base itself.
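As a sketch of how a confidence interval can be paired with the kind of narrative description suggested above (the survey data here are invented, and the usual normal approximation for a 95% interval is assumed, rather than any producer's actual method):

```python
import math
from statistics import mean, stdev

# Hypothetical survey responses: weekly household spend, £
sample = [120, 135, 150, 110, 160, 145, 130, 155, 140, 125]

n = len(sample)
est = mean(sample)
se = stdev(sample) / math.sqrt(n)   # standard error of the mean
half_width = 1.96 * se              # approximate 95% confidence interval

# A narrative sentence is often more accessible than the raw interval
print(f"We estimate average weekly spend at around £{est:.0f}; "
      f"the true figure is likely to be between £{est - half_width:.0f} "
      f"and £{est + half_width:.0f}.")
```

Phrasing the interval as a range the true figure is "likely to be between" conveys the same information as the confidence interval while signalling, in plain words, that the headline number is an estimate rather than a precise fact.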

What other resources are available?

To end, it is probably worth reiterating the common thread that holds this blog together. Being able to communicate uncertainty well requires an understanding of the use and potential use of the statistics, and the potential harms of their use without an appropriate understanding of uncertainty. The question of uncertainty in data applies across the Analysis Function to all who publish information, whether in numerical form or not, or provide such information to colleagues within government.

Related links

Communicating uncertainty in statistics

Approaches to presenting uncertainty in the statistical system




Smart statistics: what can the Code tell us about working efficiently?

Helen Miller-Bakewell, Head of Development and Impact at OSR, explores ways the Code of Practice for Statistics can help producers of statistics and analysis increase their efficiency while facing pressure on resources.

Most of us working in government (or indeed beyond government) will be familiar with the feeling that we’d like more resource, be it time, money, people, or all three! 

In July, we published our view on the current state of the UK’s statistical system. This year’s report highlights the tremendous amount of insightful and influential statistics produced by government analysts, but also explores the challenges producers of statistics are increasingly facing from pressure on resources.  

Addressing pressure on resources is a complex problem, which will require a multifaceted solution. But it raises a very closely related question: how can producers of statistics and analysis work in an efficient way, to ensure they achieve the maximum value from the resources they do have? It is this question we will consider here. 

As ever, when faced with a question about the production of statistics or analysis, we first look to our Code of Practice for Statistics. Three principles within the Code speak immediately to this question: relevance to users; innovation and improvement; and efficiency and proportionality. Below, we outline ways to support efficiency in these areas; however, we are very conscious that, when facing pressure on resources, it can be hard to initiate and implement them. To this end, we hope the case studies and links to available support and guidance can help. Please get in touch if you would like further advice or support. 

Relevance to users (V1):

Users of statistics and data should always be at the centre of statistical production; their needs should be understood, their views sought and acted on, and their use of statistics supported. We encourage producers of statistics to have conversations with a wide range of users to identify where statistics can be discontinued, or reduced in frequency or detail, to save resources if appropriate. This can free up resource, while helping producers to fulfil their commitment to producing statistics of public value that meet user needs. Ofsted has recently done this to great effect. 

While effective user engagement itself takes time and expertise, this investment is key to ensuring resources are well-spent elsewhere. Undertaking public engagement collaboratively wherever possible, including working in partnership with policy makers and other statistics producers, can reduce the resource required. The Analysis Function User engagement strategy for statistics has a strong focus on collaboration and how this will be supported across the statistical system in the future, including through the User Support and Engagement Resource (USER) Hub and theme-based user groups and forums. 

Innovation and improvement (V4):

The UK statistical system should maintain the brilliant responsive and proactive approach we have seen in the last few years and look to do this in a sustainable way. Improvements to data infrastructure, processes, and systems could all help. For example, the use of technology and data science principles, such as those set out in our 2021 Reproducible Analytical Pipeline (RAP) review, supports the more efficient and sustainable delivery of statistics. This review includes several case studies of producers using RAP principles to reduce manual effort and save time, alongside other benefits. The recent Analysis Function Reproducible Analytical Pipelines (RAP) strategy sets out the ambition to embed RAP across government, and the Analysis Function can offer RAP support through its online pages, its Analysis Standards and Pipelines Team, and via the cross-government RAP champion network. 

Efficiency and proportionality (V5):

Statistics and data should be published in forms that enable their reuse, and opportunities for data sharing, data linkage, cross-analysis of sources, and the reuse of data should be acted on. The visualisations and insights generated by individuals, from outside the statistical system, using easily downloadable data from the COVID-19 dashboard nicely demonstrate the benefits of making data available for others to do their own analysis, which can add value without additional resource from producers. Promoting data sharing and linkage, in a secure way, is one of OSR’s priorities and we are currently engaging with key stakeholders involved in data to gather examples of good practice, and to better understand the current barriers to sharing and linking. This will be used to champion successes, support positive change, and provide opportunities for learning to be shared.  

When we reflect on these three principles, three further common themes become apparent as essential to their success: independent decision making and leadership, in particular Chief Statisticians and Heads of Profession for Statistics having the authority to uphold and advocate the standards of the Code; professional capability, once more demonstrating the benefit of investing in training and skills, even when resources are scarce; and collaboration.  

All the principles listed above are supported by case studies in our online Code. These, along with case studies in our reports, can offer inspiration and practical suggestions to help analysts implement the ideas discussed. We are always delighted to discover new case studies that we can share to inspire others: if you can offer a case study, please do get in touch.  

Pressure on resources poses a significant threat to the ability of government analysts to produce the insight government and the wider population needs to make well-informed decisions. Working in an efficient way will help address one part of this problem: it will help ensure maximum value is achieved with the resources that are available, which will in turn help others across government appreciate the benefit of having analysts at the table.  

If you would like to discuss any of the themes raised here, or offer a case study that could help support smarter working among other producers of analysis, please contact us on 

Power in numbers: A collaborative approach to exploring public perceptions of public good

Dr Mary Cowan, Research Specialist at OSR, and Shayda Kashef, Public Engagement Manager at ADR UK describe the motivations behind the recent report on public perceptions of the public good and discuss the benefits of research collaborations

Many organisations working in or around data are driven by research that can help to answer some of society’s most pressing questions, with the ultimate aim of serving the public good. ‘Public good’ is a phrase commonly used within the context of data and statistics; as people working in this space, we have an understanding of this phrase, but what do the public think this means?

This is an important question as the data we use for research and statistics either comes directly or indirectly from the public. So, this year, we worked in partnership to shed some light on this topic with the aim of developing a resource for others looking for similar answers.

In the Office for Statistics Regulation (OSR), we have a vision that statistics will serve the public good. To inform our assessments of whether statistics are serving the public good, we engage with users of statistics to gauge their opinion on how well their needs are being served. Armed with that information, and the pillars of our Code of Practice for Statistics, we then work with statistics producers to help them realise the full benefits of their statistics.

Similarly, at ADR UK our mission is to harness the potential of administrative data for research in the public interest. Administrative data is the public’s data: therefore, in addition to making sure this data is used ethically and responsibly, we have a duty to engage the public in how and why their data is used at every stage of our work. This is to ensure our work demonstrates trustworthiness and maximises the public benefit of administrative data research.

At ADR UK and OSR, we rarely engage with people who do not use data or statistics. They often don’t have a reason to engage with us or perhaps even know we exist. But we believe the views of the general public, on what the public good use of data for research and statistics means for them, are integral to achieving our missions.

For these reasons, we sought to engage with members of the public who had little or no formal knowledge of data or statistics. We recruited 68 participants from across the UK, in person and online, whose diverse, and sometimes contradictory, perspectives enabled us to interrogate our own understandings and practices in a new way. This is why this project provides so many important insights for us and others who work with data and statistics.

We attended every workshop and heard our participants speak eloquently with passion and interest as they articulated their views on public good, and their ideas for what data and statistics could be achieving for society.

While many people struggle to define ‘the public good’, considering it a vague term, our participants guided each other in thoughtful discussions where they teased out their ideas about what public good relates to. Ideas that emerged from these discussions included:

  • the public should be involved in decision making to maintain neutrality and avoid politicisation
  • data for research and statistics should aim to address inequalities
  • the public good use of data for research and statistics should be clearly communicated and should minimise harm
  • best practice safeguarding frameworks should be universally applied for data sharing.

You can hear directly from our participants in the expansions of these findings in the full report. 

This was the first time both of our organisations collaborated on a large-scale research project. Public engagement can be resource-intensive but, by working together, we have achieved a milestone for both organisations that may have been impossible without each other’s support. Both organisations are heavily invested in serving the public good, and working together allowed us to examine our findings from different perspectives and consider the implications from both a data and a statistics perspective.

Publishing this report may signal the end of the project, but it is also another important step toward understanding how data for research and statistics serves the public good.

Communicating uncertainty in statistics

In our latest blog, Assessment Programme Lead, Mark Pont, discusses the importance of statisticians understanding and communicating uncertainty in their statistics, in light of our recent report exploring approaches to presenting uncertainty in the statistical system.

Uncertainty exists in so many aspects of life and taking it into account is an important part of the decisions we make.

I recently had a hernia repaired at a hospital about an hour from where I live. Ahead of admission the hospital gave me an approximate discharge time. I needed to make plans to get home, which revolved around whether it made sense for my wife to drop me off then spend the afternoon mooching around art galleries, parks and shops. So, I needed to understand how accurate the estimate was, and what factors and assumptions might affect its accuracy. It turned out that it depended on things like where in the order for that day’s surgery I ended up, and how the surgery and my immediate recovery went. All of this was useful intel for our planning.

Later (after the op) I needed a more accurate estimate. My wife was (as planned!) mooching around art galleries, parks and shops, and we needed to try to coordinate her getting back to the hospital close to my discharge so that neither of us was left waiting around too long.

Taking uncertainty into account is also necessary when using official statistics. People who make decisions based on statistics need to factor the uncertainties around the statistics into their decision making. It’s not great to develop policy based on an assumption about the accuracy of the statistics that turns out not to be true. Statistics are rarely, if ever, absolute facts. There will always be some inherent uncertainty – from partial data collection in samples, delays in updating administrative databases, and so on. And different users may want different information about uncertainty depending on the nature of the decisions they’re faced with making and their level of expertise.

Our first Insight project considers the way that uncertainty in official statistics is communicated. We found a mixed bag of practice.

There are many cases where uncertainty is presented in some form in statistical bulletins – in the narrative, charts and infographics. Good examples include using words like “estimate” within the narrative, including error bounds in charts, and clearly listing the ways the statistics can and can’t be used. Projections, too, often include variants, which give a neat way of showing the effect of different assumptions.
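As an illustrative sketch only (the figures and function here are hypothetical, not from any bulletin), this is one common way a producer might compute the error bounds behind such a chart – a 95% confidence interval around a sample-based estimate, using the normal approximation:

```python
import math

def confidence_interval_95(sample_mean, sample_sd, n):
    """Approximate 95% confidence interval for a mean from a simple random sample."""
    standard_error = sample_sd / math.sqrt(n)
    margin = 1.96 * standard_error  # 1.96 is the 97.5th percentile of the standard normal
    return sample_mean - margin, sample_mean + margin

# Hypothetical survey estimate: mean weekly income of £520, SD £180, n = 2,500
low, high = confidence_interval_95(520, 180, 2500)
print(f"We estimate mean weekly income at £520 (95% CI: £{low:.0f} to £{high:.0f}).")
```

Phrasing the result this way – an estimate with an explicit interval – gives users the information they need to judge whether a difference between two figures is meaningful.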

There are occasions though where estimates are presented as though they are absolute facts. Not acknowledging that uncertainty could exist within them could lead users to false conclusions. There are also times where better descriptions are needed to help users take uncertainty and its effects into account appropriately. Phrases like “care needs to be taken” and “caution is needed” are widely used, but they would be more helpful if they explained specifically how to use the statistics appropriately.

We also found that the communication of uncertainty in detailed data tables (particularly where they are user-specified) is less well-developed, not least because describing uncertainty succinctly in these situations isn’t easy.

There is, however, an abundance of guidance and support available for analysts to help them think through uncertainty and how best to present it to users. We at OSR will continue to help improve and socialise that guidance. We will also develop our regulatory work to understand more about what effective communication of uncertainty looks like, and to encourage the spreading of good practice across government data outputs. We especially expect to develop our thinking on how the change in quality and uncertainty over time can be most helpfully communicated.

And for those who have made it this far, the surgery went well, I’m fully recovered and my wife enjoyed her afternoon out.

Related links:

Approaches to presenting uncertainty in the statistical system

Weights and measures: how consultations relate to OSR’s role

In our latest blog, Director General Ed Humpherson responds to concerns raised with OSR regarding the recent consultation by Department for Business, Energy & Industrial Strategy on the Choice on units of measurements: marking and sales.

This blog was amended on 4 October 2022 to provide further clarity about question design

Since the start of the Government’s consultation on weights and measures, a lot of people have raised concerns about it with us at OSR. At the time of writing, we have received over 150 emails and many others have tagged us on Twitter regarding the recent consultation by the Department for Business, Energy & Industrial Strategy on the Choice on units of measurements: marking and sales that closed on 26 August 2022.

Before considering the specific question that has been raised with us, let’s first set out some background to how consultations relate to our role.

Consultations, done well, are a vital tool for government in developing policy. They can provide specific and qualitative insight to complement evidence from other sources like official statistics. They also help Government to understand different perspectives, foresee possible consequences and gather expert advice on implementation.

Our remit focuses on statistics. We set the standards for how Government collects and uses statistics. Consultations are relevant to our work in two senses. First, consultations often draw on statistics, for example to illustrate the scale of an issue a policy intends to address. Second, consultations can create statistical evidence – for example, the number of people responding to a particular question.

Turning to this particular consultation, the aim was to identify how the Government can give more choice to businesses and consumers over the units of measurement they use for trade, while ensuring that measurement information remains accurate.

However, when the consultation was launched, many stakeholders raised concerns surrounding the consultation questions and in particular question 3a. The question was as follows:

If you had a choice, would you want to purchase items (i) in imperial units? or (ii) in imperial units alongside a metric equivalent.

There was no option to select an outcome without imperial units at all. People felt that this was a potentially biased way of collecting information on public views on changes to measurement.

Given the concerns raised with us, we approached BEIS as the Department conducting the consultation. We wanted to understand the reasons for this question. They advised us that UK law currently requires metric units to be used for all trade purposes, with only limited exceptions. The purpose of the consultation was to identify how they can give greater choice to businesses and consumers over the units of measurement they use to buy and sell products. BEIS did also say that respondents had multiple opportunities to give a range of views through the consultation, and that all responses would be carefully considered.

This explanation was helpful. But it still doesn’t make this a good question design, because it doesn’t offer a complete range of responses. For example, including a ‘none of the above’ option would have allowed respondents the opportunity to express a preference for a unit of measurement other than imperial or imperial with metric. In the absence of such an option, the design in effect makes this an overly leading question.

So, what happens next? We would be surprised to see the evidence drawn from this specific question being used in any kind of formal quantitative way. If the results of this poorly designed question were presented in statistical terms (x% of people said…etc), then this would represent the generation of a statistical claim from a consultation. And in our view it would be potentially quite misleading as a representation of levels of support for any proposed policy change.

What is intelligent transparency and how you can help?

In our latest blog, Statistics Regulator Siobhan Tuohy-Smith discusses what we mean by intelligent transparency and how you can be an advocate for intelligent transparency across government and official data, statistics and wider analysis.

So what is intelligent transparency?

Everyone, I think, has a fairly clear idea of what transparency means. It means being open or clear – getting the data or statistics out there. But what do we mean when we talk about intelligent transparency?  

At its heart intelligent transparency is about proactively taking an open, clear and accessible approach to the release and use of data, statistics and wider analysis. As set out in our regulatory guidance on transparency, intelligent transparency is informed by three core principles: equality of access, enhancing understanding and analytical leadership. It’s about more than just getting the data out there. Intelligent transparency is about thinking about transparency from the outset of policy development, getting data and statistics out at the right time to support thinking and decisions on an issue, supporting the wider public need for information and presenting the data and statistics in a way that aids understanding and prevents misinterpretation. For example, the Welsh Government Chief Statistician posted a blog on understanding COVID-19 infection rates in Wales on 11 January 2022. This post went beyond just getting the data out there, by also aiding user understanding of the data to avoid misinterpretation. 

Why is transparency important?

For me, transparency is a key part of what we do at the Office for Statistics Regulation (OSR). It’s a theme that runs throughout the Code of Practice for Statistics, from practice Q2.3 about transparency about the methods used, to V2.1 about ensuring free and equal access to regular and ad hoc official statistics, to principle T3 about orderly release, to name but a few.  

Transparency is also a core part of ensuring data, statistics and wider analysis serve the public good. Only by seeing the numbers, and understanding where they came from, can we really understand what they mean and how best to use them: to inform individual decisions, such as where and when to buy a house, which mortgage to take, or what school to send your child to; or to inform public understanding of COVID-19 infection rates or a new policy around climate change.   

We highlighted the need for intelligent transparency as a key theme in our recent State of the Statistics System report and it continues to recur as a theme in our casework.

What can you do to support intelligent transparency?

In OSR, we continue to champion intelligent transparency and equal access to data, statistics and wider analysis. We: 

  • Are building our evidence base, highlighting good examples and understanding more about barriers to transparency to help support those working across government to implement intelligent transparency. Today we have published some FAQs about intelligent transparency to help support this work. 
  • Engage with analysts, policy-makers and the communications function across government, and interested parties outside of government to advocate intelligent transparency and develop networks committed to intelligent transparency. 

But we recognise that this is not something we can do alone. We need your help! 

So what can you do: 

You can be an advocate for intelligent transparency across government and official data, statistics and wider analysis: 

  • As a user of this data, you can continue to question what you see and ask yourself: does it make sense? Do you know where it comes from? Is it being used appropriately?  
  • If you are based in a department or a public body, you can champion intelligent transparency in your team, your department and your individual work. Build networks to promote our intelligent transparency guidance among colleagues and senior leaders in your organisation. Engage with users to understand what information they need, and use that to build the case for publishing it. Get in touch with your Head of Profession or OSR if you experience issues publishing statistics, data or wider analysis of significant public value or interest. 

You can get in touch with us via if you are interested in working with us on intelligent transparency. You can also keep up to date with our work via our newsletter.  

You can raise concerns with us via – our FAQs about how to raise an issue set out what to expect if you raise a concern with us.    

Looking ahead

We will continue to champion intelligent transparency and, with your help, we can make intelligent transparency the default for all government data, statistics and wider analysis.


Related Links:

Regulatory guidance for the transparent release and use of statistics

Intelligent Transparency FAQs