Assessment Report: UK Business Demography Statistics

In 2020, we fully assessed ONS Business Demography statistics, reviewing their compliance against each of the pillars in the Code of Practice for Statistics.

Key Findings

Our view is that ONS should aim for its business demography statistics to be considered key economic indicators. But they are not regarded as such at the moment, because they are not as good or as useful as they should be.

The ONS’s business register – the Inter-Departmental Business Register (IDBR) – holds a wealth of data on the UK’s business population. Some of these data are used to produce business demography statistics; the remainder, however, is a largely untapped resource.

In response to the COVID-19 pandemic, ONS, in conjunction with Companies House, introduced a weekly indicator of business births and deaths. ONS has also published the first of a quarterly series of experimental business demography statistics, which draw on data from Companies House and the Insolvency Service. These significant innovations present a platform for further development of business demography statistics.

Some required improvements to the statistics rely on significant investment. It is clear that the redirection of funding away from the Statistical Business Register project has hindered ONS’s ambitions to enhance the contribution that the business register makes to economic statistics. Work to develop ONS’s business register should urgently be re-introduced to ensure that users’ needs for business population statistics are met.


In the short term (by the time the next annual statistics are published in November 2020), in order to retain National Statistics status for these statistics, ONS:

  • must have demonstrated progress in understanding the difficulties users experience when accessing and linking IDBR data
  • should publish its plans for publishing more timely business demography statistics, and its plans for developing the recently introduced quarterly experimental statistics
  • should publish a narrative covering what ONS already knows about the range of key data quality issues, building on the supporting quality information provided with the new quarterly experimental statistics
  • should publish its plans to restart and resource work to develop its business register

In the longer term, ONS should publish a plan by the end of January 2021 that includes specific actions, deliverables and a timetable, and explains how it will address the improvements identified in the report, including plans for reviewing the funding of the Statistical Business Register.

Exploring the public value of statistics about post-16 education and skills – UK report

We have been looking in detail at the value of the current data and statistics on post-16 education and skills. As an independent, UK-wide regulator, we are in a unique position to take a broader look at issues of importance to society and to make the case for improved statistics across organisational and government boundaries.

This report, our second in this topic area, explores the public value of post-16 education and skills statistics across the UK, with a focus on Scotland, Wales and Northern Ireland, and updates on changes since the publication of our first, England-only, report in 2019.

Four key sectors make up the majority of post-16 education and skills statistics in the UK: workforce skills; universities and higher education; colleges and further education; and apprenticeships. Each is covered in detail in our report. To our knowledge, this is the first time that the statistics that inform these sectors have been extensively researched at a UK-wide level.

Exploring the statistical landscape in this multi-sector, multi-country way has allowed us not only to identify the current challenges, information gaps and improvements to statistics in each sector, but also to highlight areas of good practice and shared learning opportunities. We have looked in detail at how the current statistics are meeting the needs of users, focusing on the public value that the statistics provide. In doing this we have also been able to explore how accessible the current statistics are and whether they are helping to inform a bigger, sector-wide picture.

Post-16 education and skills affect the lives of millions of individuals in the UK. Good quality, accessible statistics are important to support the fair, efficient and effective provision of education and training. Alongside this report, we will continue to engage with statistics producers to make the case for improved data and statistics in these sectors.

The state of the UK’s statistical system

This review sets out our view on the current state of government statistics. At their best, statistics and data produced by government are insightful, coherent, and timely. They are of high policy relevance and public interest. There are good examples of statistics that effectively support decision-making in many areas of everyday life: this has been especially true during the COVID-19 pandemic, when we’re seeing the kind of statistical system that we’ve always wanted to encourage – responsive, agile and focused on users. However, the statistical system does not consistently perform at this level across all its work.

In this report we address eight key areas where improvements could be made across the system.

  1. Statistical leadership
  2. Voluntary application of the Code beyond official statistics
  3. Quality assurance of administrative data
  4. Communicating uncertainty
  5. Adopting new tools, methods and data sources
  6. Telling fuller stories with data
  7. Providing authoritative insight
  8. User engagement

In each area, we highlight examples of statistical producers doing things well. These examples illustrate the good work already happening which others can learn from and build on. We have organised our reflections under the three headings of Trustworthiness, Quality and Value, the three essential pillars that provide the framework for the Code of Practice for Statistics.

User engagement in the Defra Group

Why we did this review

Understanding how statistics are used and what users and other stakeholders need is critical to ensuring that statistics remain relevant and provide insight. To achieve this, statistics producers must engage with users.

To explore this aspect of statistics production, we carried out a review of user engagement in the Defra Group. By the Defra Group we mean the core department and Executive Agencies, the Forestry Commission and those Defra Arm’s Length Bodies that are designated as producers of official statistics: the Environment Agency, the Joint Nature Conservation Committee, the Marine Management Organisation and Natural England.

This is our first departmental review of user engagement, and the Defra Group made an ideal candidate for such a review. It has a large and broad portfolio of official statistics and National Statistics with varying public profile, public interest and impact, and is therefore likely to require different approaches to engaging with users.

We focused our review on a set of 10 National Statistics and official statistics which reflect the diversity of the Defra Group statistics portfolio (see report Annex B). They cover a range of topics, users and uses, and represent the Defra core department as well as Arm’s Length Bodies.

What we hope to achieve

Through this review we aim to develop a better understanding of the range of approaches to user engagement currently adopted within the Defra Group, and to identify the key features of effective and impactful user engagement. We hope this will support the Defra Group in enhancing its user engagement and provide broader learning for other statistics producers.

Related links:

Correspondence: Ed Humpherson to Ken Roy: User engagement in the Defra Group

Blog: What we have learned from the Defra Group about user engagement

Presenting estimates of R by government and allied bodies across the United Kingdom

During the coronavirus (COVID-19) pandemic there has been increasing focus on, and interest in, the reproduction number – R. R is the average number of secondary infections produced by 1 infected person.
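The significance of this definition can be sketched with a toy calculation. The function below is an illustrative sketch only, not any model used by the modelling groups discussed in this report: its name and figures are invented, and real epidemic models are far more sophisticated. It simply shows why values of R above or below 1 matter, by multiplying each generation of infections by R.

```python
def projected_cases(initial_cases, r, generations):
    """Toy projection: expected infections per generation if each
    infected person causes r secondary infections on average
    (ignoring immunity, interventions and other real-world effects)."""
    cases = [initial_cases]
    for _ in range(generations):
        # Round to one decimal place to avoid implying spurious precision.
        cases.append(round(cases[-1] * r, 1))
    return cases

print(projected_cases(100, 0.9, 3))  # shrinking: [100, 90.0, 81.0, 72.9]
print(projected_cases(100, 1.1, 3))  # growing:  [100, 110.0, 121.0, 133.1]
```

With R below 1 each generation is smaller than the last and the epidemic shrinks; with R above 1 it grows, which is why decision-makers watch whether the estimated range for R sits below or above 1.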

OSR’s observation of recent presentations of R is that, generally, a good job is being done of explaining both the number itself and its implications for the UK and each of the devolved nations. However, there is room for estimates of R to be presented more clearly and explained more meaningfully. Lessons can be learnt from the approach to publication of R by the different nations of the UK.

Decision-makers across the UK have made it clear that decisions about how we come out of lockdown and whether or not any restrictions need to be re-introduced in future are informed by the value of R.

The latest estimates of R have become widely quoted by scientists, government officials and the media.
R for the UK is estimated by a range of independent modelling groups based in universities and Public Health England (PHE). Scientific advisers and academic modellers compare different estimates of R from the models and collectively agree a range within which R is very likely to lie.
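To illustrate the idea of combining several models' estimates into one agreed range, the sketch below is purely hypothetical: the real consensus is reached through structured expert discussion, not a formula, and the model names and figures here are invented. It shows one simple way a set of per-model ranges could be summarised into a single range, quoted to one decimal place.

```python
# Hypothetical per-model estimates of R as (lower, upper) plausible ranges.
model_estimates = {
    "model_a": (0.7, 1.0),
    "model_b": (0.8, 1.1),
    "model_c": (0.6, 0.9),
}

def consensus_range(estimates):
    """Summarise per-model ranges into a single range by taking the
    median lower and upper bounds, rounded to one decimal place to
    avoid implying spurious precision."""
    lowers = sorted(lo for lo, _ in estimates.values())
    uppers = sorted(hi for _, hi in estimates.values())
    mid = len(lowers) // 2  # median index for an odd number of models
    return round(lowers[mid], 1), round(uppers[mid], 1)

print(consensus_range(model_estimates))  # prints (0.7, 1.0)
```

The point of the sketch is the shape of the output, not the method: whatever process the advisers use, the published product is a single range, such as "R is between 0.7 and 1.0", rather than a spuriously precise point estimate.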

Devolved nations tend to use either those same independent models or one preferred model and apply data about the pandemic in their own countries to arrive at their consensus estimates of R. All devolved nations are publishing, or intend to publish, estimates for the range of R in their countries on a regular basis (weekly, for most). We commend the cooperation taking place between the four nations to bring about a consistent approach to R and where it should be published.

We’ve been impressed that explanations have succeeded in conveying the importance of the R-number and the role the estimates play in advice to ministers. We particularly commend:

The accessibility of the statistics

  • Estimates of R sit within a crowded, and sometimes confusing, landscape of other data and we found that broadly the needs of different types of users and potential users have been taken into account in the presentation and release of the statistics and data.

The presentation of uncertainty

  • For example, presenting R as being within a range clearly demonstrates the uncertainty in the estimate. We particularly liked the presentation of uncertainty in the Welsh Government’s Technical Advisory Cell Monitoring document, which uses a fan chart to show the uncertainty. The use of estimates to one decimal place is also commended, as it conveys the uncertainty of the estimates.

The narratives about the estimates of R

  • These are particularly helpful when they are simply worded, adopt visually engaging summaries with charts and infographics about the R-number, and are presented alongside data. An example of helpful referencing to source data is the Scottish Government’s presentation Coronavirus: Modelling the epidemic in Scotland: Issue 2.
  • We see the value of these narratives as helping to make sense of the decisions about school closures, social distancing and other measures aimed at reducing the spread of the virus.

We expect that as more data becomes available and more knowledge is gained about the pandemic, there will naturally be improvements in the presentation of R. Our observations suggest that producers can improve the value and quality of their statistics about R by:

Adopting even clearer language and terminology to describe estimates of R

  • For example, describing the estimates of R as ‘a consensus value’ alongside a range is confusing without explaining what is meant by a ‘consensus value’. Producers need to be clear in their messaging about whether small changes in the range for the value of R are statistically different from the previous week’s consensus.

Linking to clear and easily accessible supporting materials

  • Cited research should demonstrably support the evidence and ideas being put forward.

Clearly explaining the sensitivity of the models to key assumptions

  • Users of these statistics who are more analytical or who want more information about the data before they are confident in the analysis, may wish to understand the sensitivity of the estimates of R to key assumptions in the models.

We advise people, when speaking publicly or writing about R, to adopt due accuracy and provide sufficient context to avoid misleading people. Key learning from the presentation of R for the UK and for the devolved nations to date includes:

  • Be careful to help people see R in the context of other data, for example alongside data on the number of people infected and other relevant factors such as declining or increasing infection rates.
  • Clearly communicate the extent and nature of any uncertainty in the estimates. For example, clearly state the uncertain nature of the estimates and avoid talking about estimates as if they are fact. Greater caution is also needed when infection rates become very low.
  • Be clear that estimates of R come from modelled assumptions, which is why different models can yield different estimates. Good practice is, where possible, to take account of the results from various models when discussing the range of possible values of R.
  • Be aware that some groups access information on coronavirus through the spoken narrative alone and are unable to see slides or graphical information. This places a responsibility on commentators to be clear and accurate in what they say.

Strengthening the quality of HMRC’s official statistics

Introduction to the review

In September 2019, HMRC invited the Office for Statistics Regulation to carry out a review of the principles and processes underpinning the quality of HMRC’s official statistics. This review was proactively initiated after HMRC identified a significant error in published Corporation Tax receipt statistics, which affected the period from April 2011 to July 2019.

Aim and scope of the review

The aim of our review was to provide an independent assessment of the approach that HMRC takes to manage quality and risk in the production of its official statistics and to identify potential improvements. We appreciate that producers of statistics will never eliminate errors entirely: the recommendations we present in this report focus on improvements that HMRC should make to help minimise the risk of issues with its statistics in the future.

Related links:

Ed Humpherson to Ruth Stanier: Strengthening the quality of HMRC’s official statistics

Ed Humpherson to Jim Harra: Strengthening the quality of HMRC’s official statistics

Ed Humpherson to Sean Whellams: Review of HMRC statistical quality management

Jim Harra to Ed Humpherson

Ruth Stanier to Ed Humpherson 

COVID-19 surveillance and registered deaths data review

Information available on COVID-19 cases and deaths has been developed rapidly in a constantly shifting environment. The work being done by analysts to get this information into the public domain is commendable. There will always be a desire for improvements to the timeliness and completeness of data, but this should not undermine the huge efforts being made by individuals and organisations to deliver timely data to support decision making and inform the public.

Our vision is statistics that serve the public good. We aim to support producers of statistics and data to achieve this while championing the needs of the public. We have undertaken a short review of the data releases on COVID-19 cases and deaths – at a UK level and for each country within the UK – to aid understanding of the available sources and to highlight strengths and our view on areas for improvement. This document outlines the findings from our review, which is necessarily only a snapshot of very fast-moving developments.

In reviewing the various statistical outputs, we have been guided by the three pillars of the Code of Practice for Statistics: Trustworthiness, Quality and Value. Trustworthiness refers to the governance that surrounds the production of statistics; Quality refers to the characteristics of the data; and Value considers the extent to which the statistics answer users’ questions.

Summary of findings

There have been many developments to the data and supporting information available on COVID-19. Analysts have made huge efforts to deliver the information and have shown a willingness to address concerns and make rapid improvements.

There is great value in having timely data, such as the daily surveillance data covering the UK that is published less than 24 hours after the data reporting period. It provides an important leading indicator of the trend in COVID-19 testing, cases and deaths, which is essential to inform operational decisions being made at pace. However, the speed at which these data are made available means there has been a trade-off with completeness, and the limitations of the UK data have not been fully explained.

The nature and extent of the uncertainty around the UK estimates of deaths associated with COVID-19 has not so far been made clear. However, we are aware of efforts being made to improve the clarity and transparency of the material that accompanies the daily briefing, including drawing on support from the Government Statistical Service (GSS).

In contrast, the weekly death statistics published for England and Wales, Scotland and Northern Ireland provide a more complete measure of the number of deaths associated with COVID-19, but these statistics are released with a greater time lag.

ONS’s publication of its forward workplans in this area is a helpful development for stakeholders and it is important that other nations provide detail about their plans to keep users of the statistics informed. We understand that the GSS is considering the accessibility of all the information on COVID-19 to allow users to navigate all outputs from a central hub, such as the GSS health and care statistics landscape.

Areas for further development

  1. It is important to maintain public confidence in, and the trustworthiness of, statistics that are used to inform public debate. The nature and extent of the uncertainty around the UK estimates of deaths associated with COVID-19 should be clarified.
  2. All statistics producers should show they are actively considering the diverse and changing user needs for COVID-19 statistics, by publishing detailed plans for improvements, for example, information about the occupancy of intensive care units or beds, or on person characteristics, such as ethnicity.
  3. The GSS should consider the accessibility of the information and allow users to navigate all COVID-19 related outputs from a central hub, such as the GSS landscape.

National Statistics Designation Review – Phase 1 Exploratory Review

The Office for Statistics Regulation (OSR) has conducted an exploratory review to see whether the time is right to look at the meaning and value of the National Statistics (NS) designation: does it meet the needs of official statistics in serving the public good in a data-abundant world? And, if required, what further developments should be conducted?

This paper summarises the findings from the exploratory review in which we spoke to a range of stakeholders, to get an initial steer on the value and usefulness of the NS designation. It presents recommendations that OSR and official statistics producers can consider, to improve the information for users about the status of official statistics.

Please see our review page for further information about the National Statistics designation review.

Two-year update: Public Value of Statistics on Housing and Planning in the UK

Our systemic review of The Public Value of Statistics on Housing and Planning in the UK was published in November 2017. This comprehensive review looked across a wide range of the statistics within our Housing, Planning and Local Services regulatory area.

This two-year update report shares the progress made since the review, highlights the challenges that remain and outlines our proposed work plan approach for this regulatory area.


Related Links

Public value of Statistics on Housing and Planning in the UK (November 2017)