The State of the UK’s Statistical System 2021/22

Published: 18 July 2022
Last updated: 6 September 2022

Quality

Quality means that statistics fit their intended uses, are based on appropriate data and methods, and are not materially misleading. Quality requires skilled professional judgement about collecting, preparing, analysing and publishing statistics and data in ways that meet the needs of people who want to use the statistics.


Key Findings


Innovation in collection and production

During this year there have been innovations in how statistics are collected and produced, with new data sources and data collection methods being used and greater embedding of Reproducible Analytical Pipeline (RAP) principles.

As we highlighted in last year’s report, new high-frequency data collections were set up in response to the pandemic and the need for more up-to-date information, to understand rapid developments as they happened. Over the last year these have evolved and led to other innovations in how data are collected. For example, during the pandemic the Department for Education collected daily attendance data from schools, with schools and other education establishments having to enter the data manually into a form each day. The Department is now trialling extracting attendance data directly from schools’ management information systems each day, to provide more timely insights into attendance; prior to the pandemic, attendance data were collected on a termly basis. The Department provides attendance data reports to participating schools via a dashboard. Although the collection is voluntary, take-up among schools and academy trusts has been high.

Another instance of innovation in data collection has been seen with Census 2021, where a ‘digital first’ approach was taken, encouraging households to fill out the census online. For England and Wales, the Census was designed, built and delivered in-house by ONS teams, reflecting the efficacy and maturity of their technical capabilities. This approach has been considered a success, with over 22 million census responses submitted online from February to June 2021 in England and Wales and a total response rate of 97% of households. Northern Ireland was also successful in its ‘digital first’ programme, with 80.6% of addresses completing online and an overall return rate of 97.2%. Scotland saw almost nine out of ten returns submitted online, with an overall household return rate of just under 90% (89.1%). National Records of Scotland (NRS) had hoped for a higher return rate and drew on independent expert advice. On the basis of this advice and its own judgement, NRS concluded that the combination of the national return rate (89.1%) and 30 of the 32 Local Authorities achieving return rates of over 85% (19 of which were over 90%) meant that it could close the “collect” phase of Scotland’s Census and move on to subsequent phases: the census coverage survey, the use of administrative data and the statistical methods phase. We will be working with NRS over the next year as it progresses this work, including with key partners across the UK statistical system, to deliver high quality census outputs which meet the needs of users across Scotland and the rest of the UK. Our role will be to reach a judgement on compliance with the Code of Practice for Statistics.

An important aspect of innovation is being open to alternative sources of evidence, and to challenge from external users. Our review of population estimates and projections found that improvements were needed in methods, in communication and in embracing challenge. The review specifically noted that ‘…we found that in some smaller cities that had a large student population, the population estimates did appear to be inconsistent with, and potentially higher than local evidence suggests’. Comparisons between Local Authority (LA) Census estimates and earlier population estimates, whether produced from administrative data or as mid-year estimates, indicated that there did indeed appear to be an issue with some population estimates for smaller cities. ONS has published updates on completed, ongoing and future work to address our recommendations and has committed to publishing regular progress updates against our findings.

In last year’s report we recommended that Reproducible Analytical Pipelines (RAP) should be the default across the statistical system. We are encouraged to see progress by some statistics producers in developing this approach. In 2021 the Northern Ireland Statistics and Research Agency set up a ‘Tech Lab’ designed to harness new technologies by bringing together skilled personnel to provide a centralised, dedicated technical resource for the agency. Among the work undertaken by this team is the automation of production processes using RAP principles.

The Analysis Function in ONS has recently published its RAP Strategy which aims to embed RAP as the default approach to analysis in government. It focuses on three goals (tools, capability and culture) and sets out how the vision will be achieved, including practical advice for producers, and how progress will be monitored.
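
To illustrate what RAP principles mean in practice, the sketch below shows a minimal, fully scripted pipeline in Python: the raw extract is read, validated and aggregated by code alone, so the same inputs always reproduce the same published output. This is an illustrative example only; the file names, column names and attendance statistic are assumptions, not any department’s actual pipeline.

```python
# Minimal illustrative RAP sketch: every step from raw data to published
# output is code, so the run is repeatable and auditable end to end.
# File paths and column names are hypothetical.
from pathlib import Path
import pandas as pd

RAW = Path("data/raw/attendance.csv")
OUT = Path("outputs/attendance_summary.csv")

def extract(path: Path) -> pd.DataFrame:
    """Read the raw extract exactly as supplied; no manual edits."""
    return pd.read_csv(path)

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Fail loudly if the data do not meet basic quality rules."""
    assert {"school_id", "date", "pupils_present", "pupils_on_roll"} <= set(df.columns)
    assert (df["pupils_present"] <= df["pupils_on_roll"]).all()
    return df

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Derive the published statistic: daily national attendance rate."""
    daily = df.groupby("date")[["pupils_present", "pupils_on_roll"]].sum()
    daily["attendance_rate"] = daily["pupils_present"] / daily["pupils_on_roll"]
    return daily.reset_index()

def main() -> None:
    OUT.parent.mkdir(parents=True, exist_ok=True)
    transform(validate(extract(RAW))).to_csv(OUT, index=False)

if __name__ == "__main__":
    main()
```

Because the whole process is code under version control, a colleague can rerun it and obtain identical outputs, which is the core of the trustworthiness and quality benefits RAP aims to deliver.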


What would we like to see

  • Producers being proactive in identifying new and innovative data sources that can improve the quality of statistics and better meet user needs.
  • Producers automating more of their statistical productions using Reproducible Analytical Pipelines, in order to enhance the trustworthiness, quality and value of statistics.

Enhancing the quality of statistics

There has been greater use of technology and additional data sources, and statisticians have been working with others, both inside and outside the statistical system, to enhance the quality of statistics.

Statistics producers have been actively looking at, and using, technology to improve the quality of the data underpinning statistics. An example of this is the National Data Quality Improvement Service (NDQIS) set up by the Home Office. To improve data quality for ‘flagged offences’ such as knife crime, the Home Office used computer-aided classification to determine whether a knife was used, reducing the need for manual checking and standardising what counts as a knife crime. This development has reduced the burden on police officers and information analysts and improved the accuracy of the statistics. We have published a blog championing this innovation.
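
As a rough illustration of the kind of automation involved, the sketch below uses simple keyword rules to flag possible knife-enabled offences from free-text crime descriptions. The Home Office’s actual NDQIS approach is not described here; the keywords, field contents and classification rule are assumptions for illustration only.

```python
# Illustrative only: a simple rule-based classifier for flagging possible
# knife-enabled offences from free-text descriptions. Real systems would use
# far more sophisticated, quality-assured classification.
import re

KNIFE_TERMS = re.compile(r"\b(knife|knives|blade|machete|stab\w*)\b", re.IGNORECASE)

def flag_knife_offence(description: str) -> bool:
    """Return True if the free-text description suggests a knife was used."""
    return bool(KNIFE_TERMS.search(description))

records = [
    "Victim stabbed with a kitchen knife during altercation",
    "Theft of bicycle from garden shed",
]
print([flag_knife_offence(r) for r in records])  # [True, False]
```

Applying one consistent rule in code, rather than relying on each analyst’s judgement, is what standardises the flag and removes much of the manual checking burden.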

In last year’s report we also highlighted how producers were using innovative data sets. We have seen this trend continue, with statistics producers working with each other to link new data sets and provide more accurate and robust estimates. A good example of this is the development of ONS’s international migration statistics. ONS has used administrative data from the Department for Work and Pensions’ Registration and Population Interaction Database (RAPID), alongside Home Office border data, to produce migration estimates. Obtaining and linking these datasets has required extensive collaboration, enabling ONS to provide a richer source of information and improve the accuracy of the migration estimates.
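
As a simplified illustration of this kind of data linkage, the sketch below joins two small, made-up administrative extracts on a shared identifier and derives a crude residency indicator. The real RAPID and border datasets, their identifiers and ONS’s actual linkage and estimation methods are far more sophisticated; everything in the example is an assumption for illustration.

```python
# Illustrative sketch of linking two administrative datasets on a shared
# identifier. All values, identifiers and the derived indicator are invented.
import pandas as pd

activity = pd.DataFrame({
    "person_id": [1, 2, 3],
    "months_with_uk_activity": [12, 2, 0],  # e.g. tax or benefit interactions
})
border = pd.DataFrame({
    "person_id": [1, 2, 3],
    "last_recorded_departure": ["2020-03-01", None, "2019-11-15"],
})

# Link the two sources so each person's activity and travel records sit together.
linked = activity.merge(border, on="person_id", how="left")

# A crude derived indicator: sustained UK activity suggests long-term residence
# rather than a short-term visit.
linked["likely_resident"] = linked["months_with_uk_activity"] >= 12
print(linked)
```

The value of linkage comes from combining signals that neither dataset holds on its own, which is why it can make estimates more accurate and robust.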

There has also been a greater willingness to work with others outside of the statistical system. To ensure robust estimates from the 2021 Census in England and Wales, ONS is working with Local Authorities to quality assure the data, drawing on their local intelligence and comparators, such as council tax data, to corroborate figures at small and local area level.


What would we like to see

  • Wider use of technology, such as computer-aided classification, to enhance the quality of statistics.
  • Greater use of other data sources, and linked data, to improve data accuracy and add extra value to outputs.
  • Statisticians working with others who can help ensure the quality of statistics.

Communicating uncertainty to bring insight

To bring effective insight, it is important to communicate the uncertainty around estimates. Improvements are needed in this area.

Communicating uncertainty is an important part of ensuring the appropriate use of statistics: it helps ensure that users do not draw inappropriate inferences from them. There is guidance on communicating quality, uncertainty and change to support producers in making uncertainty and sources of potential bias clear. There are examples of statistics producers communicating uncertainty in a way that is clear and understandable to non-expert users. The Coronavirus Infection Survey produced by ONS is a good example, as it continues to use a variety of mechanisms to communicate the uncertainty around its estimates, including quoting confidence intervals and showing them on charts, providing smoothed modelled estimates, and using language such as ‘the trend is unclear’.
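
As a simple illustration of quoting uncertainty alongside an estimate, the sketch below computes a 95% confidence interval for a survey prevalence estimate and phrases it in plain language. The figures are invented and the calculation is a basic approximation; the Coronavirus Infection Survey’s published estimates are produced using more sophisticated modelling.

```python
# Illustrative sketch: a 95% confidence interval for a simple survey proportion,
# presented in plain language. Sample figures are made up.
import math

positives, sample_size = 180, 12_000
p = positives / sample_size
se = math.sqrt(p * (1 - p) / sample_size)          # standard error of the proportion
lower, upper = p - 1.96 * se, p + 1.96 * se        # normal-approximation interval

print(f"Estimated prevalence: {p:.2%} (95% confidence interval: {lower:.2%} to {upper:.2%})")
# e.g. "Estimated prevalence: 1.50% (95% confidence interval: 1.28% to 1.72%)"
```

Presenting the interval alongside the central estimate, in words a non-expert can follow, is what helps users judge how much weight the figure can bear.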

Not all statistical releases have been so clear about the level of uncertainty in the estimates and the sources of potential bias. We are planning to publish a report on communicating uncertainty later this year and will work with others in this area, such as the Winton Centre and the ONS Data Quality Hub, to provide further support and guidance to producers.


What would we like to see

  • Improved communication of the uncertainty within statistics.
  • Consideration of how uncertainty is communicated, using language that matches the complexity of the statistics presented in order to make products as accessible as possible.