Exploring the public value of statistics about post-16 education and skills – UK report

We have been looking in detail at the value of the current data and statistics on post-16 education and skills. As an independent UK-wide regulator, we are in a unique position to take a broader look at issues of importance to society and to make the case for improved statistics across organisational and government boundaries.

This report, our second in this topic area, explores the public value of post-16 education and skills statistics across the UK, with a focus on Scotland, Wales and Northern Ireland, and updates on changes since the publication of our first, England-only, report in 2019.

Four key sectors make up the majority of post-16 education and skills statistics in the UK – workforce skills; universities and higher education; colleges and further education; and apprenticeships – and each is covered in detail in our report. To our knowledge, this is the first time that the statistics that inform these sectors have been extensively researched at a UK-wide level.

Exploring the statistical landscape in this multi-sector, multi-country way has allowed us not only to identify the current challenges, information gaps and improvements to statistics in each sector, but also to highlight areas of good practice and shared learning opportunities. We have looked in detail at how the current statistics are meeting the needs of users, focusing on the public value that the statistics provide. In doing this we have also been able to explore in detail how accessible the current statistics are and whether they are helping to inform a bigger, sector-wide, picture.

Post-16 education and skills affect the lives of millions of individuals in the UK. Good quality and accessible statistics are important to support the fair, efficient and effective provision of education and training. Alongside this report, we will continue to engage with statistics producers to make the case for improved data and statistics in these sectors.

The state of the UK’s statistical system

This review sets out our view on the current state of government statistics. At their best, statistics and data produced by government are insightful, coherent, and timely. They are of high policy-relevance and public interest. There are good examples of statistics that effectively support decision-making in many areas of everyday life: this has been especially true during the COVID-19 pandemic, when we have seen the kind of statistical system that we have always wanted to encourage – responsive, agile and focused on users. However, the statistical system does not consistently perform at this level across all its work.

In this report we address eight key areas where improvements could be made across the system.

  1. Statistical leadership
  2. Voluntary application of the Code beyond official statistics
  3. Quality assurance of administrative data
  4. Communicating uncertainty
  5. Adopting new tools, methods and data sources
  6. Telling fuller stories with data
  7. Providing authoritative insight
  8. User engagement

In each area, we highlight examples of statistical producers doing things well. These examples illustrate the good work already happening which others can learn from and build on. We have organised our reflections under the three headings of Trustworthiness, Quality and Value, the three essential pillars that provide the framework for the Code of Practice for Statistics.

User engagement in the Defra Group

Why we did this review

Understanding how statistics are used and what users and other stakeholders need is critical to ensuring that statistics remain relevant and provide insight. To achieve this, statistics producers must engage with users.

To explore this aspect of statistics production, we carried out a review of user engagement in the Defra Group. This is our first departmental review of user engagement, and the Defra Group was an ideal candidate for such a review. It has a large and broad portfolio of official statistics and National Statistics, with varied public profile, public interest and impact, and is therefore likely to require different approaches to engaging with users.

What we hope to achieve

Through this review we aim to develop a better understanding of the range of approaches to user engagement currently adopted within the Defra Group, and to identify the key features of effective and impactful user engagement. We hope this will support the Defra Group in enhancing its user engagement and provide broader learning for other statistics producers.

By the Defra Group we mean the Core Department and Executive Agencies, the Forestry Commission and those Defra Arm’s Length Bodies that are designated as producers of official statistics: the Environment Agency, the Joint Nature Conservation Committee, the Marine Management Organisation and Natural England.

Related links:

Correspondence: Ed Humpherson to Ken Roy: User engagement in the Defra Group

Blog: What we have learned from the Defra Group about user engagement

Presenting estimates of R by government and allied bodies across the United Kingdom

During the coronavirus (COVID-19) pandemic there has been increasing focus on, and interest in, the reproduction number – R. R is the average number of secondary infections produced by one infected person.
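As a purely illustrative aside, the arithmetic behind this definition can be sketched in a few lines: when R is above 1 each generation of infections is larger than the last, and when it is below 1 the epidemic shrinks. The numbers below are invented for illustration and are not taken from any of the models discussed here.

```python
# Illustrative sketch only: expected new infections per generation for a given R.
# Real estimates of R come from epidemiological models, not from this arithmetic.

def expected_cases_by_generation(r: float, initial_cases: int, generations: int) -> list[float]:
    """Expected case counts for each generation, starting from initial_cases."""
    cases = [float(initial_cases)]
    for _ in range(generations):
        # Each infected person produces r secondary infections on average.
        cases.append(cases[-1] * r)
    return cases

# R above 1: the epidemic grows generation by generation.
growing = expected_cases_by_generation(1.5, 100, 4)    # 100, 150, 225, 337.5, 506.25
# R below 1: the epidemic shrinks.
shrinking = expected_cases_by_generation(0.8, 100, 2)  # 100, 80, 64
```

This is why decision-makers focus on whether R is above or below 1, rather than on its exact value.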

OSR’s observation of recent presentations of R is that, in general, a good job is being done of explaining both the number itself and its implications for the UK and each of the devolved nations. However, there is room for estimates of R to be presented more clearly and explained more meaningfully. Lessons can be learned from the approaches to publishing R taken by the different nations of the UK.

Decision-makers across the UK have made it clear that decisions about how we come out of lockdown and whether or not any restrictions need to be re-introduced in future are informed by the value of R.

The latest estimates of R have become widely quoted by scientists, government officials and the media.

R for the UK is estimated by a range of independent modelling groups based in universities and Public Health England (PHE). Scientific advisers and academic modellers compare the different estimates of R from the models and collectively agree a range within which R is very likely to lie.

Devolved nations tend to use either those same independent models or one preferred model, applying data about the pandemic in their own countries to arrive at their consensus estimates of R. All devolved nations are publishing, or intend to publish, estimates for the range of R in their countries on a regular basis (most weekly). We commend the cooperation taking place between the four nations to bring about a consistent approach to R and to where it should be published.
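The process described above – comparing estimates from several models and agreeing a range – rests on expert judgement rather than a fixed formula, but the basic idea of pooling model intervals into one published range can be sketched as follows. The model names and values here are hypothetical.

```python
# Hypothetical sketch of pooling R intervals from several models into one range.
# In practice the consensus range is agreed by scientific advisers, not computed
# mechanically; the model names and values below are invented for illustration.

model_estimates = {
    "model_a": (0.7, 0.9),
    "model_b": (0.8, 1.0),
    "model_c": (0.7, 1.1),
}

def pooled_range(estimates: dict[str, tuple[float, float]]) -> tuple[float, float]:
    """A single range covering all the model intervals (the widest envelope)."""
    lows = [low for low, _ in estimates.values()]
    highs = [high for _, high in estimates.values()]
    return (min(lows), max(highs))

low, high = pooled_range(model_estimates)
print(f"R is estimated to be between {low} and {high}")  # between 0.7 and 1.1
```

Publishing a range rather than a single figure is itself a way of communicating the disagreement between models.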

We’ve been impressed that explanations have succeeded in conveying the importance of the R-number and the role the estimates play in advice to ministers. We particularly commend:

The accessibility of the statistics

  • Estimates of R sit within a crowded, and sometimes confusing, landscape of other data and we found that broadly the needs of different types of users and potential users have been taken into account in the presentation and release of the statistics and data.

The presentation of uncertainty

  • For example, presenting R as being within a range clearly demonstrates the uncertainty in the estimate. We particularly liked the presentation of uncertainty in the Welsh Government’s Technical Advisory Cell Monitoring document, which uses a fan chart to show the uncertainty. The use of estimates to one decimal place is also commended, as it too conveys the uncertainty of the estimates.

The narratives about the estimates of R

  • These are particularly helpful when they are simply worded, adopt visually engaging summaries with charts and infographics about the R-number, and are presented alongside data. An example of helpful referencing to source data is the Scottish Government’s presentation Coronavirus: Modelling the epidemic in Scotland: Issue 2.
  • We see the value of these narratives as helping to make sense of the decisions about school closures, social distancing and other measures aimed at reducing the spread of the virus.

We expect that as more data becomes available and more knowledge is gained about the pandemic, there will naturally be improvements in the presentation of R. Our observations suggest that producers can improve the value and quality of their statistics about R by:

Adopting even clearer language and terminology to describe estimates of R

  • For example, describing the estimates of R as ‘a consensus value’ alongside a range is confusing without explaining what is meant by a ‘consensus value’. Producers need to be clear in their messaging about whether potentially small changes in the range for the value of R are statistically different from the previous week’s consensus.
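One simple way to think about whether two weekly ranges for R are clearly different – offered here purely as an illustration, not as the method producers use – is to check whether the intervals overlap: overlapping ranges do not, by themselves, support a claim that R has changed.

```python
# Illustration only: overlapping weekly ranges for R do not, by themselves,
# support a claim that R has changed. The values below are invented.

def ranges_overlap(a: tuple[float, float], b: tuple[float, float]) -> bool:
    """True if the two closed intervals share at least one value."""
    return a[0] <= b[1] and b[0] <= a[1]

last_week = (0.7, 0.9)
this_week = (0.8, 1.0)
print(ranges_overlap(last_week, this_week))  # True: no clear change in R
```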

Linking to clear and easily accessible supporting materials

  • Cited research should demonstrably support the evidence and ideas being put forward.

Clearly explaining the sensitivity of the models to key assumptions

  • Users of these statistics who are more analytical, or who want more information about the data before they are confident in the analysis, may wish to understand the sensitivity of the estimates of R to key assumptions in the models.

When speaking publicly or writing about R, we advise people to adopt due accuracy and provide sufficient context to avoid misleading their audience. Key learning from the presentation of R for the UK and for the devolved nations to date has been:

  • Be careful to help people see R in the context of other data, for example alongside data on the number of people infected and other relevant factors such as declining or increasing infection rates.
  • Clearly communicate the extent and nature of any uncertainty in the estimates. For example, clearly state the uncertain nature of the estimates and avoid talking about them as if they are fact. Even greater caution is needed when infection rates become very low.
  • Be clear that estimates of R come from modelled assumptions, which is why different models can yield different estimates. Good practice is, where possible, to take account of the results from various models when discussing the range of possible values of R.
  • Be aware that some groups access information on coronavirus through hearing the narrative about the latest estimates alone and are unable to see slides or graphical information. This places a responsibility on commentators to be clear and accurate in what they say.

Strengthening the quality of HMRC’s official statistics

Introduction to the review

In September 2019, HMRC invited the Office for Statistics Regulation to carry out a review of the principles and processes underpinning the quality of HMRC’s official statistics. This review was proactively initiated after HMRC identified a significant error in published Corporation Tax receipt statistics, which affected the period from April 2011 to July 2019.

Aim and scope of the review

The aim of our review was to provide an independent assessment of the approach that HMRC takes to manage quality and risk in the production of its official statistics and to identify potential improvements. We appreciate that producers of statistics will never eliminate errors entirely: the recommendations we present in this report focus on improvements that HMRC should make to help minimise the risk of issues with its statistics in the future.

Related links:

Ed Humpherson to Ruth Stanier: Strengthening the quality of HMRC’s official statistics

Ed Humpherson to Jim Harra: Strengthening the quality of HMRC’s official statistics

Ed Humpherson to Sean Whellams: Review of HMRC statistical quality management

Jim Harra to Ed Humpherson

Ruth Stanier to Ed Humpherson

COVID-19 surveillance and registered deaths data review

Information available on COVID-19 cases and deaths has been developed rapidly in a constantly shifting environment. The work being done by analysts to get this information into the public domain is commendable. There will always be a desire for improvements to the timeliness and completeness of data, but this should not undermine the huge efforts being made by individuals and organisations to deliver timely data to support decision making and inform the public.

Our vision is statistics that serve the public good. We aim to support producers of statistics and data to achieve this while championing the needs of the public. We have undertaken a short review of the data releases on COVID-19 cases and deaths – at a UK level and for each country within the UK – to help users understand the available sources and to highlight strengths and our view on areas for improvement. This document outlines the findings from our review, which is necessarily only a snapshot of very fast-moving developments.

In reviewing the various statistical outputs, we have been guided by the three pillars of the Code of Practice for Statistics: Trustworthiness, Quality and Value. Trustworthiness refers to the governance that surrounds the production of statistics; Quality refers to the characteristics of the data; and Value considers the extent to which the statistics answer users’ questions.

Summary of findings

There have been many developments to the data and supporting information available on COVID-19. Analysts have made huge efforts to deliver the information and have shown a willingness to address concerns and make rapid improvements.

There is great value in having timely data, such as the daily surveillance data covering the UK that is published less than 24 hours after the data reporting period. It provides an important leading indicator of the trend in COVID-19 testing, cases and deaths, which is essential to inform operational decisions being made at pace. However, the speed at which these data are made available means there has been a trade-off with completeness, and the limitations of the UK data have not been fully explained.

The nature and extent of the uncertainty around the UK estimates of deaths associated with COVID-19 has not so far been made clear. However, we are aware of efforts being made to improve the clarity and transparency of the material that accompanies the daily briefing, including drawing on support from the Government Statistical Service (GSS).

In contrast, the weekly death statistics published for England and Wales, Scotland and Northern Ireland provide a more complete measure of the number of deaths associated with COVID-19, but these statistics are released with a greater time lag.

ONS’s publication of its forward workplans in this area is a helpful development for stakeholders and it is important that other nations provide detail about their plans to keep users of the statistics informed. We understand that the GSS is considering the accessibility of all the information on COVID-19 to allow users to navigate all outputs from a central hub, such as the GSS health and care statistics landscape.

Areas for further development

  1. It is important to maintain public confidence in the trustworthiness of statistics that are used to inform public debate. The nature and extent of the uncertainty around the UK estimates of deaths associated with COVID-19 should be clarified.
  2. All statistics producers should show they are actively considering the diverse and changing user needs for COVID-19 statistics by publishing detailed plans for improvements – for example, information about the occupancy of intensive care units or beds, or on person characteristics such as ethnicity.
  3. The GSS should consider the accessibility of the information and allow users to navigate all COVID-19 related outputs from a central hub, such as the GSS landscape.

National Statistics Designation Review – Phase 1 Exploratory Review

This review, undertaken by the Office for Statistics Regulation (OSR) and the GSS Good Practice Team, explores how National Statistics designation is understood, particularly outside the official statistics system.

‘National Statistics’ are the most important official statistics that have been demonstrated to meet the very highest standards of trustworthiness, quality and value, set out in the Code of Practice for Statistics. Only they carry the unique logo of the National Statistics tick mark, having been designated by OSR, the regulatory arm of the UK Statistics Authority.

We are seeking to engage a wide range of stakeholders, including through a roundtable event and general public focus group. We will also engage with a range of user and producer groups through the autumn. If you are keen to be involved, please let us know.

For further information or to register your interest, please contact Penny Babb at regulation@statistics.gov.uk.

Two-year update: Public Value of Statistics on Housing and Planning in the UK

Our systemic review, The Public Value of Statistics on Housing and Planning in the UK, was published in November 2017. This comprehensive review looked across a wide range of the statistics within our Housing, Planning and Local Services regulatory area.

This two-year update report shares the progress made since the review, highlights the challenges that remain and outlines our proposed work plan approach for this regulatory area.


Related Links

Public value of Statistics on Housing and Planning in the UK (November 2017)

Systemic Review Outline: Mental Health Statistics


Many people in the UK will experience a mental health problem each year, and most of us know at least one person who has been affected by a mental health condition at some point in their life. Every week there are reports in the media covering many aspects of mental health. Health policies attempting to address these many and varied issues have been introduced in all four countries across the UK in recent years. This review aims to look across the spectrum of mental health statistics and explore whether the statistical system is providing the information required to support individuals and policy makers.

What we plan to do and why

This project is intended to explore the extent to which statistics on mental health are able to answer the key questions that surround this area in society today. We want to understand users’ needs in relation to mental health statistics, and examine whether the statistics that are currently available meet these needs.

We want to see producers in all parts of the UK improve the trustworthiness, quality and value of their statistics on mental health so that they achieve parity with statistics about physical health, NHS performance and outputs. This is detailed in our work programme for 2019/20, which we published recently following engagement with both producers and users of statistics.

How we will do it

As each of the UK’s countries has separate policies on mental health, the review will take a phased approach, looking at the statistics produced in each country in turn. We have begun with England as it is the largest in terms of the volume and complexity of mental health statistics that are published.

The scope of mental health statistics is broad. As part of the initial scoping phase for each country we will determine the specific focus of the review. Our scoping work for England is now complete, and we are initially focusing on statistics on people’s experiences of mental health. This could include the incidence and prevalence of mental health conditions, individuals’ experiences of their mental health conditions, people’s access to services and journeys between different providers of services, and the outcomes of interventions.

Stage 1: system mapping

We have built up a picture of mental health statistics in England to help inform our understanding of what statistics relating to people’s experiences of mental health exist. To help with this we held meetings with some of the key producers of mental health statistics in England.

Stage 2: stakeholder engagement

Following on from stage 1, we are consulting with a range of statistics users who have an interest in mental health statistics in England. This work will be completed by early 2020.

Stage 3: collate findings and explore options for improvements

We will review the evidence gathered for the England part of the review and present it to key stakeholders. For areas where we feel improvements are needed, we will work collectively with relevant parties to identify solutions. We aim to publish our initial findings from this first part of the review by summer 2020.

Taking part

If you use mental health statistics or would like to be able to use them either personally or professionally, then we would like to hear from you. Some examples of questions that you might be interested in are:

  • Are the mental health statistics that you need available?
  • Are the statistics you need easy to find?
  • Are the statisticians responsive to your needs?
  • Do you have the information you need to better plan mental health services at a local level?
  • Is young people’s mental health getting worse?

You may have other interesting and varied feedback, so please get in touch with us at the regulation team. Our initial focus is on speaking to people who want to use statistics about the outcomes of mental health care in England.