Assessment Report: Statistics from the Annual Business Survey

Published:
6 September 2021
Last updated:
6 September 2021

Executive Summary

Why we carried out this assessment

The Annual Business Survey (ABS) is the largest business survey conducted by ONS in terms of the combined number of respondents and variables it covers. ONS collects data from 62,000 businesses in Great Britain, which is supplemented by data collected by the Northern Ireland Statistics and Research Agency (NISRA) through its Annual Business Inquiry (NIABI) on 11,000 businesses in Northern Ireland. It is the main resource for understanding the detailed structure and performance of businesses across the UK and is a key contributor of business information to the UK National Accounts.

OSR has recently conducted several assessments of ONS’s annual economic outputs: the Annual Purchases Survey, Business Demography Statistics, the Living Costs and Food Survey (LCF) and now the ABS. Across these recent assessments, a common theme has emerged to explain the decline in the quality and relevance of ONS business data: a lack of investment in finance, staff and systems has meant that ONS has not been able to adequately address data quality and coverage issues, nor develop its long-term business data to reflect the UK’s evolving industrial mix and the evolution of the digital landscape.

Our review aimed to identify opportunities, and to build on ONS’s initial good ideas, to improve the quality and public value of ABS data. To inform our review, we spoke to a range of statistics producers and users of ABS data to understand how the data are currently used and to identify opportunities to improve their public value.

What we found

ONS publishes provisional national data from the ABS on its website in November covering the previous calendar year. Revised national and regional results are then published in May of the following year, around 17 months after the end of the reference period. Whilst ABS data are used extensively in the measurement of long-run economic concepts such as productivity, the significant time delay on the publication of the data means they are not used to the same extent to measure the ongoing impacts of structural and cyclical changes to the UK economy. As a result, ABS data are not meeting users’ needs for timely and detailed data on business performance and users are turning to alternative data sources to gain an understanding of UK business activity. A number of government departments are using business data available on the Fame database, provided by Bureau Van Dijk (BVD), in order to better understand the structure of the UK economy.

Basing annual structural business statistics on alternative data sources (administrative, commercial, or business data already held by other government departments), supplemented with survey data collected more effectively, has the potential to improve the quality and granularity of annual business data and reduce the burden on reporting businesses. Such sources could also help to improve the timeliness of the data and make them more useful for analysing business trends and tracking the impact of policy changes. Although moving towards greater use of administrative data sources was part of ONS’s five-year strategy, little progress has been made to date.

The ABS statistical team has recognised the increased demand for more-detailed data and is pulling together ambitious plans, not only to examine the availability of new data sources, but also to examine the possibilities for introducing flexibility into parts of the ABS so that additional questions can address current business policy issues. However, the ABS production system does not currently allow for the addition of new data sources, nor the linking of other survey data within its infrastructure. This has effectively restricted efforts to improve the breadth, depth and timeliness of ABS data.

Throughout the COVID-19 pandemic, ONS was successful in meeting the short-term needs of users through the production of real-time information on business activity. It now needs to use this same agility, innovation, and investment to improve its longer-term measures of economic performance.

ONS’s user engagement focuses on the needs of government users. A similar level of engagement needs to be established with users from the private and third sectors, and academia. We therefore welcome ONS’s conference on business data, planned for later in September 2021.

OSR’s Assessment Report of UK Business Demography Statistics noted several areas of uncertainty around the quality of business data held on the Inter Departmental Business Register (IDBR). These included concerns around the incorrect classification of businesses in relation to both industry and structure. The IDBR is used as the sampling frame for the ABS and, as such, some of the same concerns identified in the Business Demography report also apply to ABS data.

Requirements and next steps

We have identified several ways the ABS needs to be improved to meet users’ needs and to comply with the highest standards of the Code.

Requirement 1

ONS needs to urgently prioritise investment in the development of its structural economic statistics, to ensure that the public good is served by them, and in particular to ensure that:

The granularity and timeliness of the statistics are improved, and

The decline in the quality of official economic and business data highlighted in the Bean review of economic statistics does not continue and is reversed.

Requirement 2

ONS should continue, and develop, its current endeavours to establish what existing data – from administrative, private sector or other sources – could be used as the basis for its annual business statistics. These data have the opportunity to provide much more detail, more quickly and cheaply, with lower respondent burden than running a full survey. ONS should then determine what form of supplementary survey needs to be run to collect those data that are not already available from elsewhere (for example sub-national information), and develop ways to harness existing business data (for example use of accountancy software) or collect data more efficiently (for example through electronic questionnaires).

Requirement 3

ONS must develop its understanding of the potential uses and value of ABS data, by engaging better with users both inside and outside of government, to ensure that as far as possible its annual business statistics are providing public value. ONS needs to understand what users require from ABS data and demonstrate how it is going to use feedback to inform a development plan for its structural economic statistics. ONS should reflect on the Government Statistical Service’s User Engagement Strategy for Statistics to assist in selecting the most appropriate methods for engaging with users.

Requirement 4

ONS should provide clearer indications of the fitness for purpose of the ABS statistics and their strengths and limitations with respect to various potential uses. As part of this, ONS needs to understand and communicate to users the influence of IDBR data quality issues on ABS estimates, to ensure that users are well placed to understand the capability and usability of ABS estimates.

Requirement 5

ONS should work more urgently with the UK Data Service to improve the access of researchers to ABS microdata. Improving access to this was also a Requirement (1b) of OSR’s October 2020 Business Demography Assessment Report.

In addressing these requirements, ONS should seek to apply the same innovative and agile approaches it demonstrated in meeting the demand for short-term business data during the pandemic.

We expect ONS to publish a plan by the end of October 2021, which includes specific actions, deliverables and a timetable that explains how it will address the improvements identified in this report. The UK Statistics Authority will take advice from OSR, based on ONS’s progress against this plan to decide whether continued use of the National Statistics designation is merited.


Assessment Report: Statistics on General Pharmaceutical Services in England

Published:
22 July 2021
Last updated:
22 July 2021

Executive Summary

Judgement on National Statistics Status

NHS Business Services Authority (NHSBSA), an arm’s length body of the Department of Health and Social Care (DHSC), is responsible for providing services to NHS organisations, NHS contractors and to patients and the public. These services include the processing and management of prescription items dispensed by community pharmacies in England. NHSBSA is a new official statistics producer that was included in The Official Statistics Order 2018.

NHSBSA published a suite of information titled General Pharmaceutical Services in England in November 2020 as National Statistics (referred to as GPhS in this report). The responsibility for publication of these statistics was transferred from NHS Digital following a public consultation, as part of wider plans to migrate a number of primary care medicines datasets to NHSBSA. NHS Digital and its predecessor bodies had published these statistics annually since 2005. This change of responsibility prompted a request for a series of assessments by the NHSBSA statistics team and demonstrates its commitment to complying with the Code of Practice for Statistics in producing official statistics. NHSBSA’s statistics on Prescription Costs Analysis in England were assessed in 2020, marking the start of NHSBSA’s transition to an official statistics producer. The GPhS statistics are the second set of NHSBSA’s statistics to be assessed.

These statistics are important because they enable users to answer a number of key questions related to community pharmacies and appliance contractors in England, and the activities that these carry out. Key users of the statistics include ministers and officials in government departments, policy advisors, commissioners, business analysts, and data analysis and business intelligence specialists.

We have identified four requirements for NHSBSA to address in order to ensure the high standards of public value, quality and trustworthiness associated with National Statistics designation are met. These are described in chapters one to three of this report. Once NHSBSA has demonstrated that the improvements covered by these requirements have been made, OSR will recommend to the UK Statistics Authority that National Statistics status for these statistics be confirmed. It would also be helpful for the NHSBSA statistics team to publish an action plan outlining the steps that it will take to address the requirements.

Key findings

Public Value

NHSBSA has proactively sought to engage with users of the GPhS statistics since taking over responsibility for their publication. NHSBSA outlines several activities in its user engagement strategy that it carries out to continue to identify existing and potential users of the statistics. This proactive approach to user engagement means that NHSBSA now has a small collection of known users that it can engage actively with. Users we spoke to found the statistics relevant to their needs.

The key findings are clearly stated at the start of the bulletin, and the main report is simple and clearly laid out. Currently the commentary is limited to descriptions of changes over time, and there are limited explanations to help users in interpreting some of the figures. For the forthcoming publication, NHSBSA is planning to report all activity carried out by contractors linked to England’s response to the COVID-19 pandemic. To provide additional insight for users, we encourage NHSBSA to include more context on the pharmacy landscape in England, and information within the text to explain more technical charts.

Users felt that Table 1 in the supporting summary tables was too long and contained too many different types of information. This table presents a wide range of information relating to pharmacy and appliance contractor activity. To address this, NHSBSA should review the content and presentation of the supporting summary tables and improve them to make them more user friendly and easier to extract data from.

Quality

NHSBSA has improved some of the methods since taking the statistics over from NHS Digital. It has explained these changes and the rationale for them clearly in the background information and methodology note. However, some users told us they do not access this. To provide clarity on the changes for these users, NHSBSA should provide a brief explanation of them in the main report, with links to the background information and methodology note for those users who require more-detailed information.

The statistics are based on data extracted from administrative or management systems, with the majority being drawn from NHSBSA’s own systems. This approach is suitable for these statistics in that it provides maximum coverage while at the same time minimising costs. There are, however, potential sources of error that could impact on the accuracy of the figures. NHSBSA should make clear that there is a degree of uncertainty in the figures, to ensure that users do not wrongly conclude that they are precise.

A separate team in NHSBSA carries out a variety of assurance checks as part of the prescriptions processing procedure. However, errors identified as part of this procedure are not always corrected in the data and the statistical team at NHSBSA has not yet investigated the impact of this on the accuracy of the statistics. We encourage NHSBSA to explore the impact of these checks on the accuracy of the statistics and either make any adjustments necessary or communicate clearly to users the accuracy of the statistics and the sources of any uncertainty.

Trustworthiness

As NHSBSA is a new producer of official statistics, the interim lead official has taken steps to establish good links with the Head of Profession for Statistics at the Department of Health and Social Care (DHSC). He told us that he feels professionally supported in his role and able to approach DHSC for ad hoc advice.

NHSBSA has published a general statistical workplan and currently has several separate development plans relating to separate strands of these statistics. To enhance transparency, it has committed to compiling these individual plans into one over-arching development plan for these statistics, and publishing this plan on its website.

 

Next steps

NHSBSA is aiming to meet the requirements of this report by its next annual publication of the statistics in November 2021. Once NHSBSA reports back to OSR on how it has met the requirements within this report, the UK Statistics Authority will take advice from OSR and decide whether to confirm the National Statistics designation.


Assessment Report: The Living Costs and Food Survey

Published:
7 July 2021
Last updated:
7 July 2021

Summary

Why we carried out this review

A household expenditure survey has been conducted each year in the UK since 1957. Since 2008, this has been in the form of the Living Costs and Food Survey (LCF), which collects information on spending patterns and the cost of living that reflect household budgets. It is the most significant survey on household spending in the UK and provides essential information for key social and economic measures including price indices. LCF data are published annually by ONS in Family Spending in the UK, and are also used in the production of other statistical series such as the Retail Prices Index (RPI).

In recent years, there have been several errors in LCF data, which led to errors in the RPI in both 2019 and 2020 through the use of incorrect expenditure weights. In 2020, for example, had the corrected LCF dataset been used, the published RPI annual growth rate would have been revised upwards by 0.1 percentage points for six months of 2020. In line with its policy not to make revisions, ONS did not revise the time series for the RPI. The errors in LCF data also affected the Office for National Statistics’ (ONS) Family Spending statistics for 2017-18 and 2018-19, as these statistics are the main output of LCF data.
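To illustrate why errors in expenditure weights feed directly into a price index, the sketch below computes the growth of a simple fixed-weight index as a weighted average of category price growths. The categories, growth rates and weights are hypothetical and purely illustrative; this is a minimal textbook-style example, not the RPI methodology or real ONS data.

```python
def weighted_index_growth(price_growths, weights):
    """Growth of a fixed-weight price index: the expenditure-share-weighted
    average of category price growths. Weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "expenditure weights must sum to 1"
    return sum(p * w for p, w in zip(price_growths, weights))

# Hypothetical two-category economy: prices rise 1% and 4% respectively.
growths = [0.01, 0.04]

# The same price growths under incorrect versus corrected expenditure weights
# produce different index growth rates -- the mechanism behind the RPI errors.
incorrect = weighted_index_growth(growths, [0.8, 0.2])
corrected = weighted_index_growth(growths, [0.7, 0.3])
```

With these illustrative numbers the incorrect weights give index growth of 1.6% and the corrected weights 1.9%, a 0.3 percentage point difference driven entirely by the weights.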

In line with the Office for Statistics Regulation (OSR) and ONS strategies, we have carried out an assessment of the LCF against the Code of Practice for Statistics. Our review aims to identify opportunities to improve the quality and public value of LCF data. To inform our review, we spoke to a range of statistics producers and users who make use of LCF data, to understand the impact of LCF on other statistics and data.

 

What we found

The LCF data processing system is not fit for purpose. The system is unstable, often producing inconsistent results between processing runs of data. The statistics team has endured challenges with the system and the resources it has to run the LCF, and it has done remarkably well to keep the LCF afloat despite these challenges.

We found that the LCF is highly valued by users and is seen as unique in bringing together data on spending habits with information on the households who are doing the spending. Users we spoke to do not see an alternative approach to collecting this information in the short term that does not involve a survey in some form. The production of the LCF is supported by a steering group made up of statistical teams in ONS and other government departments who use LCF data, as well as external think tanks. The LCF team would like to enhance its engagement with internal and external users, across academia, the private sector and other government departments, to ensure developments best meet user needs.

The COVID-19 pandemic has seen the landscape of user need change with increased demand for timely and granular data. The LCF has been underperforming in terms of sample size and response rate in recent years and this was highlighted as the main limitation of the LCF by users we spoke to. Without an increase to the sample size, the devolved administrations are unable to make use of many of the categories available in the LCF data.

The quality assurance arrangements that the LCF team has in place are generally appropriate for the data and go some way to reduce the risk of errors associated with the processing systems. However, the sample size of LCF can make it difficult to determine whether changes between periods are genuine or not. We heard from users of LCF data in ONS that widening access to the LCF data for the purposes of quality assurance could support the LCF statistics team in identifying errors and would allow it to focus on the pre-processing stages.

In 2016, a National Statistics Quality Review (NSQR) was carried out for the LCF. The NSQR recommended several alternatives that could be explored to improve the response rate and thus the achieved sample size of the LCF. The team has taken forward most of the recommendations, despite no additional headcount being allocated to progress them. The outstanding recommendations remain relevant to improving the LCF.

The lack of progress in the use of alternative and administrative data sources in the UK has impacted on the quality, accuracy, and international comparability of the LCF data; international comparability is seen as important by users seeking to gauge the impact of Brexit and the pandemic on UK households.

 

Requirements and next steps

We have identified several ways the LCF needs to be improved to meet users’ needs and to comply with the highest standards of the Code. Urgently fulfilling the requirements of this assessment is necessary to ensure that the LCF and its outputs continue to be fit for purpose. In order to retain the National Statistics status for Family Spending in the UK, we require ONS to:

a) Demonstrate a positive direction of travel by making some short-term gains by the end of 2021 as follows:

  1. ONS needs to take remedial action to improve the stability of the existing LCF processing system and to develop a new system which meets the needs of users and the staff running the systems.
  2. ONS should enhance its understanding of the value of the statistics by improving its engagement with users, within and outside ONS, to capture a wide range of views and use these to drive its priorities for development. ONS should reflect on the Government Statistical Service’s User Engagement Strategy for Statistics to help determine the best methods for engaging with users.
  3. ONS should provide a mechanism and relevant access for other teams in ONS who make use of LCF data to be able to contribute to the quality assurance of the data.

b) Publish a plan which includes specific actions, deliverables and a timetable by the end of March 2022, that explains how it will address the following strategic improvements:

  1. ONS needs to develop a solution to address user need for more-granular breakdowns of data, so that the devolved administrations and other key users can use the statistics in the ways that they need to for the public good.
  2. ONS needs to invest time and resource in pursuing initiatives to improve the quality and robustness of LCF data. ONS should be open to creative solutions to improve the response rate, such as continuing to explore the use of different short and long form questionnaires and diaries, alternative sampling strategies and linking with other data sources, rather than focusing only on increasing the existing sample.

We have also highlighted several considerations for ONS to reflect on as part of its LCF and RPI improvement project.

a) ONS should consider extending the scope of its project work to include input from some of its key external users, such as those in the devolved administrations, where additional intelligence could be gathered on the use and issues faced by the government in its use of LCF data.

b) ONS should consider the management of risks throughout the end-to-end production process as part of the LCF project’s medium-term work and ambitions.

c) ONS should determine a longer-term solution for the LCF which draws on a broader base of data, international best practice and wider transformation initiatives.

We expect ONS to report back to us every quarter, starting from the end of September 2021, demonstrating its progress against these requirements. We will review the National Statistics designation of Family Spending in the UK at each of these points.


Assessment Report: Police Funding for England and Wales Statistics

Published:
3 June 2021
Last updated:
28 June 2021

Executive Summary

Judgement on National Statistics Status

Police Funding for England and Wales statistics provide a valuable source of information on an issue that has been at the centre of public debate on policing for a number of years. The statistics, which are produced annually by Home Office, are a response to a user need for clear information about police funding.

We have identified several ways in which Home Office should develop the statistics in order to meet the highest standards of the Code of Practice for Statistics and achieve National Statistics status. These are described in chapters one to three of this report. As we identified improvements during the course of the assessment, Home Office began addressing them. Home Office has committed to further actions, which it expects to be carried out before the next publication of the statistics in July 2021.

Once Home Office demonstrates that these steps have been taken, we will recommend that the UK Statistics Authority designate Police Funding in England and Wales as National Statistics.

 

Key Findings

Public Value

The level of insight and information provided by the police funding statistics is viewed positively by users, and those who regularly make use of the statistics see them as very valuable. One of the main strengths of the Police Funding statistics is that they fill a gap in the available data on police funding; all the users we spoke to feel the statistics achieve this well.

The original OSR casework that led to the creation of these statistics focussed on real-terms increases in police funding over time, so it is a current weakness of the statistics that figures are not adjusted for inflation. One of our key requirements that Home Office is currently working to address is to publish real-terms changes so that users have greater insight into the effects of inflation on police funding.

The length of the time series was also flagged as an area for improvement by users. Currently the time series only goes back to 2015. To better inform public debate, Home Office should provide data at an aggregate level going back to 2010. Users felt this would make the statistics more comparable with others, such as the Local authority revenue, expenditure and finance statistics produced by the Ministry of Housing, Communities and Local Government.

 

Quality

The statistics are based on several financial data sources, which include the council tax uplift and Home Office Police Grant uplift. The statistics are broken down to police force level, and reasons for variation in the distribution of funding at force level are highlighted where appropriate in the statistics. This helps users understand the local and national funding priorities more easily whilst protecting the privacy of any strategic police force investment, such as anti-terrorism funding.

The statistics currently lack associated information about data limitations and quality, and users told us that such information would be helpful. During the assessment process Home Office started developing a user guide to accompany the next publication. Home Office should apply our Quality Assurance of Administrative Data (QAAD) framework to understand the limitations of all stages of the data process and assure itself and users of data quality.

 

Trustworthiness

Home Office is a trusted and respected producer of official statistics. Consequently, many users look to its website for up-to-date statistics on policing and data on what the police do. The team at Home Office is well managed and has recently been restructured; as a result of this restructuring the police funding statistics now have a permanent home within the Policing Analytical unit. The team had already been investigating avenues to develop the statistics before starting the assessment and it has been engaging well throughout the assessment process.

However, users told us that a discussion about the processes that take place before the statistics are produced would increase the transparency of the statistics. Home Office plans to provide details of the key events that take place before the statistics are produced in the new proposed user guide, to enhance their transparency.

 

Next steps

To achieve National Statistics status, Home Office should meet the requirements set out in this report. Home Office is encouraged to:

a. develop an action plan to meet the requirements in Tables 1, 2 and 3

b. meet the requirements for the next annual release of the statistics in July 2021

 

The UK Statistics Authority will take advice from OSR and decide whether to award the National Statistics designation.

 


Assessment Report: UK Productivity statistics

Published:
17 May 2021
Last updated:
17 May 2021

Executive Summary

Judgement on National Statistics Status

This assessment report was produced during the COVID-19 pandemic which has seen dramatic shocks to hours worked and economic output in the UK, both of which are key components of productivity statistics. Assessing the scale and impacts of the shocks on the statistics when the usual sources of data may be impaired presents major challenges to statistics producers. ONS has continued to produce and develop these statistics despite significant resource pressures and restricted data access.

In this assessment, we make our judgements about these statistics using the experiences of users and our own research from both before and during the pandemic. Where relevant, we highlight the impact of COVID-19 on these statistics.

We identified actions for ONS to further enhance the public value and quality of its UK productivity statistics. As ONS’s quarterly labour productivity statistics already carry National Statistics status, fulfilling the requirements of this assessment will ensure that this designation can continue. For the statistics covered here that are currently official statistics, we will consider whether designation as National Statistics is desirable following this assessment.

The scope of this assessment is ONS’s suite of UK Labour and Multi-factor Productivity statistics (published in the Productivity Economic Commentary, quarterly flash estimates, accompanying data tables and microdata available through ONS’s Secure Research Service) and the International Comparisons of Labour Productivity (ICP). ONS’s estimates of Public Service Productivity (PSP) are not in the scope of this assessment. The Office for Statistics Regulation intends to review the PSP statistics as part of its regulatory work programme for 2021-22.

 

Key Findings

Quality

Labour inputs for productivity statistics are primarily measured as hours worked, but also as numbers of workers or numbers of jobs. All these inputs draw on data from the Labour Force Survey (LFS) to some degree. Prior to the COVID-19 pandemic, there were known issues with LFS response rates, which had been steadily declining. The pandemic has had a significant impact on the way that LFS data are collected, although ONS has instigated mitigating measures. Separately from this assessment, OSR looked into issues raised by the Economic Statistics Centre of Excellence on 14 January 2021 about ONS labour market estimates. ONS acknowledges that it is a complicated landscape at the moment. It intends to continue to publish data from the best sources and be open about their relative strengths and limitations so users can make informed judgements on the labour market.

ONS has acknowledged the challenges of measuring the economy during the pandemic. In addition to the problems with measuring labour inputs there have also been difficulties in measuring the economic output of the economy. These difficulties vary according to the restrictions introduced to manage the health effects of the pandemic. ONS has warned that there may be revisions to its statistics as it manages within the restrictions. The quality of the productivity statistics may be impacted by these effects.

ONS conducts checks of data and is aware of the operational circumstances in which its data for the productivity statistics are produced, and uses other data sources where possible to corroborate its findings. ONS is aware of any coverage issues and potential sources of bias in the data collection and supply process. Information about the strengths and limitations of the data in the productivity economic commentary tends to be general, mainly in the interests of producing shorter bulletins, for example noting that the statistics may be subject to ‘increased uncertainty’ and ‘increased likelihood of larger revisions’. This does not provide any judgement on the scale of the increased uncertainty or advice to users about how it might change the reliance they place on the data. ONS should go further by being open about perceived changes to the quality of the statistics as part of the narrative of interest to users.

ONS multi-factor productivity (MFP) statistics are theoretically superior measures to labour productivity, as they take account of the effects of multiple known inputs – in ONS’s case, quality-adjusted labour input and capital services. A decline in multi-factor productivity growth has been found to be an important feature of the productivity puzzle. However, we found a reluctance among some users in the policy community to use MFP measures as they are difficult to explain to users of policy advice. ONS has helped to explain MFP in language that is accessible by publishing a simple guide to MFP. ONS should go further and establish what the barriers are to the adoption of MFP by the policy community and include in its development plans ways of encouraging the use of MFP amongst such users and ways of extending its outreach to that community.
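The growth-accounting idea behind MFP can be sketched in a few lines: MFP growth is the residual of output growth after subtracting the share-weighted growth of labour and capital inputs. The function and the figures below are illustrative only, under the simplifying assumption of two inputs with factor shares summing to one; they are not ONS’s quality-adjusted methodology or real data.

```python
def mfp_growth(output_growth, labour_growth, capital_growth, labour_share):
    """Solow-style growth-accounting residual: the part of (log) output
    growth not explained by share-weighted growth in labour and capital.
    labour_share is labour's share of factor income; capital takes the rest."""
    capital_share = 1.0 - labour_share
    return (output_growth
            - labour_share * labour_growth
            - capital_share * capital_growth)

# Illustrative figures (not ONS data): output up 2%, hours worked up 1%,
# capital services up 2%, labour share of income 0.6.
residual = mfp_growth(0.02, 0.01, 0.02, 0.6)
```

With these numbers the residual is 0.6%: most of the 2% output growth is attributed to input growth, and the remainder is measured MFP growth, which is why a fall in this residual features in the productivity puzzle.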

In 2018, ONS suspended publication of its International Comparisons of Productivity (ICP) because its estimates of UK labour productivity were not internationally comparable; there are known biases in measuring labour inputs in the UK. The last publication of these statistics was in April 2018, and in February 2021 ONS proposed a short-term solution to restart publication of ICP estimates. Following the UK’s exit from the European Union, users are acutely aware of the need to compare the UK’s performance internationally. ONS should accelerate its plans to collaborate with its international partners, working closely with the OECD, and introduce a system that is flexible enough to allow each country to make full use of its own sources, whilst still enabling the production of high-quality estimates that are suitable for international comparisons. Users told us that they very much welcome ONS’s intention to start publishing estimates of ICP again.

There is a risk to the quality of the statistics due to the reliance on informal collaborative relationships with data suppliers inside ONS. As there is a regular inflow of new staff coming into the productivity statistics team, who then move on under the staff rotation policy, there is a risk of a loss of continuity and corporate memory. While we think the risk has been well managed, and there is regular engagement between data compilers and the productivity statistics team as users, ONS should check that it has appropriate levels of assurance for these statistics and that its arrangements for quality assuring them are sufficiently robust.

Public Value

Overall, we found ONS’s statistics and data on productivity to be useful, easy to navigate and relevant. They support understanding of the UK economy and the economic welfare of its citizens. ONS produces a prodigious range of productivity outputs at both macro and micro-economic levels, using techniques such as growth accounting and behavioural economics. It also publishes interesting research articles on productivity.

One of the reasons for our assessment of these statistics is to look at the extent to which they help answer important questions about the productivity puzzle, or puzzles. We found that there is much debate about the nature of the productivity puzzle(s). We received some feedback that ONS appears to adhere to the view that productivity is a supply-side concept and that demand-side factors are indirect and second order. We also spoke to users who see the puzzle as relating to a much longer-term malaise, whereas ONS focuses on the slowdown in productivity following the financial crisis. A recent survey of UK academic economists tackled the question of which rationales behind the productivity slowdown are the most powerful, without drawing absolute agreement across the profession. The two most important causes of the productivity slowdown, amongst the economists surveyed, were low demand (including due to the 2008 Global Financial Crisis, austerity, and Brexit) and labour market factors. Investment in human capital (including education and job retraining) was considered the policy most likely to improve private sector productivity. ONS does not accept that there has been bias towards supply-side explanations in its expression of the growth puzzle. The productivity analysts stand by their view of the productivity puzzle because of weak growth since the 2008 Global Financial Crisis, which is particularly acute in the UK. Overall, while there are different theories around the drivers of poor productivity growth in the UK, we found that ONS has been open about the debate and has published widely on the different rationales for the slowdown in productivity. ONS is only one body contributing to the effort to understand the drivers of the UK’s productivity puzzles.
The debate about the productivity puzzles is now moving on due to Brexit and the COVID-19 pandemic and we expect ONS will continue to discuss in articles and commentary the latest research about how the puzzles present in the UK.

The statistics could be even more useful if ONS’s user engagement was more effective, particularly in clearly linking development priorities to the feedback it receives through engagement. We found that while ONS was good before the pandemic at sharing its work outwardly, there is room for improvement in the way it openly takes on board feedback from different sources. It is good to see that ONS has recently launched a user survey, which closed in April 2021.

In 2018, ONS set out ambitious priorities for developing these statistics but has found making significant progress against these more challenging than anticipated, due to both the onset of the pandemic and staff changes. ONS needs to make demonstrable progress against the priorities which it has signalled to users. It also needs to be more transparent about its progress towards meeting priorities and objectives, by providing an updated development plan at the earliest opportunity. ONS has told us it will provide this update in the summer of 2021. ONS should find a better balance between statistical production and the development of the statistics to meet users’ current and future needs and priorities.

ONS’s recent move to a productivity economic commentary to replace several separate bulletins is a positive step towards telling a clearer and more holistic story of movements in productivity. Flash estimates of labour productivity produced using the latest labour market statistics and the gross value added (GVA) first quarterly estimates are published separately to the productivity bulletin. The combination of the flash estimates as well as the productivity economic commentary allows users to see more-timely estimates and later more-detailed estimates, providing balanced reporting.

We welcome the potential offered by ONS’s relatively new Management and Expectations Survey to examine the impact of COVID-19 on the main components of productivity – inputs, outputs and prices.

Trustworthiness

We have found that the people, systems and processes within ONS that support the production of these statistics and data demonstrate a high degree of trustworthiness. Analysts are well managed, impartial and skilled in what they do.
Next Steps

The deadline for ONS to report back to us is September 2021, when we will review the progress that the team has made in addressing the requirements set out in this report. ONS should, by the end of June 2021, publish an action plan alongside the statistics on its website which sets out its proposals for addressing the assessment requirements.


Assessment of Personal Independence Payment Statistics for Northern Ireland (produced by the Department for Communities, Northern Ireland)

Published:
13 May 2021
Last updated:
14 May 2021

Executive Summary

Judgement on National Statistics Status

These statistics provide relevant and trusted information on Personal Independence Payment (PIP) in Northern Ireland (NI). They enable users to better understand PIP as a relatively new benefit and its role in social security reform.

In requesting this assessment, the statistics team at the Department for Communities (DfC) is demonstrating its commitment to produce PIP statistics that meet the standards required of National Statistics and the Code of Practice for Statistics. We have identified four actions for DfC to address in order to enhance the public value and quality of the NI PIP statistics and to achieve National Statistics status. These are described in chapters one to three of this report.

Once the statistics team demonstrates that these steps have been undertaken, OSR will recommend that the UK Statistics Authority designate the statistics as National Statistics.

Key Findings

Public Value

The PIP statistics are used by a range of users across the advice and charity sectors, as well as within DfC, which is interested in how PIP as a relatively new benefit is performing and its role in social security reform. The key user engagement activity is coordinated by the PIP operations team through a quarterly consultative forum, attended by advice and disability groups. However, we found that users would welcome direct engagement with the statistics team to ensure developments to the statistics are prioritised in line with user need.

The majority of users we spoke to highlighted that the Department for Work and Pensions’ (DWP) PIP statistics contain more breakdowns for Great Britain (GB) than are available for NI, even though the data are sourced from the same system. The statistics team needs to engage more actively with users to communicate its ambition to better align the statistics in NI with what is available for GB. Where data gaps cannot be addressed, this should be communicated clearly to users and reasonable alternatives should be explored.

We found that user interest has begun to shift away from reassessments of individuals who were previously receiving the Disability Living Allowance (DLA) onto PIP, towards understanding award reviews and how these may impact individuals’ entitlements. The statistics team should ensure the statistical bulletin and data tables draw out the relevant insights to help users understand where NI is in its journey of rolling out PIP.

Users we spoke to were positive about the length and presentation of the bulletin. However, we found that accessibility could be improved by better signposting between the bulletin and the supplementary tables as some of the detail can be missed if the underlying data tables are not viewed.

Quality

The statistics team maintains a good working relationship with the PIP operations team and engages with it regularly. The users we spoke to have no concerns with the quality or methodology of the statistics, as the data for the PIP statistics are sourced from the PIP Computer System which represents a 100% population of PIP claimants with a postcode in NI on the reference date.

The statistics team has carried out a self-assessment of its understanding of administrative data against our Quality Assurance of Administrative Data (QAAD) framework. The statistics team should bring out the relevant information from its QAAD self-assessment to expand the methodology and quality notes in the bulletin and tables, to support the appropriate use of the statistics.

The Professional Services Unit (PSU), in which the responsible statisticians for PIP sit, is currently exploring the use of Reproducible Analytical Pipelines (RAP) in the production of its statistics, with a view to improving the quality of the statistics by reducing the risk of human errors associated with its current software packages. Once RAP has been fully implemented, PSU anticipates that this will free up resources, which can be reallocated to address developments requested by users.

The statistics team told us that although it has access to the same data as DWP, there is not currently a process in place for sharing code needed to produce statistics which are comparable with DWP. The statistics team needs to build stronger links with DWP to ensure a common understanding of the quality and priorities of PIP statistics and of the PIP Computer System.

Trustworthiness

The statisticians working on PIP are well established and users we spoke to said the team is always helpful and knowledgeable when responding to their queries. DfC has a dedicated webpage about its statistics protocols and compliance with the Code of Practice for Statistics and users in DfC told us that pre-release access is controlled well and that the statistics team are not influenced by senior authority within the organisation.

The statistics team could enhance its trustworthiness by being open about its statistical development plans and its progress towards achieving these developments, even if the timescales are uncertain. Whilst PSU does not routinely publish a development plan, some users we spoke to felt that the lack of transparency around development priorities can lead to data gaps being perceived as DfC withholding information.

Next Steps

We expect the DfC statistics team to report back to us by September 2021 outlining the steps that it has taken to address the requirements. The UK Statistics Authority will take advice from OSR based on the evidence received and decide whether to award the National Statistics designation.

Assessment Report: Benefit statistics

Published:
18 November 2020
Last updated:
17 November 2020

Assessment Report: Statistics on Prescription Cost Analysis: England

Published:
5 November 2020
Last updated:
4 November 2020

Assessment Report: Estimates of Station Usage (produced by the Office of Rail and Road)

Published:
3 November 2020
Last updated:
26 March 2021

Assessment Report: UK Business Demography Statistics

Published:
28 October 2020
Last updated:
27 October 2020