Conclusions on Data Sources

This interim report is focused on data sources. However, issues cannot always be fully compartmentalised in this way, and the investigation of issues with data sources has identified other related issues that will require more systemic consideration during later stages of the review.

Some of the issues identified in previous OSR assessments, raised by stakeholders and acknowledged by ONS staff, are the same as, or similar to, those identified nearly a decade ago in the Bean review. This indicates the continuing relevance of the Bean recommendations.

Three broad cross-cutting issues have dominated our discussions with external stakeholders and ONS staff and are widely reflected in previous OSR assessments.


ONS must fully acknowledge and address declining data quality

ONS is engaged in ambitious statistical transformation programmes in a challenging context and has made good progress in some areas. This work has included developing and implementing new surveys which give a timelier (Business Impacts and Conditions Survey) and a more granular (Annual Survey of Goods and Services) understanding of the economy. ONS has also made significant progress in moving its business survey collection online.

However, in some areas, ONS has faced significant challenges in consistently ensuring that data feeding into economic statistics are of high quality. Most notably, the long-term reduction in response rates to household surveys, including the Labour Force Survey, has severely impacted data quality. ONS has made progress in 2025 in securing improved response to the Labour Force Survey, but the long-term challenges have been associated with a decline in the quality and reliability of some of its key economic statistics.

Although response rates have been a key focus of this review, we would like more assurance that ONS has sufficient steps in place to regularly review and improve sample design and representativeness, bias, survey methodology, and imputation. ONS appears to be finding survey collection more challenging than some of its international peers (see section of main report on survey response rates, and Annex 4), suggesting an urgent need to modernise its collection approach and working practices. In social surveys, these challenges already pose a significant quality issue, while in business surveys, we find increasing levels of risk which, if left unaddressed, could significantly impact quality. ONS is working on a strategic response to these challenges.


Making progress with administrative data is difficult

There have been some good examples of the use of administrative and big data sources in core economic statistics, including the introduction of VAT data into the National Accounts and rail and rental prices data into inflation statistics.

However, as a whole, progress in using administrative data from other government departments has been slow, reflecting in part practical and cultural challenges across government.

Our Data Sharing and Linkage for the Public Good Follow-Up Report finds that these issues around data sharing are systemic and notes that there continues to be a failure to deliver on data sharing and linkage across government, alongside many persisting barriers to progress. It calls for stronger commitments to prioritise data sharing and linkage.

Stakeholders have highlighted that these broader challenges have been further hindered by the lack of an effective vision and delivery roadmap from ONS. A near-universal view among external stakeholders, and one generally acknowledged by ONS staff, was that progress on the use of administrative data had been much slower than expected at the time of the Bean review. Previous OSR assessments, including those of the ABS and GDP revisions, noted the scope to make better use of administrative data, whether as a direct source of data or to inform sample design.

Slow progress in the use of administrative data is particularly problematic in the context of reducing survey response rates. While noting that many of the barriers may lie with the suppliers of data in other government departments, ONS staff observed that there was common acceptance that the use of administrative data should increase but no systematic assessment of the role that these data could play across output areas and the implications for the future role of surveys.

Some staff thought that in the past, administrative data had often been regarded and presented simplistically as a direct replacement for survey data. While this could sometimes be the case, the reality was often that a more complex solution was required, for example for quality assurance, sampling, methods or survey question replacement, with administrative data playing different roles in different areas. In some areas, surveys would remain the dominant data source. In other areas, there would be a continuing role for surveys as a complementary source of information and, where administrative sources were dominant, as a means of reducing overall response bias (since administrative sources often have their own sources of bias).

It follows that developing an overarching vision and strategy is important but challenging because there are potentially many ways of using administrative data, and options will vary for each output. Assessing the options requires expertise in individual outputs. At the same time, the efficient acquisition and exploitation of administrative data requires a joined-up and strategic approach drawing on the expertise of those with experience of the data sources in question.


Greater strategic clarity of purpose and transparency on prioritisation would help reassure external stakeholders

ONS has taken steps to prioritise and maintain funding for economic statistics and data sources. But it is operating in a challenging financial climate across the public sector. In an environment of funding pressures, ONS can increase the confidence of its stakeholders by providing clearer explanations of the pressures it faces, the priorities it has set and the resource allocation decisions it makes.

After increases in earlier years, staff numbers and budgets for data collection fell. Insufficient investment is a key factor in the data quality issues that have subsequently emerged. It is harder to draw firm conclusions on the funding of economic statistics production overall.

In this context, we consider that there has been a lack of transparency about what ONS regards as its core purpose for economic statistics, and ONS staff and stakeholders have expressed concern about the effectiveness of its decision-making in allocating resources.

In its most recent annual business planning cycles, ONS has sought to prioritise funding for economic statistics, and has reduced some of its outputs to focus on its core priorities for economic statistics. Again, it would help enhance stakeholder confidence if these decisions were more clearly communicated as part of an overall strategy for delivering economic statistics.

In short, a more engaged and transparent process would assist users of all ONS’s statistics in understanding the trade-offs ONS faces and how it has addressed them.

ONS has an internal business planning process and tools to aid prioritisation at an operational level, but the process is not transparent to, or engaging of, stakeholders. So, while ONS’s engagement on individual statistics has improved, and is often very effective, there is little coherent engagement with stakeholders on the overall priorities that ONS should follow in maintaining and developing economic statistics. The lack of systematic engagement may also have contributed to the challenges ONS has faced in managing stakeholder demands on individual surveys, as stakeholders have not been apprised of the implications of overall resource constraints and the opportunity costs of choices made.

After the Bean review, a strategy for economic statistics and analysis was published and updated, with the last published in 2019. ONS publishes an annual Strategic Business Plan. However, this plan is very high level. It includes a commitment to publish high-quality statistics on GDP, prices and employment and refers to the Ambitious, Radical, Inclusive Economic Statistics (ARIES) programme, which ‘will improve our core statistical offering and maintain international standards and comparability, in line with user needs, and exploit new data sources and innovative methods to inform better quality, more timely and relevant statistics’. There is relatively little publicly available information on the detail of this programme or on how it will shape the allocation of resources across outputs. Few, if any, tangible and dated commitments are made.

Recently, ONS has placed the so-called “Big 4” output areas at the centre of its business planning process. Three areas identified within the Big 4 (GDP, prices and the labour market) lie within the scope of this review and the fourth, population statistics, plays an important role in economic statistics, including for weighting sampling frames. ONS has taken initial steps to identify the allocation of resources to each of these output areas in total. This work represents useful progress, and we recommend that ONS build on it by providing more granularity below the top-level output areas, with sufficient information on the costs and benefits of specific continuing and potential outputs to permit effective stakeholder engagement.


Emerging issues with stakeholder engagement and organisational factors

A range of issues related to stakeholder engagement and ONS organisational factors have been raised through our engagement with stakeholders and our review of ONS-published material. These issues will be considered in more depth in the review’s next stages. These issues include the following:

  • It has become clear from engagement with stakeholders that communicating complex economic statistics remains challenging. For example, stakeholders are not always clear on how far monthly GDP statistics are based on turnover proxies rather than being direct measures of GDP, even though this information is generally discoverable from published documents. And transparency in the annual accounts could be improved by publishing, and exploring in more detail, the differences in GDP levels that result from the three methods of calculating GDP, how these differences are resolved through adjustments within the supply and use framework, and how those differences have evolved over time. This greater transparency would build on the good practice in the quarterly accounts, where differences in growth across the three methods are transparently confronted. Greater clarity on other adjustments, for example for quality and balancing purposes, would also be valuable.
  • Some stakeholders also pointed to a lack of adequate, up-to-date and sufficiently detailed material in quality and methodology information. Such information should provide users with sufficient knowledge to be able to reproduce the most recent results were they to have access to input data.
  • ONS needs to engage with stakeholders on its whole portfolio of outputs, rather than focusing solely on specific statistics or surveys, as some needs may be better met by other parts of the portfolio than they are currently. For example, the need for hourly earnings data in combination with demographic information may be better met by the Family Resources Survey (FRS), or perhaps administrative sources, than by the LFS, where questions on income may inhibit response, although the challenges associated with drawing data for productivity statistics from different sources would need to be addressed.

As we have been finalising this report, the UK Statistics Authority announced a review of ONS’s performance and culture to be conducted by Sir Robert Devereux. That review is likely to cover these organisation-wide issues, and we will consider its conclusions as we conduct follow-up work to our report.
