What is Levelling Up and how will its success be measured?

‘Levelling Up’ is now a term used daily in the media – but what does it mean and how are we, as the statistics watchdog, going to monitor its impact? Statistics Regulator Ben Bohane discusses…

Not long before I joined OSR as a Regulator, the Conservative Party made ‘Levelling Up’ a key part of their 2019 election manifesto. It focused on challenging and changing the geographical inequality in the UK, through investment and new infrastructure to allow ‘everyone the opportunity to flourish’.

Prior to working at OSR, I taught Economics to young people as a secondary school teacher. I found it really rewarding to teach young people how changes in the economy and government spending might affect their lives. I think back to those young people now – do they understand what Levelling Up is amid the media hype? How will the proposals outlined in the Levelling Up White Paper affect their lives and futures?

If I were to explain it to my students now, I would describe Levelling Up as a plan to eradicate regional disparities in the UK, raise living standards and provide greater opportunities for people in communities and areas that have so far not shared in the success of more prosperous parts of the country.

But with confusion over what the concept really means, how can we measure something whose success means different things to different people? Back in March, OSR’s Director General Ed Humpherson wrote about ‘Why I Love Evaluation’, stating that evaluation “provides evidence of what works; it supports good policy; it builds the skills and reputation of analysts; it helps scrutiny.” Ongoing evaluation of Levelling Up will be key to its success.

In OSR, our focus is on ensuring that the existing statistics that can be used to measure the success of government policy are trustworthy, of sufficient quality and provide public value – but also that statistics are available in the first place. As the government highlights in the White Paper, many of the metrics that will be used to measure the success of Levelling Up are either not yet available or of insufficient quality. Clarity about what is being measured is important if people want to track progress through data.

In OSR we’ve already been responding to public interest in regional disparities and pushing for a statistical system that is more responsive to regional and local demands; examples of this work are set out in Appendix 2 below.

Our Business Plan highlights a growing public expectation that decisions affecting all aspects of our lives will be evidenced by trustworthy and accessible statistics. Over the coming months and years, we will continue to review new statistics and data sources from the Department for Levelling Up, Housing and Communities, the Office for National Statistics and other data providers as they are developed, to ensure that evidence and evaluation are at the forefront of pushing the plans forward.

Our regulatory programme for this year focuses on projects that will improve public understanding of the issues, current and emerging, that people want to be sighted on. As the statistics regulator reviewing the statistics used in Levelling Up, we will be tracking the implementation of the GSS Sub National Data strategy and new tools such as the ONS Sub National Indicators Explorer, ensuring statistics are the best quality they can be and clearly focussed on measuring the outlined Levelling Up missions.

Statistics supported by clear analysis and evaluation will provide the evidence to measure the impacts, successes and failures of Levelling Up – and any future government policies to address regional disparities and improve people’s lives. As the government implements policies to address regional inequalities – and businesses and households respond – we will focus on ensuring that the statistics both accurately measure and live up to this ambitious long-term strategy. It is important to me as a statistics regulator that we do this. After all, the vision of Levelling Up is so important to the futures of those young people I used to teach.


Appendix 1: Background foundation work on surveys that are used to produce economic statistics

ONS Purchases Survey statistics (December 2019)

ONS reintroduced the survey following the National Statistics Quality Review of National Accounts, to provide better information about purchasing patterns by businesses.

Our review found that the quality of outputs from the survey is still being improved, which reflected ONS’s own narrative that it would normally be several years before a new survey was producing statistics that could be used with confidence. It also reflected that the Annual Purchases Survey aims to collect variables that do not naturally fit with many businesses’ operational models. Our report noted the discrepancy between estimates of intermediate consumption derived from the Annual Purchases Survey and the Annual Business Survey. We said that understanding these differences, explaining them well and using them to further improve the statistics is an essential part of demonstrating that the quality of the statistics meets users’ needs.

ONS UK Business Demography statistics (October 2020)

We reviewed ONS Business Demography statistics because we felt they should be considered key economic indicators. They are not regarded as such because they are not as good or as useful as they should be. The ONS’s business register – the Inter-Departmental Business Register (IDBR) – holds a wealth of data on the UK’s business population, some of which are used to produce business demography statistics; the remainder is a largely untapped resource. In response to the COVID pandemic, ONS introduced a weekly indicator of business births and deaths and a quarterly series of experimental business demography statistics. These innovations presented a platform for further development of the statistics. However, some of the required improvements to the statistics rely on significant investment, and we said that work to develop ONS’s business register should urgently be restarted to ensure that users’ needs for business population statistics are met. In our review we made several short-term recommendations for ONS:

  • demonstrate progress in understanding the access difficulties users are experiencing when using and linking IDBR data with other data
  • publish its plans for publishing more timely business demography statistics, and its plans for developing the recently introduced quarterly experimental statistics
  • publish a narrative covering what ONS already knows about the range of key data quality issues, building on the supporting quality information provided with the new quarterly experimental statistics
  • publish its plans to restart and resource work to develop its business register

We also said that, in the longer term, ONS should publish a plan including specific actions, deliverables and a timetable that explains how it will address the improvements identified in the report, including plans for reviewing the funding of the Statistical Business Register.

ONS Annual Business Survey statistics (September 2021)

We reviewed the ONS Annual Business Survey (ABS) statistics and found that the significant delay in publishing ABS data means the data are not always used to measure the ongoing impacts of structural and cyclical changes to the UK economy. As a result, ABS data are not fully meeting users’ needs for timely and detailed data on business performance.

We found that ONS’s focus on transforming its short-term surveys has meant a lack of investment in finance, staff and systems, so ABS data have been unable to keep up with changing demands on their use. The lack of investment has curtailed ONS’s efforts to improve the detail and timeliness of ABS data.

A lack of investment has been a common theme of OSR’s recent assessments of ONS’s structural economic surveys and statistics. We strongly urged ONS to revisit the investment needs of these outputs, to ensure structural economic data are available to assess, for example, the ongoing impact of the economic shocks of Brexit and the pandemic.

Appendix 2: OSR work on regional statistics and Levelling Up

ONS Statistics on Regional Gross Value Added (August 2017)

“Many of the R-GVA users that we spoke to cited poor timeliness as a limitation of these statistics” and “that unless the R-GVA statisticians find new sources that provide the same level of detailed information more quickly than the current sources (which they indicated to us is unlikely in the short term), the timeliness of these statistics is unlikely to change significantly”.

“ONS might do more to bring out the differences between the regions through the proportions of people in the region who are economically inactive, which can affect the GVA per head statistics and the impact of commuting on the statistics” and requested ONS “to work with its national and regional stakeholders to bolster the statistical services such as information, advice and guidance available to provide even greater insight in sub-regions (particularly new city-regions) and in preparing contextual information to aid regional and country media in interpreting the statistics”.

ONS Statistics on Regional Gross Value Added (Phase Two) (June 2018)

We asked ONS to make further improvements, for example, “investigate whether improvements in the quality of deflators by adopting regional price statistics could be achieved technically and cost-effectively taking account of expected use of the statistics and user need”. We also asked ONS to “review the best way of making quality metrics both more useable to a less expert audience and more accessible generally”.
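To see why regional price statistics matter here, consider what a deflator does (a minimal illustrative sketch in Python, with invented figures – ONS’s actual deflation methodology is considerably more involved): a deflator converts nominal (cash) values into real (volume) terms, so applying a single national deflator to every region implicitly assumes that prices change at the same rate everywhere.

    # Illustrative sketch: the effect of a region-specific deflator.
    # All figures are invented for illustration only.
    nominal_gva = 100_000      # a region's nominal GVA, £ million

    national_deflator = 105.0  # national price index (base year = 100)
    regional_deflator = 103.0  # hypothetical region-specific price index

    # Real GVA = nominal GVA / (price index / 100)
    real_gva_national = nominal_gva / (national_deflator / 100)  # ~95,238
    real_gva_regional = nominal_gva / (regional_deflator / 100)  # ~97,087

    # Using the national deflator understates this region's real output,
    # because prices rose more slowly there than in the UK as a whole.
    print(round(real_gva_national), round(real_gva_regional))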

HM Treasury Statistics on Government Spending: Country and Regional Analysis (May 2019)

We asked HM Treasury to:

  • collaborate with producers of other public finance statistics and with analysts in the countries and regions to seek views and update their understanding of users’ needs, to better support the use of these statistics
  • communicate effectively with the widest possible audience to increase awareness of the statistics and data
  • present CRA data in a more engaging way that supports and promotes use by all types of users and those with interests in spending at programme and service levels (sub functional levels)
  • test the strength of user need for CRA on a ‘where-benefits’ basis, examine the feasibility of collecting data on this basis and the trade-off between enhanced functionality and increased burden on data suppliers
  • provide a clear and comprehensive account of allocation methods in each annual CRA publication, including links to published documents about allocation methods in respect of all ongoing major project spending
  • ensure that users are provided with appropriate insights about changes in the data. This should include helping users understand impacts on the CRA data and provide links, when applicable, to other output areas where information on Brexit impacts has already been published
  • establish a development programme for these statistics and periodically review that programme; be open about progress towards meeting priorities and objectives; and arrange for users and other stakeholders to be involved in prioritising statistical plans
  • strengthen its arrangements for reviewing requests to allow pre-release access to new people; review the current list of those with pre-release access for CRA, with a view to minimising the numbers of individuals included and inform the Authority of the justification for each inclusion

ONS Experimental statistics on Regional Household Final Consumption Expenditure (HFCE) (January 2021)

We highlighted the potential of HFCE estimates as a highly important component in fully understanding regional economies. Prior to this there were no regional estimates of the expenditure measure of GDP, except in Scotland, a topic we previously highlighted in our 2020 submission to the Treasury Select Committee’s inquiry into Regional Imbalances.

DLUHC Levelling Up Fund prospectus (March 2021)

The Levelling Up Fund prospectus included a list of local authorities by priority category.

However, initially no description of the methodology used was included to enable users to understand how local authorities were allocated to priority areas. A week later, DLUHC published a methodology document, but it was still not possible to recreate the full dataset used to allocate local authorities to priority areas.

We wrote to DLUHC publicly, highlighting our concerns about the transparency of data related to the Levelling Up Fund, and requested that DLUHC publish the data supporting the allocation of priority areas to enhance public confidence in the decisions being made.

As a result, DLUHC published the Levelling Up Fund: prioritisation of places model, which showed all the steps that were taken when using data to assign Local Authorities in England, Scotland and Wales to categories 1, 2 and 3. The spreadsheet included a “data and input construction” tab which included links to the source data with explanations of the source and why it was chosen.

ONS Foreign Direct Investment (FDI) Statistics and DIT Inward Investment Statistics (April 2021)

As a result of our review, new questions were added to the quarterly and annual FDI surveys to collect more-granular data on sub-national FDI, and ONS is now publishing experimental UK sub-national FDI statistics.

NISRA BESES statistics (December 2021)

As a result of our review, NISRA will be publishing more timely imports data and has developed an interactive dashboard that provides more-granular monthly international trade data on products.

ONS Income Estimates for Small Areas statistics (January 2022)

We suggested that further value could be added by ONS understanding the needs of current non-users who require income estimates at lower levels of geography.

Users want to be able to aggregate estimates for lower-layer super output areas into bespoke geographies, but the estimates are given only for middle-layer super output areas, which are too large for users’ needs.

DLUHC Planning Applications in England statistics and Homes England Housing Statistics (March 2022)

We felt at the time that planning performance and planning reforms were likely to be included, in some part, in new Levelling Up legislation, given its assumed focus on local area development.

Local authority planning application performance had also been identified at the time as a priority departmental outcomes metric in the 2021 Spending Review.

We advised further developments to the statistics. One of these was sub-national commentary, which should be introduced to help explore, for example, trends in planning to support regeneration in the 20 English towns and cities prioritised in the Levelling Up White Paper.

We found the statistics could be further enhanced if Homes England were to publish information about aspects of quality, for example the limitations of data sources, the quality assurance checks carried out by data suppliers, and the team’s assessment of data quality against our quality assurance of administrative data (QAAD) matrix.

We also asked Homes England to consider how any uncertainty in the statistics might be more clearly communicated to users, as the latest data are provisional and subject to revision.

Finally, we suggested further insight and context should be added by enhancing the narrative and analysis provided for users who wish to explore the topic further.

Appendix 3: Treasury Committee evidence

2019 response

Key point

There is a range of official statistics on regional economic performance. They should be considered alongside other forms of data published by Government and others.

What we said

All data, whether classified as official statistics or not, should seek to adhere to high standards of trustworthiness, quality and value (which we describe as voluntary adoption of the Code of Practice’s pillars).

Key point

There are some limitations to the current data sources, both in terms of data gaps and in terms of quality.

What we said

In our written evidence referring to regional economic data, we highlighted “the quality of regional data is affected by the granularity that the data sources can provide, and/or the timeliness of the data provision”. Regional data is more volatile than national estimates and there are significant challenges in forming regional estimates of GDP.

We said “In arriving at aggregate estimates, statisticians often combine both administrative and survey data sources….and then disaggregate to provide regional breakdowns (a top-down approach). Survey data is often limited in its depth: for example, the data used to compile R-GVA can become stretched at lower geographies, becoming increasingly volatile as it is disaggregated further.”

Key point

There is a significant use of modelled data, which apportions national data to regions using formulae, rather than directly observed data, which would be gathered at the local level.

What we said

“During our regulatory work, we received feedback from users of regional and sub-regional economic data expressing concern that they can’t tell whether the data they are using is based on observed economic behaviour or come from modelled estimates. They view data based on observed estimates as more reliable than modelled estimates”.

At our request, the ONS conducted research into how much data measuring economic growth are directly observed at a regional level and collected in a way that can be immediately and wholly assigned to a single region, and how much data are modelled to provide regional estimates.
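To make concrete what ‘apportioning national data to regions using formulae’ can look like, here is a deliberately simplified sketch in Python – the regions, weights and figures are all invented, and real methods are far more sophisticated. A national total is split across regions in proportion to each region’s share of some indicator, such as employment, rather than being measured directly in each region.

    # Simplified sketch of top-down regional apportionment.
    # Regions, weights and figures are invented for illustration only.
    national_output = 2_000_000  # national total, £ million

    # Regional shares of an indicator (e.g. employment) used as weights;
    # in a full example the weights across all regions would sum to 1.0.
    employment_share = {
        "Region A": 0.04,
        "Region B": 0.20,
        "Region C": 0.08,
    }

    # Each region receives a slice of the national total in proportion
    # to its weight - a modelled estimate, not an observed one.
    regional_estimate = {
        region: national_output * share
        for region, share in employment_share.items()
    }

    print(regional_estimate["Region B"])  # 400000.0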

Key point

It may be worth considering a network of regional statistical observatories, akin to the Bank of England’s regional agents, that can help provide ONS and others with better insight into regional economic issues.

What we said

We wanted to highlight the benefits of a presence outside the offices in London, Newport and Titchfield – both to better understand the dynamics of regional economies, and to be closer to users with a regional focus (like mayoral combined authorities).

2020 response

“Developments [in regional statistics] will be enabled by better access to administrative data, where ONS can provide enhanced (ideally flexible) geographies with more use of direct estimation”.

“Regional performance information published by the UK Government can be found in some departmental annual reports and accounts but is not summarised in any compendium”.

We laid out several important conditions for publishing regional economic forecasts that could be adopted to help people make judgements about the UK and regional economy.

One of these conditions was: “It would be important to communicate the uncertainties associated with any regional GVA forecasts. For example, there are deficiencies in historical GVA data. Forecasts will only be as good as the data they rely on”.

The productivity puzzle: Looking at how productivity in the UK is measured

This is a guest blog from Stuart McIntyre, Head of Research at the Fraser of Allander Institute.

Productivity is a term that economists intuitively understand, but it can often be difficult for non-specialists to track and understand productivity changes.

Since the global financial crisis of 2008/09 (GFC), productivity in the UK has underperformed both relative to other developed economies and relative to its own historic growth. This has come to be known as the ‘productivity puzzle’.

Why has UK productivity performance seemingly slumped? And why should people care?

UK productivity performance might seem like it doesn’t matter to the average worker – but this would be to misunderstand what it represents.

Improvement in productivity – producing more with the same or fewer inputs, such as hours worked – is a key element in unlocking improvements in wages for workers and competitiveness for businesses.

As the Nobel Prize winner Paul Krugman once said, “Productivity isn’t everything, but, in the long run, it is almost everything. A country’s ability to improve its standard of living over time depends almost entirely on its ability to raise its output per worker”. Indeed, one of the key reasons why living standards in the UK have stagnated over the last few years has been the weak performance of UK productivity.

Productivity matters. For people trying to understand how the economy is performing, it is a key indicator – not just as a number, but also in comparison with other parts of the country and indeed other countries.

How is productivity measured in the UK?

The Office for National Statistics produces regular updates on UK productivity. These include measures of labour productivity – output per hour worked and output per job – as well as multi-factor productivity, which measures the change in economic output that cannot be explained by changes in inputs like labour and capital.
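As a rough illustration of the arithmetic behind these measures (a minimal sketch in Python with invented figures and an assumed labour share – not ONS methodology), labour productivity is simply output divided by labour input, while multi-factor productivity growth is the residual output growth left over once growth in labour and capital inputs has been accounted for:

    # Illustrative sketch of the two headline productivity measures.
    # Figures and the labour share are invented for illustration only.

    # Labour productivity: output per hour worked.
    gva = 500_000    # gross value added, £ million
    hours = 25_000   # total hours worked, millions
    output_per_hour = gva / hours  # £20 of output per hour worked

    # Multi-factor productivity growth: the residual output growth not
    # explained by growth in labour and capital inputs.
    def mfp_growth(output_g, labour_g, capital_g, labour_share=0.6):
        return output_g - labour_share * labour_g - (1 - labour_share) * capital_g

    # 2% output growth, 1% more hours, 1.5% more capital -> ~0.8% MFP growth
    print(output_per_hour, mfp_growth(0.02, 0.01, 0.015))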

One challenge that remains is that many people take labour productivity measures to be the same as overall productivity, which ignores broader measures. In particular, understanding changes in productivity requires us to look beyond what is happening to labour productivity and consider whether businesses are investing. We also need to consider changes in innovation, regulation, and the business environment.

Since the GFC we’ve seen significant improvements in the quality and range of productivity statistics, and also significant investment by researchers inside and outside of government to understand why the UK’s productivity performance has been so poor.

Even among experts there are competing assessments of the root cause, and it seems likely that a number of factors are important. Explanations range from weak investment in skills and training, as well as in research and development, through to factors causing weak demand in the economy. Another interesting line of inquiry asks whether part of the answer lies in how we measure productivity.

All of which invites significant investment in our economic data – and since the GFC we have seen a lot of this take place.

We now have more and better productivity statistics in the UK. For example, we know much more about how productivity differs between firms, between regions, and between the UK and its international competitors.

How could we improve how productivity is measured?

We’ve seen some important measurement issues identified in the academic literature addressed, and key ideas from academic research – such as the role of management practice in explaining productivity performance – tested through the collection of new survey data from businesses.

Yet there remains lots still to do.

We’ve seen increasing use of so-called administrative data, like VAT returns, to measure what is happening in the economy in a timelier manner. These data developments are vital – and have an important role to play in measuring productivity.

At the same time, we know how important measurement issues can be in tracking productivity in particular parts of the economy – this is increasingly important as the economy undergoes significant change with digitalisation and growth of new technologies. It must remain a key area of focus.

One area of economic statistics seeing increasing investment and focus is sub-UK statistics – as the UK Government embarks upon its ‘Levelling Up’ agenda, understanding what is happening in the regions and nations of the UK will become increasingly important. This is an area where productivity statistics could also use further investment.

In particular, this means providing more timely estimates of regional productivity, and ensuring that analysis and statistics at a UK level are paralleled at a sub-national level. The same goes for measuring factors that are important in explaining movements in productivity, for example what is happening to capital investment.

There is also scope to help non-expert users engage with these statistics. The past few years have seen growing use of infographics and social media – which has helped the latest data reach a broader audience.

More generally, we’ve seen increasing focus on setting out the uncertainty around economic statistics. It is clear from recent evidence that users of economic statistics benefit – in the form of improved understanding of what the statistics show – from having the uncertainty inherent in economic statistics explained to them. In my view, this is an area where the productivity statistics could also be enhanced.

In summary, the post-GFC period has seen renewed interest and investment in understanding UK productivity – in particular, investing in more local statistics, making greater use of administrative data, and communicating and explaining these statistics to non-expert users. We’ve come a long way, but there remains much still to do.

The people behind the Office for Statistics Regulation in 2020

This year I’ve written nine blogs, ranging from an exploration of data gaps to a celebration of the armchair epidemiologists. I was thinking of making it to double figures by setting out my reflections on a tumultuous year and describing my pride in what the Office for Statistics Regulation team has delivered. But, as so often in OSR, the team is way ahead of me. They’ve pulled together their own year-end reflections into a short summary. Their pride in their work, and their commitment to the public good of statistics, really say far more than anything I could write; it’s just a much better summary.

So here it is (merry Christmas).

Ed Humpherson

Donna Livesey – Business Manager

2020 has been a hard year for everyone, with many very personally affected by the pandemic. Moving from a bustling office environment to living and working home alone had the potential to make for a pretty lonely existence, but I’ve been very lucky.

This year has only confirmed what a special group of people I work with in OSR. Everyone has been working very hard, but we have taken time to support each other, to continue to work collaboratively to find creative solutions to new challenges, and to generously share our lives, be it our families or our menagerie of pets, albeit virtually.

I am so proud to work with a team that have such a passion for ensuring the public get the statistics and data they need to make sense of the world around them, while showing empathy for the pressures producers of statistics are under at this time.

We all know that the public will continue to look to us beyond the pandemic, as the independent regulator, to ensure statistics honestly and transparently answer the important questions about the longer term impacts on all aspects of our lives, and our children’s lives. I know we are all ready for that challenge, as we are all ready for the day when we can all get together in person.

 

Caroline Jones – Statistics Regulator, Health and Social Care Lead

2020 is ending under lockdown, with the nation gripped by the COVID-19 pandemic and avidly perusing the daily number of deaths, number of tests, volume of hospitalisations and number of vaccinations. This level of anxiety has pushed more people into contacting OSR to ask for better statistics, and it has been a privilege to work at the vanguard of the improvement to the statistics.

To manage the workload, the Health domain met daily with Mary (Deputy Director for Regulation) and Katy, who manages our casework, so we could coordinate the volume of health-related casework we were receiving. We felt it important to deal sympathetically with statistics producers, who have been under immense pressure this year, while ensuring they changed their outputs to produce the best statistics possible. It’s been rewarding to be part of that improvement and change, but we still have a lot of work to do in 2021 to continue to advocate for better social and community care statistics.

 

Leah Skinner – Digital Communications Officer

As a communications professional who loves words, I very often stop and wonder how I ended up working in an environment with so many numbers. But if 2020 has taught me anything, it’s that the communication of those numbers, in a way that the public can understand, is crucial to make sure that the public have trust in statistics.

This has made me reflect on my own work, and I am more determined than ever to make our work, complex as it can be, as accessible and as understandable to our audiences as possible. For me, the highlight of this year has been watching our audience grow as we have improved our Twitter outputs and launched our own website. I really enjoy seeing people who have never reached out to us before contacting us to work with us, whether it be to do with Voluntary Application of the Code, or to highlight casework.

As truly awful as 2020 has been, it is clear now that the public are far more aware of how statistics affect our everyday lives, and this empowers us to ask more questions about the quality and trustworthiness of data and hold organisations to account when the data isn’t good enough.

 

Mark Pont – Assessment Programme Lead

For me, through the challenges of 2020, it’s been great to see the OSR team show itself as a supportive regulator. Of course we’ve made some strong interventions where these have been needed to champion the public good of statistics and data. But much of our influence comes through the support and challenge we offer to statistics producers.

We published some of our findings in the form of rapid regulatory review letters. However, much of our support and challenge was behind the scenes, which is just as valuable.

During the early days of the pandemic we had countless chats with teams across the statistical system as they wrestled with how to generate the important insights that many of us needed. All this in the absence of the usual long-standing data sources, and while protecting often restricted and vulnerable workforces who were adapting to new ways of working. It was fantastic to walk through those exciting developments with statistics producers, seeing first-hand the rapid exploitation of new data sources.

2021 will still be challenging for many of us. Hopefully many aspects of life will start to return to something closer to what we were used to. But I think the statistical system, including us as regulators, will start 2021 from a much higher base than 2020 and I look forward to seeing many more exciting developments in the world of official statistics.

 

Emily Carless – Statistics Regulator, Children, Education and Skills Lead

2020 has been a challenging year for producers and users of children, education and skills statistics, and one that has had a life-changing impact on the people the statistics are about. We started the year polishing the report of our review of post-16 education and skills statistics, and we are finishing it polishing the report of our review of the approach to developing the statistical models designed for awarding grades. These statistical models had a profound impact on young people’s lives and on public confidence in statistics and statistical models.

As in other domains, statistics have needed to be developed quickly to meet the need for data on the impact of the pandemic on children and the education system, and to inform decisions such as those around re-opening schools. The demand for statistics in this area continues to grow to ensure that the impact of the pandemic on this generation can be fully understood.

Thinking about quality when producing statistics

“Quality means doing it right when no one is looking.” – Henry Ford

 

Official statistics inform government, the media and the public about the issues that matter most in society. To feel confident using official statistics, people must trust them: quality has an important part to play in earning this trust.

In April, we published a review of the quality of HMRC’s official statistics. HMRC invited us to carry out this review after identifying a significant error in one of its published National Statistics. The review provided an independent assessment of HMRC’s quality management approach and identified improvements to strengthen the quality of their official statistics.

We made nine recommendations, which HMRC has welcomed. Many of the recommendations will apply to other producers – not just to strengthen the quality of official statistics, but also to improve the quality of all analytical outputs.

This blog tells the story of the review and its findings, from the perspectives of HMRC and OSR. We hope to inspire other producers to think about how they can build on their own approach to quality, to ensure statistics meet the needs of the people who use them.

Jackie Orme, Programme Lead, HMRC

In 2019 HMRC identified an error in published corporation tax receipt statistics, which led to us having to make substantial revisions. This was a serious concern both internally for HMRC and for external users of HMRC statistics. In response we undertook a number of actions, including initiating an internal audit review and inviting OSR to review the principles and processes underpinning production of our official statistics.

The review by OSR was particularly important to us as statisticians and analysts in HMRC, to draw on expert and independent advice in improving our ways of working. While some of the findings could potentially be uncomfortable, the review would support our desire to take a broad and ambitious approach to improvement and the weight of OSR’s views and advice would give credence to the need for change.

The review was carried out efficiently and we were kept well-informed about progress. The OSR review team devoted lots of time to talking to staff and stakeholders to get their input and views, across all grades and professions. This level of involvement has been helpful to us subsequently in securing initial engagement and agreement to changes across the organisation. For example, in getting active support from senior HMRC leaders to implement recommendations, such as creating a new cross-cutting team as part of our analysis function to build on our existing approach to data quality and assurance.

The review has given us the opportunity to reflect on data quality issues and the importance of having robust data to produce high quality statistics and analysis. We have built a substantial programme of work to implement the recommendations and are starting to recruit people to the new team. Some recommendations will be straightforward to implement. For example, we have already started to review our statistics outputs, in order to make sure analytical resource is being used effectively.

In contrast, other recommendations are more challenging to implement, in particular, mapping the journeys of our data within the department. This will take significant combined effort by analysts, data providers and data processors.

As highlighted in the report, HMRC has some older systems for processing and storing its administrative data and the review has been helpful in emphasising how essential it is for analysts to be involved in discussions and decisions around the design of future systems. These sorts of insights from the report have helped us build a case for increased resource and forge stronger links with data providers, to work together to improve the quality of HMRC’s statistics and analysis.

Helen Miller-Bakewell, Project Manager, OSR

We were really pleased when HMRC asked us to do this review: in doing so, it showed a proactive and open approach to strengthening the quality of its official statistics.

It’s the first time we’ve done a piece of work that looks across all of a producer’s official statistics at once – although we have now done something similar with the Defra group (the Department for Environment, Food and Rural Affairs and its agencies and public bodies), with a focus on user engagement. Normally, we look at one set of statistics in detail, or we review how statistics on a topic area come together to meet user needs. This was somewhere in the middle!

To inform the review, we spoke with a wide range of people involved in the production of official statistics in HMRC: analysts working on the statistics directly, managers who oversee them, and a handful of people indirectly involved in the production process, who own and supply data.

The OSR team spent about an hour with each individual or team we interviewed, during which we asked lots of questions about the production process. This helped us to understand how the quality of statistical outputs was managed in HMRC, and the challenges analysts can face.

It turned out to be a useful process for the producer teams as well, and we were asked for our question list a couple of times, to help them think about the quality of their statistics in the future. We’ve now packaged up this question list in a published guidance document, so that all producers can benefit from it.

The findings of the review highlight the issues that big operational departments working with administrative data can face with respect to quality and will ring true for other Government departments. The recommendations stress the importance of analysts fully understanding the nature and quality of data they are working with, and of building effective working relationships with data providers or managers to facilitate this.

In addition, OSR champions a broad approach to quality assurance of data and statistics, and regular reviews of publications to ensure analytical resource is being used effectively. The report emphasises the importance of having analytical leaders that champion and support changes and innovations that can enhance quality, while recognising that analysts do not operate in isolation and that long-term improvements to quality management rely on understanding, values and responsibility being shared across organisations.

We’re pleased the review has been so helpful to HMRC. We would like to thank everyone who gave their time to speak with us during the review. Their cooperation and openness were key to us arriving at findings that resonate with analysts working in HMRC and recommendations that will have a lasting positive impact on the quality of HMRC statistics.

Rising to the challenge

One thing that has stood out in the Coronavirus pandemic is how quickly the Government Statistical System (GSS) has responded to the need for data to support important decisions made by the government during this time. In just a matter of weeks, statisticians have repurposed existing statistics and have made use of new data sources. Before the crisis, this work might have taken months. HMRC’s Coronavirus Job Retention Scheme (CJRS) statistics and its Self-Employment Income Support Scheme (SEISS) statistics are among these. We recently conducted a rapid review of these statistics and Ed Humpherson, Director General for Regulation, has written to HMRC’s Head of Profession for Statistics, Sean Whellams, supporting the approach taken to produce these statistics.

In March this year, we wrote about how our assessment of employment and jobs statistics in effect captured the statistical world in miniature – a microcosm of the statistical system. The assessment surfaced many of the common issues that statistics producers face, highlighting recurring themes from our other regulatory work. Now we see a further glimpse of our statistical world in miniature, through the lens of our recent review of HMRC’s statistics. HMRC’s response to the need for information about these crucial schemes admirably demonstrates government statisticians rising to the key challenge of the times.

There are two aspects of the statistical system which these statistics exemplify.

First, for us as statistical regulators, whether during a national crisis or otherwise, government statistics should (i) answer society’s questions in a timely way; and (ii) provide insights at a level of detail that is useful to users. Additionally, many questions cannot be answered without sharing and linking data. In the preparation, production and publishing of the CJRS and SEISS statistics, HMRC has displayed all of these elements.

Naturally, society’s questions have been about the take-up of the schemes and their costs to the public purse. These interests reflect two essential aspects of the Government’s job protection schemes – speed and simplicity of support. CJRS was launched on 20 April and SEISS on 13 May. Initially, HMRC tweeted daily information on both schemes – for CJRS, the number of unique applicants and the number of furloughed jobs; for SEISS, the total number of claims. HMRC also tweeted the value of the claims received for both schemes. As claims to the schemes started to tail off, HMRC moved to tweeting the data on a weekly basis. Releasing these statistics on a timely basis, and at intervals that met the needs of users, is a good example of the orderly release of statistics that is essential to building trust in new statistics.

After just a few weeks of tweeting this important data, HMRC linked both the CJRS and the SEISS data with other pre-existing HMRC administrative data to provide further insights: into CJRS claims by employer size (and, for SEISS, breakdowns of claims by age and gender), and breakdowns for both schemes by sector of the economy and by geography. These new statistical breakdowns were published in statistical bulletins released less than two months after the launch of CJRS and under a month after the launch of SEISS – quite remarkable achievements.

Second, we found HMRC to be working closely with users of these data to find out what they need to know from the statistics. HMRC is open with users about any aspects of uncertainty in its estimates, labelling the statistics and analysis with a frank description of what the statistics summarise. Consistent and coherent standard geographic definitions are adopted to harmonise with related statistics and data. In explaining the methods used to produce the statistics, HMRC has been proportionate to the complexity of the methods themselves, reflecting the needs of different types of users and uses.

In a further example of a statistical system working well, we as statistical regulators look for statistics producers to go beyond making statistics and data available quickly, and also to present clear, meaningful and authoritative insights that serve the public good. There are many examples of how HMRC has done this, but to select just one: in the SEISS bulletin, HMRC set out the number of self-employed people across the UK who are eligible for support. This information is published not only at the UK level but also in accompanying tables at regional and sub-regional (local authority and parliamentary constituency) levels. As we pointed out in our Monitoring Review of Income and Earnings in 2015, timely data on self-employment income is a key data gap generally. These statistics help us understand just a little more about the income and earnings of the self-employed in different locations around the UK in 2020 and are a step towards addressing this data gap.

Normally when statistics producers publish new or innovative statistics, they have a period in which they can develop and improve them. HMRC will only have a restricted period in which to do this for its CJRS and SEISS statistics. By their nature, these schemes are temporary and we will probably see only a small number of releases. Quite how the statistics may change, for example with the advent of the Coronavirus Job Retention Bonus Scheme, is yet to be established. Society will pose further questions for statistics producers to answer.

The times we live in call for ongoing watchfulness and a continuing need to be agile and responsive, to offer even further insight in the months and years ahead. What we found in this case study is replicated throughout the statistical system. With continuing watchfulness there’s little doubt that there will be further statistics, data and insights to help us manage to the other side of this pandemic.