Power in numbers: A collaborative approach to exploring public perceptions of public good

Dr Mary Cowan, Research Specialist at OSR, and Shayda Kashef, Public Engagement Manager at ADR UK, describe the motivations behind the recent report on public perceptions of the public good and discuss the benefits of research collaborations.

Many organisations working in or around data are driven by research that can help to answer some of society’s most pressing questions, with the ultimate aim of serving the public good. ‘Public good’ is a phrase commonly used within the context of data and statistics; as people working in this space, we have an understanding of this phrase, but what do the public think this means?

This is an important question, as the data we use for research and statistics comes either directly or indirectly from the public. So, this year, we worked in partnership to shed some light on this topic, with the aim of developing a resource for others looking for similar answers.

In the Office for Statistics Regulation (OSR), we have a vision that statistics will serve the public good. To inform our assessments of whether statistics are serving the public good, we engage with users of statistics to gauge their opinion on how well their needs are being served. Armed with that information, and the pillars of our Code of Practice for Statistics, we then work with statistics producers to help them realise the full benefits of their statistics.

Similarly, at ADR UK our mission is to harness the potential of administrative data for research in the public interest. Administrative data is the public’s data: therefore, in addition to making sure this data is used ethically and responsibly, we have a duty to engage the public in how and why their data is used at every stage of our work. This is to ensure our work demonstrates trustworthiness and maximises the public benefit of administrative data research.

At ADR UK and OSR, we rarely engage with people who do not use data or statistics. They often don’t have a reason to engage with us or perhaps even know we exist. But we believe the views of the general public, on what the public good use of data for research and statistics means for them, are integral to achieving our missions.

For these reasons, we sought to engage with members of the public who had little or no formal knowledge of data or statistics. We recruited 68 participants from across the UK, in person and online, whose diverse, and sometimes contradictory, perspectives enabled us to interrogate our own understandings and practices in a new way. This is why this project provides so many important insights for us and others who work with data and statistics.

We attended every workshop and heard our participants speak with eloquence, passion and interest as they articulated their views on public good and their ideas for what data and statistics could be achieving for society.

While many people struggle to define ‘the public good’, considering it a vague term, our participants guided each other in thoughtful discussions where they teased out their ideas about what public good relates to. Ideas that emerged from these discussions included:

  • the public should be involved in decision making to maintain neutrality and avoid politicisation
  • data for research and statistics should aim to address inequalities
  • the public good use of data for research and statistics should be clearly communicated and should minimise harm
  • best practice safeguarding frameworks should be universally applied for data sharing.

You can hear directly from our participants in the expansions of these findings in the full report. 

This was the first time our two organisations had collaborated on a large-scale research project. Public engagement can be resource-intensive but, by working together, we have achieved a milestone for both organisations that might have been impossible without the other’s support. Both organisations are heavily invested in serving the public good, and collaborating allowed us to examine our findings through different lenses and consider the implications from both a data and a statistics perspective.

Publishing this report may signal the end of the project, but it is also another important step toward understanding how data for research and statistics serves the public good.

Weights and measures: how consultations relate to OSR’s role

In our latest blog, Director General Ed Humpherson responds to concerns raised with OSR regarding the recent consultation by the Department for Business, Energy & Industrial Strategy on the Choice on units of measurements: marking and sales.

This blog was amended on 4 October 2022 to provide further clarity about question design

Since the start of the Government’s consultation on weights and measures, a lot of people have raised concerns about it with us at OSR. At the time of writing, we have received over 150 emails and many others have tagged us on Twitter regarding the recent consultation by the Department for Business, Energy & Industrial Strategy on the Choice on units of measurements: marking and sales that closed on 26 August 2022.

Before considering the specific question that has been raised with us, let’s first set out some background to how consultations relate to our role.

Consultations, done well, are a vital tool for government in developing policy. They can provide specific and qualitative insight to complement evidence from other sources like official statistics. They also help Government to understand different perspectives, foresee possible consequences and gather expert advice on implementation.

Our remit focuses on statistics. We set the standards for how Government collects and uses statistics. Consultations are relevant to our work in two senses. First, consultations can often draw on statistics, for example to illustrate the scale of an issue a policy intends to address. And consultations can also create statistical evidence – for example, the number of people responding to a particular question. In these senses, consultations are relevant to our work as OSR.

Turning to this particular consultation, the aim was to identify how the Government can give more choice to businesses and consumers over the units of measurement they use for trade, while ensuring that measurement information remains accurate.

However, when the consultation was launched, many stakeholders raised concerns surrounding the consultation questions and in particular question 3a. The question was as follows:

If you had a choice, would you want to purchase items (i) in imperial units? or (ii) in imperial units alongside a metric equivalent.

There was no option to select an outcome without imperial units at all. People felt that this was a potentially biased way of collecting information on public views on changes to measurement.

Given the concerns raised with us, we approached BEIS as the Department conducting the consultation. We wanted to understand the reasons for this question. They advised us that UK law currently requires metric units to be used for all trade purposes, with only limited exceptions. The purpose of the consultation was to identify how they can give greater choice to businesses and consumers over the units of measurement they use to buy and sell products. BEIS did also say that respondents had multiple opportunities to give a range of views through the consultation, and that all responses would be carefully considered.

This explanation was helpful. But it still doesn’t make this a good question design, because it doesn’t offer a complete range of responses. For example, including a ‘none of the above’ option would have allowed respondents to express a preference for a unit of measurement other than imperial, or imperial with metric. In the absence of such an option, the design in effect makes this an overly leading question.

So, what happens next? We would be surprised to see the evidence drawn from this specific question being used in any kind of formal quantitative way. If the results of this poorly designed question were presented in statistical terms (x% of people said…etc), then this would represent the generation of a statistical claim from a consultation. And in our view it would be potentially quite misleading as a representation of levels of support for any proposed policy change.
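The distortion described above can be made concrete with a small sketch. The numbers and preference labels below are entirely invented for illustration and are not drawn from the consultation responses; the point is only that when a question omits a response option, percentages tallied from the options that were offered can misrepresent the underlying distribution of views:

```python
# Illustrative sketch (invented numbers): how omitting a response option
# can distort headline percentages from a consultation question.
from collections import Counter

# Suppose respondents' true preferences looked like this (hypothetical):
true_preferences = (
    ["metric only"] * 60
    + ["imperial only"] * 15
    + ["imperial alongside metric"] * 25
)

# Question 3a offered only two options, so a "metric only" respondent
# must either skip the question or pick the closest available answer.
forced = [
    "imperial alongside metric" if p == "metric only" else p
    for p in true_preferences
]

def shares(responses):
    """Return each response's share of the total, as whole percentages."""
    counts = Counter(responses)
    total = len(responses)
    return {k: round(100 * v / total) for k, v in counts.items()}

print(shares(true_preferences))  # the 60% "metric only" view is visible
print(shares(forced))            # that view vanishes from the tallies entirely
```

Under these invented numbers, the forced-choice tallies would suggest 85% support for ‘imperial alongside metric’, even though most respondents actually preferred an option that was never offered.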

What is Levelling Up and how will its success be measured?

‘Levelling Up’ is now a term used daily in the media – but what does it mean and how are we, as the statistics watchdog, going to monitor its impact? Statistics Regulator Ben Bohane discusses…

Not long before I joined OSR as a Regulator, the Conservative Party made ‘Levelling Up’ a key part of their 2019 election manifesto. It focused on challenging and changing the geographical inequality in the UK, through investment and new infrastructure to allow ‘everyone the opportunity to flourish’.

Prior to working at OSR, I taught Economics to young people as a secondary school teacher. I found teaching young people how changes in the economy and government spending might impact their lives really rewarding. I think back to those young people now – do they understand what Levelling Up is amongst the media hype? How will the proposals outlined in the Levelling Up White Paper impact their lives and futures?

If I were to explain it to my students now, I would describe Levelling Up as a plan to eradicate regional disparities in the UK, raise living standards and provide greater opportunities to people, in communities and areas that have so far not had the success of more prosperous parts of the country.

But with confusion over what the concept really means, how can we measure something whose success means different things to different people? Back in March, OSR’s Director Ed Humpherson wrote about ‘Why I Love Evaluation’, stating that evaluation “provides evidence of what works; it supports good policy; it builds the skills and reputation of analysts; it helps scrutiny.” Ongoing evaluation of Levelling Up will be key to its success.

In OSR our focus is on ensuring that the existing statistics that can be used to measure the success of government policy are trustworthy, of sufficient quality and provide public value – but also that statistics are available in the first place. As the government highlights in the White Paper, many of the metrics that will be used to measure the success of Levelling Up are either not yet available or of insufficient quality. Clarity about what is being measured is important if people want to track progress through data.

In OSR we’ve already been working on public interest in regional disparities and fighting for a statistical system that is more responsive to regional and local demands.

Our Business Plan highlights that we have seen a growing public expectation that decisions affecting all aspects of our lives will be evidenced by trustworthy and accessible statistics. Over the coming months and years, we will continue to review new statistics and data sources from the Department for Levelling Up, Housing and Communities, Office for National Statistics and other data providers as they are developed to ensure that evidence and evaluation is at the forefront of pushing the plans forward.

Our regulatory programme for this year focuses on projects that will improve public understanding of the issues, current and emerging, that people want to be sighted on. As the statistics regulator, reviewing the statistics used in Levelling Up, we will be tracking the implementation of the GSS Sub National Data strategy and new tools such as the ONS Sub National Indicators Explorer, ensuring statistics are the best quality they can be and clearly focussed on measuring the outlined Levelling Up missions.

Statistics supported by clear analysis and evaluation will provide the evidence to measure the impacts, successes and failures of Levelling Up – and any future government policies to address regional disparities and improve people’s lives. As the government implements policies to address regional inequalities – and businesses and households respond – we will focus on ensuring that the statistics both accurately measure and live up to this ambitious long-term strategy. It is important to me as a statistics regulator that we do this. After all, the vision of Levelling Up is so important to the futures of those young people I used to teach.

Appendix 1: Background foundation work on surveys that are used to produce economic statistics

ONS Purchases Survey statistics (December 2019)

ONS reintroduced the survey following the National Statistics Quality Review of National Accounts to provide better information about purchasing patterns by businesses.

Our review found that the quality of outputs from the survey is still being improved, which reflected ONS’s own narrative that it would normally be several years before a new survey was producing statistics that could be used with confidence. It also reflected that the Annual Purchases Survey aims to collect variables that do not naturally fit with many businesses’ operational models. Our report noted the discrepancy between estimates of intermediate consumption derived from the Annual Purchases Survey and the Annual Business Survey. We said that understanding these differences, explaining them well and using them to further improve the statistics is an essential part of demonstrating that the quality of the statistics meets users’ needs.

ONS UK Business Demography statistics (October 2020)

We reviewed ONS Business Demography statistics because we felt they should be considered key economic indicators. They are not regarded as such because they are not as good or as useful as they should be. The ONS’s business register – the Inter-Departmental Business Register (IDBR) – holds a wealth of data on the UK’s business population, some of which are used to produce business demography statistics; the remainder is a largely untapped resource. In response to the COVID pandemic, ONS introduced a weekly indicator of business births and deaths and a quarterly series of experimental business demography statistics. These innovations presented a platform for further development of the statistics. However, some of the required improvements rely on significant investment, and we said that work to develop ONS’s business register should urgently be restarted to ensure that users’ needs for business population statistics are met. In our review we made several short-term recommendations for ONS:

  • demonstrate progress in understanding the access difficulties users are experiencing when using IDBR data and linking it with other data
  • publish its plans for publishing more timely business demography statistics, and its plans for developing the recently introduced quarterly experimental statistics
  • publish a narrative covering what ONS already knows about the range of key data quality issues, building on the supporting quality information provided with the new quarterly experimental statistics
  • publish its plans to restart and resource work to develop its business register

We also said that, in the longer term, ONS should publish a plan which includes specific actions, deliverables and a timetable explaining how it will address the improvements identified in the report, including plans for reviewing the funding of the Statistical Business Register.

ONS Annual Business Survey statistics (September 2021)

We reviewed the ONS Annual Business Survey (ABS) and found that the significant delay in the publication of ABS data means that the data are not always used to measure the ongoing impacts of structural and cyclical changes to the UK economy. As a result, ABS data are not fully meeting users’ needs for timely and detailed data on business performance.

We found that ONS’s focus on transforming short-term surveys means there has been a lack of investment in finance, staff and systems, and so ABS data have been unable to keep up with changing demands on their use. This lack of investment has curtailed ONS’s efforts to improve the detail and timeliness of ABS data.

We found that a lack of investment has been a common theme of OSR’s recent assessments of ONS’s structural economic surveys and statistics. We strongly urged ONS to revisit the investment needs of these outputs, to ensure structural economic data are available to assess, for example, the ongoing impact of the economic shocks of Brexit and the pandemic.

Appendix 2: OSR work on regional statistics and Levelling Up

ONS Statistics on Regional Gross Value Added (August 2017)

“Many of the R-GVA users that we spoke to cited poor timeliness as a limitation of these statistics” and “that unless the R-GVA statisticians find new sources that provide the same level of detailed information more quickly than the current sources (which they indicated to us is unlikely in the short term), the timeliness of these statistics is unlikely to change significantly”.

“ONS might do more to bring out the differences between the regions through the proportions of people in the region who are economically inactive, which can affect the GVA per head statistics and the impact of commuting on the statistics” and requested ONS “to work with its national and regional stakeholders to bolster the statistical services such as information, advice and guidance available to provide even greater insight in sub-regions (particularly new city-regions) and in preparing contextual information to aid regional and country media in interpreting the statistics”.

ONS Statistics on Regional Gross Value Added (Phase Two) (June 2018)

We asked ONS to make further improvements, for example, “investigate whether improvements in the quality of deflators by adopting regional price statistics could be achieved technically and cost-effectively taking account of expected use of the statistics and user need”. We also asked ONS to “review the best way of making quality metrics both more useable to a less expert audience and more accessible generally”.

HM Treasury Statistics on Government Spending: Country and Regional Analysis (May 2019)

We asked HM Treasury to:

  • collaborate with producers of other public finance statistics and with analysts in the countries and regions to seek views, update their understanding of users’ needs to better support the use of these statistics
  • communicate effectively with the widest possible audience to increase awareness of the statistics and data
  • present CRA data in a more engaging way that supports and promotes use by all types of users and those with interests in spending at programme and service levels (sub functional levels)
  • test the strength of user need for CRA on a ‘where-benefits’ basis, examine the feasibility of collecting data on this basis and the trade-off between enhanced functionality and increased burden on data suppliers
  • provide a clear and comprehensive account of allocation methods in each annual CRA publication, including links to published documents about allocation methods in respect of all ongoing major project spending
  • ensure that users are provided with appropriate insights about changes in the data. This should include helping users understand impacts on the CRA data and provide links, when applicable, to other output areas where information on Brexit impacts has already been published
  • establish a development programme for these statistics and periodically review that programme; be open about progress towards meeting priorities and objectives; and arrange for users and other stakeholders to be involved in prioritising statistical plans
  • strengthen its arrangements for reviewing requests to allow pre-release access to new people; review the current list of those with pre-release access for CRA, with a view to minimising the numbers of individuals included and inform the Authority of the justification for each inclusion

ONS Experimental statistics on Regional Household Final Consumption Expenditure (HFCE) (January 2021)

We highlighted the potential of HFCE estimates as a highly important component in fully understanding regional economies. Prior to this there were no regional estimates of the expenditure measure of GDP, except in Scotland, a topic we previously highlighted in our 2020 submission to the Treasury Select Committee’s inquiry into Regional Imbalances.

DLUHC Levelling Up Fund prospectus (March 2021)

The Levelling Up Fund prospectus included a list of local authorities by priority category.

However, initially no description of the methodology used was attached to enable users to understand how local authorities were allocated to priority areas. A week later DLUHC published a methodology document, but it was still not possible to recreate the full dataset used to allocate local authorities to priority areas.

We wrote to DLUHC publicly highlighting our concerns about the transparency of data related to the Levelling Up Fund, and requested that DLUHC publish the data supporting the allocation of priority areas to enhance public confidence in the decisions being made.

As a result, DLUHC published the Levelling Up Fund: prioritisation of places model, which showed all the steps that were taken when using data to assign Local Authorities in England, Scotland and Wales to categories 1, 2 and 3. The spreadsheet included a “data and input construction” tab which included links to the source data with explanations of the source and why it was chosen.

ONS Foreign Direct Investment (FDI) Statistics and DIT Inward Investment Statistics (April 2021)

As a result of our review, new questions were added to the quarterly and annual FDI surveys to collect more granular data on sub-national FDI, and ONS is now publishing experimental UK sub-national FDI statistics.

NISRA BESES statistics (December 2021)

As a result of our review, NISRA will be publishing more timely imports data and has developed an interactive dashboard that provides more granular monthly international trade data on products.

ONS Income Estimates for Small Areas statistics (January 2022)

We suggested how further value could be added by ONS understanding the needs of current non-users who require income estimates at lower levels of geography.

Users want to be able to aggregate estimates for lower-layer super output areas into bespoke geographies, but the estimates are given for middle-layer super output areas, which are too large for users’ needs.

DLUHC planning applications in England statistics and Homes England Housing Statistics (March 2022)

We felt at the time that planning performance and planning reforms would likely in some part be included in new Levelling Up legislation, given its assumed focus on local area development.

Local authority planning application performance had also been identified at the time as a priority departmental outcomes metric in the 2021 Spending Review.

We advised further developments to the statistics. One of these developments included sub-national commentary, which should be introduced to help explore, for example, trends in planning to support regeneration in the 20 English towns and cities prioritised in the Levelling Up white paper.

We found the statistics could be further enhanced if Homes England were to publish information about aspects of quality, for example, limitations of data sources, quality assurance checks carried out by data suppliers, and the team’s assessment of data quality against our quality assurance of administrative data (QAAD) matrix.

We also asked Homes England to consider how any uncertainty in the statistics might be more clearly communicated to users, as the latest data are provisional and subject to revision.

Finally, we suggested further insight and context should be added by enhancing the narrative and analysis provided for users who wish to explore the topic further.

Appendix 3: Treasury Committee evidence

2019 response

Key point

There is a range of official statistics on regional economic performance. They should be considered alongside other forms of data published by Government and others.

What we said

All data, whether classified as official statistics or not, should seek to adhere to high standards of trustworthiness, quality and value (which we describe as voluntary adoption of the Code of Practice’s pillars).

Key point

There are some limitations to the current data sources, both in terms of data gaps and in terms of quality.

What we said

In our written evidence referring to regional economic data, we highlighted “the quality of regional data is affected by the granularity that the data sources can provide, and/or the timeliness of the data provision”. Regional data is more volatile than national estimates and there are significant challenges in forming regional estimates of GDP.

We said “In arriving at aggregate estimates, statisticians often combine both administrative and survey data sources….and then disaggregate to provide regional breakdowns (a top-down approach). Survey data is often limited in its depth: for example, the data used to compile R-GVA can become stretched at lower geographies, becoming increasingly volatile as it is disaggregated further.”

Key point

There is a significant use of modelled data, which apportions national data to regions using formulae, rather than directly observed data, which would be gathered at the local level.

What we said

“During our regulatory work, we received feedback from users of regional and sub-regional economic data expressing concern that they can’t tell whether the data they are using is based on observed economic behaviour or come from modelled estimates. They view data based on observed estimates as more reliable than modelled estimates”.

At our request, the ONS conducted research into how much data measuring economic growth are directly observed at a regional level and collected in a way that can be immediately and wholly assigned to a single region, and how much data are modelled to provide regional estimates.
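The top-down apportionment described in this evidence can be sketched in miniature. All figures, region names and the `employment_share` weights below are invented for illustration; the weights stand in for whatever formula a producer might actually use to split a national aggregate across regions:

```python
# Minimal sketch (invented figures) of "top-down" apportionment:
# a national total is split across regions using a formula (here,
# employment shares) rather than being observed locally.

national_gva = 2_000.0  # hypothetical national total, in £bn

# Apportionment weights, e.g. each region's share of employment (invented)
employment_share = {"North": 0.25, "Midlands": 0.30, "South": 0.45}

# Modelled regional estimates: the national total scaled by formula weights
modelled = {region: national_gva * w for region, w in employment_share.items()}

# Directly observed regional data (also invented) need not match the model,
# for example if productivity per worker differs between regions
observed = {"North": 430.0, "Midlands": 590.0, "South": 980.0}

for region in employment_share:
    gap = observed[region] - modelled[region]
    print(f"{region}: modelled {modelled[region]:.0f}, "
          f"observed {observed[region]:.0f}, gap {gap:+.0f}")
```

The modelled estimates sum to the national total by construction, which is exactly why users in our casework said they could not tell, from the published figures alone, whether a regional number reflected observed economic behaviour or an apportionment formula.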

Key point

It may be worth considering a network of regional statistical observatories, akin to the Bank of England’s regional agents, that can help provide ONS and others with better insight into regional economic issues.

What we said

We wanted to highlight the benefits of a presence outside the offices in London, Newport and Titchfield – both to better understand the dynamics of regional economies, and to be closer to users with a regional focus (like mayoral combined authorities).

2020 response

“Developments [in regional statistics] will be enabled by better access to administrative data, where ONS can provide enhanced (ideally flexible) geographies with more use of direct estimation”.

“Regional performance information published by the UK Government can be found in some departmental annual reports and accounts but is not summarised in any compendium”.

We laid out several important conditions for publishing regional economic forecasts that could be adopted to help people make judgements about the UK and regional economy.

One of these conditions was: “It would be important to communicate the uncertainties associated with any regional GVA forecasts. For example, there are deficiencies in historical GVA data. Forecasts will only be as good as the data they rely on”.

The people behind the Office for Statistics Regulation in 2020

This year I’ve written nine blogs, ranging from an exploration of data gaps to a celebration of the armchair epidemiologists. I was thinking of making it to double figures by setting out my reflections across a tumultuous year and describing my pride in what the Office for Statistics Regulation team has delivered. But, as so often in OSR, the team is way ahead of me. They’ve pulled together their own year-end reflections into a short summary. Their pride in their work, and their commitment to the public good of statistics, say far more than anything I could write; it’s just a much better summary.

So here it is (merry Christmas)

Ed Humpherson

Donna Livesey – Business Manager

2020 has been a hard year for everyone, with many very personally affected by the pandemic. Moving from a bustling office environment to living and working home alone had the potential to make for a pretty lonely existence, but I’ve been very lucky.

This year has only confirmed what a special group of people I work with in OSR. Everyone has been working very hard, but we have taken time to support each other, to continue to work collaboratively to find creative solutions to new challenges, and to generously share our lives, be it our families or our menagerie of pets, albeit virtually.

I am so proud to work with a team that have such a passion for ensuring the public get the statistics and data they need to make sense of the world around them, while showing empathy for the pressures producers of statistics are under at this time.

We all know that the public will continue to look to us beyond the pandemic, as the independent regulator, to ensure statistics honestly and transparently answer the important questions about the longer-term impacts on all aspects of our lives, and our children’s lives. I know we are all ready for that challenge, as we are all ready for that day when we can all get together in person.


Caroline Jones – Statistics Regulator, Health and Social Care Lead

2020 saw the nation go into lockdown, gripped by the COVID-19 pandemic and avidly perusing the daily number of deaths, number of tests, volume of hospitalisations and number of vaccines. This level of anxiety has pushed more people into contacting OSR to ask for better statistics, and it has been a privilege to work at the vanguard of the improvement to the statistics.

To manage the workload, the Health domain met daily with Mary (Deputy Director for Regulation) and Katy, who manages our casework, so we could coordinate the volume of health-related casework we were receiving. We felt it important to deal sympathetically with statistics producers, who have been under immense pressure this year, while ensuring they changed their outputs to produce the best statistics possible. It’s been rewarding to be part of that improvement and change, but we still have a lot of work to do in 2021 to continue to advocate for better social and community care statistics.


Leah Skinner – Digital Communications Officer

As a communications professional who loves words, I very often stop and wonder how I ended up working in an environment with so many numbers. But if 2020 has taught me anything, it’s that the communication of those numbers, in a way that the public can understand, is crucial to make sure that the public have trust in statistics.

This has made me reflect on my own work, and I am more determined than ever to make our work, complex as it can be, as accessible and as understandable to our audiences as possible. For me, the highlight of this year has been watching our audience grow as we have improved our Twitter outputs and launched our own website. I really enjoy seeing people who have never reached out to us before contacting us to work with us, whether it be to do with Voluntary Application of the Code, or to highlight casework.

As truly awful as 2020 has been, it is clear now that the public are far more aware of how statistics affect our everyday lives, and this empowers us to ask more questions about the quality and trustworthiness of data and hold organisations to account when the data isn’t good enough.


Mark Pont – Assessment Programme Lead

For me, through the challenges of 2020, it’s been great to see the OSR team show itself as a supportive regulator. Of course we’ve made some strong interventions where these have been needed to champion the public good of statistics and data. But much of our influence comes through the support and challenge we offer to statistics producers.

We published some of our findings in the form of rapid regulatory review letters. However, much of our support and challenge was behind the scenes, which is just as valuable.

During the early days of the pandemic we had countless conversations with teams across the statistical system as they wrestled with how to generate the important insights that many of us needed – all in the absence of the usual long-standing data sources, and while protecting often restricted and vulnerable workforces who were adapting to new ways of working. It was fantastic to walk through those exciting developments with statistics producers, seeing first-hand the rapid exploitation of new data sources.

2021 will still be challenging for many of us. Hopefully many aspects of life will start to return to something closer to what we were used to. But I think the statistical system, including us as regulators, will start 2021 from a much higher base than 2020 and I look forward to seeing many more exciting developments in the world of official statistics.


Emily Carless – Statistics Regulator, Children, Education and Skills Lead

2020 has been a challenging year for producers and users of children, education and skills statistics, and one which has had a life-changing impact on the people the statistics are about. We started the year polishing the report of our review of post-16 education and skills statistics, and we are finishing it polishing the report of our review of the approach to developing the statistical models designed for awarding grades. These statistical models had a profound impact on young people’s lives and on public confidence in statistics and statistical models.

As in other domains, statistics have needed to be developed quickly to meet the need for data on the impact of the pandemic on children and the education system, and to inform decisions such as those around re-opening schools. The demand for statistics in this area continues to grow to ensure that the impact of the pandemic on this generation can be fully understood.

How to solve a puzzle like productivity

This is a guest blog from Andy Haldane and Gavin Wallis, Industrial Strategy Council


There’s a reason that the UK’s weak productivity performance since the financial crisis is still referred to as a “puzzle”. Despite copious amounts of research, including some of our own, we still have a long list of candidate explanations. In part that’s because there is unlikely to be a simple single explanation. But it also reflects us not having good enough data to pin down the causes more precisely. And that makes designing a good policy response, such as an Industrial Strategy, harder – existing research tells us which horses we should back, but not which ones we should bet biggest on.

What is often referred to as “firm dynamics” is persistently found to be an important factor in the productivity puzzle. This basically means the process by which firms are created, grow and close. It is also affected by how the economy, and economic policy, supports this process – for example, by facilitating the reallocation of resources (capital, labour, ideas, innovation) from firms that have closed to those being created. Empirical estimates suggest reduced firm dynamics could account for something like a third of the productivity puzzle.

Getting a good handle on firm dynamics requires good data on business demography – that is to say, a register that tracks firms being created, how they change over time, and if or when they close. To understand firms, we need to track their behaviour over the full lifecycle. Indeed, good and timely data on business behaviour is more important than ever in the current climate, as we try to understand the impact of the COVID crisis on business output and employment in the near term, and on productivity and innovation over the medium term.

But the benefits of good business demography data go way beyond regular monitoring of business dynamics. Firms fill in lots of different surveys, and often the same survey in successive years. Combining the insight from these surveys is only possible with good business demography data, enabling richer insights into different facets of firms’ behaviour.

Early UK work in this area focused on combining successive years of a single survey, the Census of Production. That was a lot harder to do than you might imagine, but yielded some important insights. For example, Griffith (1999) showed that foreign-owned plants in the UK car industry have a substantial labour productivity advantage over UK-owned plants.

This has since progressed into linking more than one dataset. For example, Rogers (2006) combined the survey that asks about R&D (Research and Development) spending (the BERD) with the Annual Respondents Database (ARD), which includes a firm-level measure of productivity. That work produced firm-level estimates of the rates of return to R&D for small and medium-sized firms (SMEs). Those estimates were very high, suggesting SMEs were constrained in their R&D spending.

These two examples illustrate the usefulness of data linking in understanding firm behaviour, productivity dynamics and possible policy approaches to raising productivity. Linking datasets opens up a huge range of analytical possibilities. But it’s fair to say there are often significant barriers to conducting such analysis, including data access and data usability. Good business demography data helps lower those barriers.
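To make the idea concrete, the kind of survey linkage described above hinges on a shared business-register identifier. The sketch below is purely illustrative – the column names, firm IDs and figures are invented, not the actual BERD or ARD files – but it shows how a common register ID lets two firm-level sources be joined, and how the match rate reveals coverage gaps between them:

```python
import pandas as pd

# Hypothetical firm-level R&D survey (in the spirit of the BERD)
rd = pd.DataFrame({
    "firm_id": ["F001", "F002", "F003"],      # business-register identifier
    "rd_spend": [120.0, 45.5, 300.0],         # annual R&D spend, illustrative
})

# Hypothetical firm-level productivity dataset (in the spirit of the ARD)
prod = pd.DataFrame({
    "firm_id": ["F001", "F002", "F004"],
    "output_per_worker": [52.1, 48.3, 61.0],  # illustrative productivity measure
})

# The shared register ID is what makes linkage possible; the `_merge`
# indicator column shows which firms appear in both sources.
linked = rd.merge(prod, on="firm_id", how="outer", indicator=True)
match_rate = (linked["_merge"] == "both").mean()
print(linked)
print(f"Share of records matched in both surveys: {match_rate:.0%}")
```

With a reliable register, the `firm_id` join is trivial; without one, analysts fall back on error-prone name-and-address matching, which is part of why a better business register lowers the barriers described above.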

That is why we very much welcome the Office for Statistics Regulation review of business demography statistics and support its recommendations to invest in improving them. Developing a new and improved business register could open up a wide range of analytical possibilities that could help support frontier research and improved policy making, including around issues of productivity and industrial strategy. This might not completely solve the productivity puzzle, but it would offer an invaluable trail of new clues.