Revising GDP: The challenge of uncertainty

The following blog was originally published by ESCoE, the Economic Statistics Centre of Excellence, following a panel session at the ESCoE 2024 conference.

Most important statistics are actually an “estimate” – a statistical judgement that is subject to some degree of uncertainty. So how can we best communicate that uncertainty while still maintaining trust in the statistics themselves?

Following criticism of notable UK GDP revisions in 2023, and the resulting Office for Statistics Regulation review of the Office for National Statistics’ (ONS’s) approach to GDP, a panel session at ESCoE’s 2024 conference explored how to communicate this uncertainty.

Session panellists Marianthi Dunn (then of the Office for Statistics Regulation), Sumit Dey-Chowdhury (Office for National Statistics), Johnny Runge (King’s College London) and chair Ed Humpherson (Office for Statistics Regulation) reflect on the session and the challenge of moving from public criticism to confidence.

No one should have been surprised – or upset – by the reaction to the revisions (Chris Giles, Financial Times)

“The revisions in 2023 were historically large: a cumulative 1.8% increase in the volume of output relative to the previous statistics is significant in any context. Moreover, in the context of UK trend growth of little over 1%, it represents almost two years of economic progress.

Furthermore, the level of UK GDP relative to other countries was a key measure of UK performance. The ONS used the pre-pandemic level extensively as a headline baseline. And in economic debate, international comparisons of this measure (also published regularly by the ONS) became, for some commentators, the gauge of the government’s success. At no time did the ONS or the UK Statistics Authority (UKSA) suggest such comparisons were inappropriate.

So, no one at ONS or UKSA should have been surprised at the political and press reaction to the figures. Policy makers had to change thinking on fiscal and monetary policy, and the media were understandably interested. And there is little public understanding of GDP, let alone its measurement.

To make matters worse, the ONS release of Blue Book figures on 1 September was poorly written, which compounded issues around the narrative and public understanding. The main change to GDP was buried beneath secondary information, and there was little discussion of significant sectoral changes, such as the change in the contribution of steel.

Turning to the media coverage, I felt that most of it was reasonable and not excessive, with a focus on the factual reporting of the change. There were some more pejorative pieces, but they seemed justified given the scale of the change.”

But there are lessons to learn (Marianthi Dunn, Office for Statistics Regulation (OSR))

“Drawing on the OSR report into GDP revisions, it is important to recognise the challenges of measuring GDP. These challenges occur even during periods of normal economic growth, and are compounded in times of rapid economic change, like the pandemic. And it is during times like these that reliable and timely statistics are needed most. This is why international guidance recommends using three approaches to the measurement of GDP: production, expenditure and income. It is also important to note that this experience of revisions wasn’t unique to the UK: most countries encountered similar challenges in estimating GDP during COVID-19.

GDP is a continuously evolving estimate, updated as new data become available. Revisions are normal, unavoidable and an inherent part of measuring the complexity of contemporary economies. They should not be seen as errors or corrections, much less as blunders or failures.

Despite these challenges, learnings from 2023 prompt three key questions for producers of economic statistics:

  1. “How is the uncertainty that underlies GDP estimates, as they evolve, communicated?” The aim is to use sufficient qualitative and quantitative analysis to enhance public understanding of uncertainty, without reducing trust in these estimates.
  2. “How do the different users interpret the statistics, given their uncertainty?” During times of significant cyclical and structural changes, there is greater interest in these high-profile statistics. It is also important to improve access to, and usability of, explanatory information on GDP revisions.
  3. “How do revisions reflect the main economic story?” The focus should be on improving access to data. During periods of significant economic change, there is a greater need to estimate GDP more precisely, using signals from different parts of the economy. The aim is to think about what data sources and methods can be used to reduce reliance on the more traditional assumptions of the production approach.”


Communicating uncertainty is not straightforward (Sumit Dey-Chowdhury, ONS)

“There will always be uncertainty about the future, and the public may have a good degree of sympathy with this. However, the public may not typically understand that there is also uncertainty about the past. This is one of the main challenges for National Statistical Institutes like the ONS in communicating uncertainty.

A common factor in this uncertainty is the trade-off between timeliness and accuracy. Timely estimates of GDP are based on incomplete data; more accurate estimates are possible once more data is available, but they are not as timely. In essence, this is the conceptual problem that leads to revisions.

While this conceptual problem is well-recognised, there are barriers to public understanding. The first is the idea of an early estimate of GDP, which is iterative and updated over time; this may clash with people’s perception of GDP as a fixed measurement of the economy.

Second, uncertainty can relate to very different economic circumstances. We can see this when comparing the Global Financial Crisis, COVID-19 and the cost-of-living crisis. The uncertainty of the Global Financial Crisis involved the impact of asset valuations, liquidity crunches and solvency. The uncertainty of the pandemic was largely related to a huge shift in the nature and location of work. And in the context of the cost-of-living crisis, the uncertainty lies in capturing how individual households and firms are responding to large price changes, including in how we produce real, or volume, estimates of GDP.

In this context, I believe the OSR’s recommendations could go further. Of course, there is value in exploring how we can improve communications (recommendation 2 of the Review). However, practical experiences of communicating uncertainty suggest that it is difficult. Additionally, the recent Bernanke Review highlighted the challenges of the Bank of England’s use of fan charts to convey uncertainty. This also has implications for official statistics.

Undoubtedly, the most important characteristic for a statistics producer is transparency: transparency about the data sources; transparency about the methods; and transparency about what the data does and doesn’t mean. The ONS already publishes information on the revision performance of early estimates of GDP but could do more to enhance transparency.”
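One concrete form this transparency takes is a summary of how early GDP estimates have tended to be revised. A minimal sketch of such a revision-performance summary is below; the function name and all figures are hypothetical illustrations, not real ONS data or the ONS’s actual methodology.

```python
# Illustrative sketch: summarising the revision performance of early GDP
# estimates. All growth figures below are made up for illustration.

def revision_statistics(early, mature):
    """Return (mean revision, mean absolute revision) in percentage points.

    A mean revision far from zero suggests systematic bias in early
    estimates; the mean absolute revision measures their typical size
    in either direction.
    """
    revisions = [m - e for e, m in zip(early, mature)]
    n = len(revisions)
    mean_rev = sum(revisions) / n
    mean_abs_rev = sum(abs(r) for r in revisions) / n
    return mean_rev, mean_abs_rev

# Hypothetical first estimates vs. later, more mature estimates (% growth)
early = [0.2, -0.1, 0.3, 0.1, 0.4, 0.0]
mature = [0.3, 0.0, 0.3, 0.2, 0.5, 0.1]

mean_rev, mean_abs_rev = revision_statistics(early, mature)
print(f"Mean revision: {mean_rev:+.2f}pp")
print(f"Mean absolute revision: {mean_abs_rev:.2f}pp")
```

Publishing statistics like these alongside each early estimate gives users a quantitative sense of how much the headline figure might move as more data arrive.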


What is a reasonable aim for us to have when communicating uncertainty? (Johnny Runge, King’s Policy Institute)

“While I see strong reasons to communicate uncertainty to policymakers and other expert audiences, I am conflicted on communicating uncertainty to a public, non-technical audience, at least before that uncertainty comes to fruition and changes the economic narrative.

Evidence for communicating uncertainty:

On one hand, I can see some clear potential advantages to communication of uncertainty. There is support from online experiments on communicating data uncertainty when publishing economic estimates, including papers by David Spiegelhalter (on unemployment), Galvao and Mitchell (on GDP), and Galvao (on productivity). The two Galvao papers were ESCoE research projects.

All three papers find that communicating uncertainty can lead the public to (rightly) perceive greater uncertainty in the numbers. Moreover, the papers find no evidence to suggest that conveying uncertainty undermines trust in the numbers and the source, though it does not increase trust either.

 

Evidence against communicating uncertainty:

  • My research on public understanding of economic statistics recommends communicating in a simpler and more engaging way and focusing on what is relevant and interesting for the public: in short, to go back to basics. Communicating uncertainty does not feel like going back to basics.
  • Online experiments are not real life. In real life, uncertainty communication may be picked up by journalists and politicians, who may cherry-pick figures from the uncertainty message: higher and lower figures could be used to paint very different pictures of the economy. Indeed, our research generally shows people are most sceptical about statistics when they perceive that the figures can be manipulated to fit a specific narrative. And emphasising uncertainty may create exactly this perception in the minds of some members of the public.
  • To investigate this further, I undertook 20 qualitative interviews. These pointed to significant nuance behind the online experiment findings, with many different interpretations, including understandings and misunderstandings of the communication tools. On GDP, people did not care enough to have strong views on uncertainty. On the other hand, people were often very surprised about the levels of uncertainty around employment.

So, what’s the way forward? I support the OSR’s recommendation for the ONS to take charge of this narrative. But I am sceptical about offering the communication of uncertainty to the wider public as a remedy to the challenges faced by statistics producers like ONS.”
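The experiments discussed above typically present a point estimate alongside a "likely range". A minimal sketch of that presentation style follows; the point estimate and interval width are hypothetical, and real published intervals would come from a producer’s own revisions analysis.

```python
# Sketch of presenting a GDP growth estimate with a "likely range", the
# interval style tested in uncertainty-communication experiments.
# The point estimate and half-width below are hypothetical.

def with_likely_range(point: float, half_width: float) -> str:
    """Format a growth estimate with a symmetric uncertainty interval."""
    low, high = point - half_width, point + half_width
    return f"GDP grew by {point:.1f}% (likely range {low:.1f}% to {high:.1f}%)"

print(with_likely_range(0.3, 0.2))
# prints "GDP grew by 0.3% (likely range 0.1% to 0.5%)"
```

The design question the panel debates is not how to produce such a sentence, but whether putting the range in front of a general audience helps or harms understanding and trust.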

 

Towards common ground (Ed Humpherson, Office for Statistics Regulation)

“The views within this panel session did not point to a single, clear consensus, reflecting the challenges of this complex topic.

Uncertainty is inherent to the measurement of the economy, and developing ways of communicating it is an important area of focus. This operates at two levels: first, at the basic level of how individual statistics releases are explained. This involves ensuring that major shifts in the economic story are recognised, and that odd patterns in the data are identified and explained.

Second, statistics producers must ensure that the inherent uncertainty of GDP estimates is always conveyed, recognising that revisions may occur. This may require ongoing experimentation to identify the best communications approaches.

Even with the best communications approaches, there will still be problems. At heart, if the economic story changes, then this will always be big news and may affect public trust in economic measurement.”

 

ESCoE’s 2024 Conference on Economic Measurement took place on 15-17 May at Alliance Manchester Business School. The conference focuses on recent research advances in economic measurement and statistics. Slides and recordings from sessions (where available) are now on the ESCoE website.

ESCoE blogs are published to further debate. Any views expressed are solely those of the author(s) and so cannot be taken to represent those of the ESCoE, its partner institutions or the Office for National Statistics.


The people behind the Office for Statistics Regulation in 2020

This year I’ve written 9 blogs, ranging from an exploration of data gaps to a celebration of the armchair epidemiologists. I was thinking of making it to double figures by setting out my reflections across a tumultuous year and describing my pride in what the Office for Statistics Regulation team has delivered. But, as so often in OSR, the team is way ahead of me. They’ve pulled together their own year-end reflections into a short summary. Their pride in their work, and their commitment to the public good of statistics, say far more than anything I could write.

So here it is (merry Christmas)

Ed Humpherson

Donna Livesey – Business Manager

2020 has been a hard year for everyone, with many very personally affected by the pandemic. Moving from a bustling office environment to living and working home alone had the potential to make for a pretty lonely existence, but I’ve been very lucky.

This year has only confirmed what a special group of people I work with in OSR. Everyone has been working very hard, but we have taken time to support each other, to continue to work collaboratively to find creative solutions to new challenges, and to generously share our lives, be it our families or our menagerie of pets, albeit virtually.

I am so proud to work with a team that have such a passion for ensuring the public get the statistics and data they need to make sense of the world around them, while showing empathy for the pressures producers of statistics are under at this time.

We all know that the public will continue to look to us beyond the pandemic, as the independent regulator, to ensure statistics honestly and transparently answer the important questions about the longer-term impacts on all aspects of our lives, and our children’s lives. I know we are all ready for that challenge, as we are all ready for the day when we can all get together in person.

 

Caroline Jones – Statistics Regulator, Health and Social Care Lead

2020 started off under lockdown, with the nation gripped by the COVID-19 pandemic and avidly perusing the daily number of deaths, number of tests, volume of hospitalisations and number of vaccines. This level of anxiety has pushed more people into contacting OSR to ask for better statistics, and it has been a privilege to work at the vanguard of the improvement to the statistics.

To manage the workload, the Health domain met daily with Mary (Deputy Director for Regulation) and Katy, who manages our casework, so we could coordinate the volume of health-related casework coming in. We felt it important to deal sympathetically with statistics producers, who have been under immense pressure this year, while ensuring they changed their outputs to produce the best statistics possible. It’s been rewarding to be part of that improvement and change, but we still have a lot of work to do in 2021 to continue to advocate for better social and community care statistics.

 

Leah Skinner – Digital Communications Officer

As a communications professional who loves words, I very often stop and wonder how I ended up working in an environment with so many numbers. But if 2020 has taught me anything, it’s that communicating those numbers in a way that the public can understand is crucial to maintaining public trust in statistics.

This has made me reflect on my own work, and I am more determined than ever to make our work, complex as it can be, as accessible and as understandable to our audiences as possible. For me, the highlight of this year has been watching our audience grow as we have improved our Twitter outputs and launched our own website. I really enjoy seeing people who have never reached out to us before contacting us to work with us, whether it be to do with Voluntary Application of the Code, or to highlight casework.

As truly awful as 2020 has been, it is clear now that the public are far more aware of how statistics affect our everyday lives, and this empowers us to ask more questions about the quality and trustworthiness of data and hold organisations to account when the data isn’t good enough.

 

Mark Pont – Assessment Programme Lead

For me, through the challenges of 2020, it’s been great to see the OSR team show itself as a supportive regulator. Of course we’ve made some strong interventions where these have been needed to champion the public good of statistics and data. But much of our influence comes through the support and challenge we offer to statistics producers.

We published some of our findings in the form of rapid regulatory review letters. However, much of our support and challenge was behind the scenes, which is just as valuable.

During the early days of the pandemic we had countless chats with teams across the statistical system as they wrestled with how to generate the important insights that many of us needed, all in the absence of the usual long-standing data sources, and while protecting often restricted and vulnerable workforces who were adapting to new ways of working. It was fantastic to walk through those exciting developments with statistics producers, seeing first-hand the rapid exploitation of new data sources.

2021 will still be challenging for many of us. Hopefully many aspects of life will start to return to something closer to what we were used to. But I think the statistical system, including us as regulators, will start 2021 from a much higher base than 2020 and I look forward to seeing many more exciting developments in the world of official statistics.

 

Emily Carless – Statistics Regulator, Children, Education and Skills Lead

2020 has been a challenging year for producers and users of children, education and skills statistics, and one that has had a life-changing impact on the people the statistics are about. We started the year polishing the report of our review of post-16 education and skills statistics, and are finishing it polishing the report of our review of the approach to developing the statistical models designed for awarding grades. These statistical models had a profound impact on young people’s lives and on public confidence in statistics and statistical models.

As in other domains, statistics have needed to be developed quickly to meet the need for data on the impact of the pandemic on children and the education system, and to inform decisions such as those around re-opening schools. The demand for statistics in this area continues to grow to ensure that the impact of the pandemic on this generation can be fully understood.