The following blog was originally published by ESCoE, the Economic Statistics Centre of Excellence, following a panel session at the ESCoE 2024 conference.

Most important statistics are actually estimates – statistical judgements that are subject to some degree of uncertainty. So how can we best communicate that uncertainty while maintaining trust in the statistics themselves?

Following criticism of notable UK GDP revisions in 2023, and the resulting Office for Statistics Regulation review of the Office for National Statistics’ (ONS’s) approach to GDP, a panel session at ESCoE’s 2024 conference explored how best to communicate this uncertainty.

Session panellists Marianthi Dunn (then of the Office for Statistics Regulation), Sumit Dey-Chowdhury (Office for National Statistics), Johnny Runge (King’s College London) and chair Ed Humpherson (Office for Statistics Regulation) reflect on the session and the challenge of moving from public criticism to confidence.

No one should have been surprised – or upset – by the reaction to the revisions (Chris Giles, Financial Times)

“The revisions in 2023 were historically large: a cumulative 1.8% increase in the volume of output relative to the previous statistics is significant in any context. Moreover, set against UK trend growth of little over 1% a year, it represents almost two years of economic progress.

Furthermore, the level of UK GDP relative to other countries was a key measure of UK performance. The ONS used GDP relative to its pre-pandemic level extensively as a headline measure, and in economic debate international comparisons of this measure (also published regularly by the ONS) became, for some commentators, the gauge of the government’s success. At no time did the ONS or the UK Statistics Authority (UKSA) suggest such comparisons were inappropriate.

So, no one at the ONS or UKSA should have been surprised at the political and press reaction to the figures. Policymakers had to rethink fiscal and monetary policy, and the media were understandably interested. And there is little public understanding of GDP, let alone of its measurement.

To make matters worse, the ONS release of the Blue Book figures on 1 September was poorly written, which compounded the problems with the narrative and public understanding. The main change to GDP was buried beneath secondary information, and there was little discussion of significant sectoral changes, such as the change in the contribution of steel.

Turning to the media coverage, I felt that most of it was reasonable and not excessive, with a focus on the factual reporting of the change. There were some more pejorative pieces, but they seemed justified given the scale of the change.”

But there are lessons to learn (Marianthi Dunn, Office for Statistics Regulation (OSR))

“Drawing on the OSR report into GDP revisions, it is important to recognise the challenges of measuring GDP. These challenges arise even during periods of normal economic growth, and are compounded in times of rapid economic change, like the pandemic. And it is at times like these that reliable and timely statistics are needed most. This is why international guidance recommends using three approaches to the measurement of GDP: production, expenditure and income (summarised below). It is also important to note that this experience of revisions was not unique to the UK: most countries encountered similar challenges in estimating GDP during COVID-19.
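In simplified, textbook national-accounts notation, the three approaches measure the same aggregate from different sides of the economy. This is a standard summary rather than the ONS’s exact compilation method, and adjustments such as taxes less subsidies are omitted for brevity:

```latex
\begin{align*}
\text{Production:}  \quad & Y = \sum_{i} \text{GVA}_i   && \text{gross value added, summed over industries} \\
\text{Expenditure:} \quad & Y = C + I + G + (X - M)     && \text{consumption, investment, government, net exports} \\
\text{Income:}      \quad & Y = W + \Pi + \text{MI}     && \text{employee compensation, operating surplus, mixed income}
\end{align*}
```

In principle the three measures coincide; in practice each is compiled from different, incomplete data sources, which is one reason the estimates must be balanced and later revised.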

GDP is a continuously evolving estimate, updated as new data become available. Revisions are normal, unavoidable and an inherent part of measuring the complexity of contemporary economies. They should not be seen as errors or corrections, much less as blunders or failures.

Despite these challenges, the lessons of 2023 prompt three key questions for producers of economic statistics:

  1. “How is the uncertainty that underlies the GDP estimates, as they evolve, communicated?” The aim is to use sufficient qualitative and quantitative analysis to enhance public understanding of uncertainty, without reducing trust in the estimates.
  2. “How do different users interpret the statistics, given their uncertainty?” During times of significant cyclical and structural change, there is greater interest in these high-profile statistics. It is also important to improve access to, and the usability of, explanatory information on GDP revisions.
  3. “How do revisions reflect the main economic story?” The focus should be on improving access to data. During periods of significant economic change, there is a greater need to estimate GDP more precisely, using signals from different parts of the economy. The aim is to consider what data sources and methods can be used to reduce reliance on the more traditional assumptions of the production approach.


Communicating uncertainty is not straightforward (Sumit Dey-Chowdhury, ONS)

“There will always be uncertainty about the future, and the public may have a good degree of sympathy with this. However, the public may not typically understand that there is also uncertainty about the past. This is one of the main challenges for National Statistical Institutes like the ONS in communicating uncertainty.

A common factor in this uncertainty is the trade-off between timeliness and accuracy. Timely estimates of GDP are based on incomplete data; more accurate estimates are possible once more data is available, but they are not as timely. In essence, this is the conceptual problem that leads to revisions.
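To make the vintage idea concrete, here is a minimal Python sketch of how revision performance between an early estimate and a later, more complete one is commonly summarised. The quarterly growth figures are invented for illustration, not real ONS data:

```python
# Minimal sketch: summarising GDP revisions across estimate vintages.
# The quarterly growth figures below are illustrative, not real ONS data.

early  = [0.3, -0.1, 0.5, 0.2, 0.4]   # first estimates of quarterly growth (%)
mature = [0.4,  0.1, 0.6, 0.1, 0.7]   # the same quarters once later data arrive (%)

revisions = [m - e for e, m in zip(early, mature)]

# Mean revision: do early estimates systematically run high or low?
mean_revision = sum(revisions) / len(revisions)
# Mean absolute revision: how far does a first print typically move?
mean_abs_revision = sum(abs(r) for r in revisions) / len(revisions)

print(f"mean revision: {mean_revision:+.2f} percentage points")
print(f"mean absolute revision: {mean_abs_revision:.2f} percentage points")
```

A mean revision close to zero suggests the early estimates are unbiased even if individual quarters move, while the mean absolute revision gives users a feel for the typical size of those moves.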

While this conceptual problem is well-recognised, there are barriers to public understanding. The first is the idea of an early estimate of GDP that is iterative and updated over time, which may clash with people’s perception of GDP as a fixed measurement of the economy.

Second, uncertainty can relate to very different economic circumstances. We can see this by comparing the Global Financial Crisis, COVID-19 and the cost-of-living crisis. The uncertainty of the Global Financial Crisis involved the impact of asset valuations, liquidity crunches and solvency. The uncertainty of the pandemic was largely related to a huge shift in the nature and location of work. And in the cost-of-living crisis, the uncertainty concerns capturing how individual households and firms are responding to large price changes, including in how we produce real, or volume, estimates of GDP.

In this context, I believe the OSR’s recommendations could go further. Of course, there is value in exploring how we can improve communications (recommendation 2 of the Review). However, practical experiences of communicating uncertainty suggest that it is difficult. Additionally, the recent Bernanke Review highlighted the challenges of the Bank of England’s use of fan charts to convey uncertainty. This also has implications for official statistics.
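For readers unfamiliar with them, fan charts surround a central projection with progressively wider probability bands, so that uncertainty visibly grows with the horizon. The following is a minimal, illustrative matplotlib sketch using synthetic numbers; it is not the Bank of England’s actual methodology:

```python
# Illustrative fan chart with synthetic data (not the Bank of England's
# actual fan-chart methodology).
import numpy as np
import matplotlib.pyplot as plt

horizon = np.arange(9)            # quarters ahead
central = 1.0 + 0.1 * horizon     # a made-up central projection (%)

fig, ax = plt.subplots()
# Draw bands from widest to narrowest, deepening the shade so the
# inner (more likely) region appears darkest.
for k, coverage in enumerate([0.9, 0.6, 0.3]):
    spread = coverage * 1.5 * np.sqrt(horizon)   # uncertainty widens with horizon
    ax.fill_between(horizon, central - spread, central + spread,
                    color="crimson", alpha=0.2 + 0.15 * k, linewidth=0)
ax.plot(horizon, central, color="black")
ax.set_xlabel("Quarters ahead")
ax.set_ylabel("GDP growth (%)")
ax.set_title("Illustrative fan chart")
plt.show()
```

However the bands are drawn, the chart only conveys uncertainty if readers understand what the shading represents – which echoes the difficulty the Bernanke Review highlighted.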

Undoubtedly, the most important characteristic for a statistics producer is transparency: transparency about the data sources; transparency about the methods; and transparency about what the data does and doesn’t mean. The ONS already publishes information on the revision performance of early estimates of GDP, but could do more to enhance transparency.”


What is a reasonable aim for us to have when communicating uncertainty? (Johnny Runge, King’s Policy Institute)

“While I see strong reasons to communicate uncertainty to policymakers and other expert audiences, I am conflicted about communicating uncertainty to a public, non-technical audience, at least before that uncertainty materialises and changes the economic narrative.

Evidence for communicating uncertainty:

On one hand, I can see some clear potential advantages to communicating uncertainty. There is support from online experiments on communicating data uncertainty when publishing economic estimates, including papers by David Spiegelhalter (on unemployment), Galvao and Mitchell (on GDP), and Galvao (on productivity). The two Galvao papers were ESCoE research projects.

All three papers find that communicating uncertainty can lead the public to (rightly) perceive greater uncertainty in the numbers. Moreover, the papers find no evidence to suggest that conveying uncertainty undermines trust in the numbers and the source, though it does not increase trust either.


Evidence against communicating uncertainty:

  • My research on public understanding of economic statistics recommends communicating in a simpler and more engaging way, focusing on what is relevant and interesting for the public: in short, going back to basics. Communicating uncertainty does not feel like going back to basics.
  • Online experiments are not real life. In real life, uncertainty communication may be picked up by journalists and politicians, who may cherry-pick figures from the uncertainty message: higher and lower figures could be used to paint very different pictures of the economy. Indeed, our research generally shows people are most sceptical about statistics when they perceive that the figures can be manipulated to fit a specific narrative. And emphasising uncertainty may create exactly this perception in the minds of some members of the public.
  • To investigate this further, I undertook 20 qualitative interviews. These pointed to significant nuance behind the online experiment findings: people interpret uncertainty in many different ways, including misunderstanding the communication tools themselves. On GDP, people did not care enough to have strong views on uncertainty; on the other hand, they were often very surprised by the level of uncertainty around employment.

So, what’s the way forward? I support the OSR’s recommendation for the ONS to take charge of this narrative. But I am sceptical about offering the communication of uncertainty to the wider public as a remedy to the challenges faced by statistics producers like ONS.”


Towards common ground (Ed Humpherson, Office for Statistics Regulation)

“The views within this panel session did not point to a single, clear consensus, reflecting the challenges of this complex topic.

Uncertainty is inherent to the measurement of the economy, and developing ways of communicating it is an important area of focus. This operates at two levels. First, at the basic level of how individual statistics releases are explained: ensuring that major shifts in the economic story are recognised, and that odd patterns in the data are identified and explained.

Second, statistics producers must ensure that the inherent uncertainty of GDP estimates is always conveyed, recognising that revisions may occur. This may require ongoing experimentation to identify the best communications approaches.

Even with the best communications approaches, there will still be problems. At heart, if the economic story changes, then this will always be big news and may affect public trust in economic measurement.”


ESCoE’s 2024 Conference on Economic Measurement took place on 15–17 May at Alliance Manchester Business School. The conference focused on recent research advances in economic measurement and statistics. Slides and recordings from sessions (where available) can now be viewed on the ESCoE website.

ESCoE blogs are published to further debate. Any views expressed are solely those of the author(s) and so cannot be taken to represent those of the ESCoE, its partner institutions or the Office for National Statistics.