Assessment Programme Lead, Mark Pont, discusses the importance of statisticians understanding and communicating uncertainty in their statistics, in light of our recent report exploring approaches to presenting uncertainty in the statistical system.

Uncertainty exists in so many aspects of life, and taking it into account is an important part of the decisions we make.

I recently had a hernia repaired at a hospital about an hour from where I live. Ahead of admission the hospital gave me an approximate discharge time. I needed to make plans to get home, which revolved around whether it made sense for my wife to drop me off then spend the afternoon mooching around art galleries, parks and shops. So, I needed to understand how accurate the estimate was, and what factors and assumptions might affect its accuracy. It turned out that it depended on things like where in the order for that day’s surgery I ended up, and how the surgery and my immediate recovery went. All of this was useful intel for our planning.

Later (after the op) I needed a more accurate estimate. My wife was (as planned!) mooching around art galleries, parks and shops, and we needed to try to coordinate her getting back to the hospital close to my discharge so that neither of us was left waiting around too long.

Taking uncertainty into account is also necessary when using official statistics. People who make decisions based on statistics need to factor the uncertainties around the statistics into their decision making. It’s not great to develop policy based on an assumption about the accuracy of the statistics that turns out not to be true. Statistics are rarely, if ever, absolute facts. There will always be some inherent uncertainty – from partial data collection in samples, delays in updating administrative databases, and so on. And different users may want different information about uncertainty depending on the nature of the decisions they’re faced with making and their level of expertise.

Our first Insight project considers the way that uncertainty in official statistics is communicated. We found a mixed bag of practice.

There are many cases where uncertainty is presented in some form in statistical bulletins – in the narrative, charts and infographics. Good examples include using words like “estimate” in the narrative, including error bounds in charts, and giving clear lists of ways that the statistics can and can’t be used. Projections, too, often include variants, which give a neat way of showing the effect of different assumptions.
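As a purely illustrative sketch (not drawn from our report), the snippet below shows one way error bounds might be added to a simple chart using Python’s matplotlib; the estimates and interval half-widths are made-up numbers.

```python
import matplotlib.pyplot as plt

# Illustrative (made-up) estimates with 95% confidence interval half-widths
years = [2019, 2020, 2021, 2022]
estimates = [52.1, 48.7, 50.3, 53.9]   # e.g. an estimated percentage
half_widths = [1.8, 2.4, 2.1, 1.9]     # uncertainty around each estimate

fig, ax = plt.subplots()
# errorbar draws each point with a vertical bar spanning estimate ± half-width
ax.errorbar(years, estimates, yerr=half_widths, fmt="o-", capsize=4)
ax.set_xlabel("Year")
ax.set_ylabel("Estimate (%)")
ax.set_title("Estimates shown with 95% confidence intervals")
plt.show()
```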

There are occasions, though, where estimates are presented as though they were absolute facts. Failing to acknowledge the uncertainty within them could lead users to false conclusions. There are also times where better descriptions are needed to help users take uncertainty and its effects into account appropriately. Phrases like “care needs to be taken” and “caution is needed” are widely used, but more specific wording would do more to guide appropriate use of the statistics.

We also found that the communication of uncertainty in detailed data tables (particularly where they are user-specified) is less well-developed, not least because describing uncertainty succinctly in these situations isn’t easy.

There is, however, an abundance of guidance and support available to help analysts think through uncertainty and how best to present it to users. We at OSR will continue to help improve and socialise that guidance. We will also develop our regulatory work to understand more about what effective communication of uncertainty looks like, and to encourage the spread of good practice across government data outputs. In particular, we expect to develop our thinking on how changes in quality and uncertainty over time can be most helpfully communicated.

And for those who have made it this far, the surgery went well, I’m fully recovered and my wife enjoyed her afternoon out.


Related links:

Approaches to presenting uncertainty in the statistical system