Annex B: What we did
The aim of this project is to build our understanding of how OSR can support statistics producers to improve the communication of uncertainty within their outputs. It is not our role to prescribe how statistics producers should do this but to highlight good practice where we find it and common areas of challenge. The analysis underpinning this insight project was structured in three parts:
- What guidance on communicating uncertainty is available to statistical producers?
- What recommendations has OSR made around uncertainty through casework interventions and regulatory reviews?
- What approach are statistical producers taking to describing and presenting uncertainty within their statistics?
The Winton Centre for Risk and Evidence Communication has carried out extensive work on communicating uncertainty, including its seminal paper on the topic. We reviewed this work and used an adapted form of the Winton Centre’s uncertainty framework; our approach is summarised in Annex A. We applied this framework when carrying out our compliance checks as part of our work for this review, giving us a pool of evidence about how the uncertainty of official statistics is communicated. As part of our ongoing regulatory work, we now formally record the way that the uncertainty of official statistics is communicated.
As part of this project, we reviewed the existing Communicating Quality, Uncertainty and Change guidance for the GSS. This guidance aims to support producers in providing assurance to users on these three areas, explaining complex concepts while being clear and transparent about professional judgements. We also explored what other guidance is available across the wider Government Analysis Function and found the Uncertainty Toolkit website, an analyst’s guide to dealing with uncertainty that forms part of the Aqua Book resources, to be particularly useful.
To analyse what recommendations OSR has previously made concerning uncertainty, we used a combination of web-scraping and database interrogation to search for specific terms related to uncertainty. The web-scraping was carried out on OSR’s website to review all published correspondence containing these search terms. We then searched our internal casework database for the same terms within emails from correspondents and our responses to them. Because published correspondence and casework responses in the database can overlap, we manually reviewed the flagged cases to remove duplicates. A limitation of this analysis is the choice of search terms: relevant correspondence and publications that did not contain them will have been missed.
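The search-and-deduplicate step described above can be sketched in outline. This is a minimal illustration only: the search terms, record structure and sample data below are assumptions for the sake of the example, not OSR’s actual term list, tooling or casework data.

```python
import re

# Illustrative search terms only; the actual term list used is not specified here.
SEARCH_TERMS = ["uncertainty", "confidence interval", "margin of error", "sampling error"]

# Hypothetical records as (source, identifier, text). In practice, one set would
# come from web-scraped published correspondence and the other from the casework database.
records = [
    ("web", "letter-2021-03", "We note the confidence interval around the estimate."),
    ("casework", "case-104", "We note the confidence interval around the estimate."),
    ("casework", "case-211", "The release does not discuss sampling error."),
]

def matching_terms(text, terms=SEARCH_TERMS):
    """Return the search terms found in a piece of text (case-insensitive)."""
    return [t for t in terms if re.search(re.escape(t), text, re.IGNORECASE)]

# Flag any record containing at least one search term.
flagged = [r for r in records if matching_terms(r[2])]

# Remove duplicates where the same text appears in both sources, keeping the
# first occurrence (a stand-in for the manual review step described above).
seen, deduplicated = set(), []
for source, ident, text in flagged:
    key = text.strip().lower()
    if key not in seen:
        seen.add(key)
        deduplicated.append((source, ident))

print(deduplicated)
```

In this sketch the duplicate casework record is dropped automatically; in the actual analysis, flagged cases were reviewed manually precisely because real correspondence rarely matches word for word.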
For the final area of analysis, we organised an OSR-wide session to review a range of statistical publications for their presentation of uncertainty. We selected 80 sets of National Statistics at random. Members of OSR were asked to review each set of statistics, including the bulletin, tables, methodology documents and any other related outputs, against the Winton Centre’s scoring criteria highlighted in the introduction. As we did not use a formal sampling approach in selecting the statistics to review, our findings are indicative only and may not fully reflect an individual producer’s, or the wider statistical system’s, approach to communicating uncertainty.
Using the outcomes from these three parts of the analysis, the project team brought the findings together to identify common themes in how uncertainty is communicated within the statistical system. The project team then discussed the findings with the DQHub and the Winton Centre to help form recommendations for next steps.