This publication details the underlying research behind our think piece, Statistical literacy: It's all in the communication.
The ability to understand and evaluate statistical information is crucial for navigating everyday life, and the consequences of poor statistical comprehension are wide-ranging.
This report presents the findings from a literature review commissioned by the Office for Statistics Regulation (OSR) to establish the current landscape of statistical literacy research. The review provides a broad overview of the literature on statistical literacy and is not designed to be systematic or exhaustive.
About the Author
This research review was written by Jessica McMaster, a third-year PhD student studying Psychology at the University of Cambridge. The project was commissioned by the Office for Statistics Regulation (OSR) as part of the Cambridge Grand Challenges (CGC), which is a framework for collaboration between industry, government, and academia, designed to help deliver the UK’s Industrial and Innovation Strategy.
Defining Statistical Literacy
The term statistical literacy lacks a precise definition, which may cause difficulties both in establishing current levels of statistical literacy within the general public and in implementing strategies to improve it.
We found no consensus across definitions of statistical literacy, though there were some commonly used components, including foundational abilities, knowledge of statistical concepts, and critical thinking.
Differences between definitions are driven by contextual factors, and a definition broad enough to apply to all contexts may lose utility. An alternative approach for defining statistical literacy may be to consider first the context where the definition will be applied and then to specify the components required for statistical literacy in that context.
The majority of the evidence we identified related to the knowledge components linked to statistical literacy (e.g. literacy and numeracy) rather than statistical literacy directly.
We observed considerable variability among the general public in the skills linked to statistical literacy. Studies often found that skill level was influenced substantially by demographic factors such as age, gender, and education. More recent evidence is needed, and we found no evidence that aimed to capture the multidimensional nature of statistical literacy.
We identified multiple recommendations on how best to communicate statistics to non-specialist audiences. The evidence for these recommendations ranged from smaller-scale research studies to large-scale surveys and guidance from statistical bodies. The strength of the evidence base should be considered when deciding whether to apply these recommendations to statistical communication, with more weight given to recommendations where the study sample matches the target audience of the communication and where a finding has been replicated across multiple studies.
The examples of relevant OSR correspondence were largely aligned with the findings of the review, although relevant correspondence could not be found for all topics.
We identified a variety of initiatives in the statistical literacy space, differing in location, target audience, and overarching aims. Many of the programmes may no longer be active, and it may be beneficial to gain further information about these in particular. None of the work identified during the review aimed to connect these different initiatives, so sharing learnings across these groups may be a fruitful future endeavour.
This research was commissioned to shape OSR’s future work on statistical literacy. The findings from the review provide an evidence base that will support OSR in developing its public position on statistical literacy and guide future regulatory work.