Knowledge

There is limited evidence on the current level of statistical knowledge in the adult general population. As identified in the previous section, statistical literacy is also defined by several foundational abilities for which there is more evidence; in particular, literacy and numeracy skills, which have been assessed in multiple large-scale surveys of adult skills. This section summarises evidence on how researchers have attempted to measure components related to statistical literacy, covering foundational skills, knowledge of statistical concepts, and contextual knowledge, as well as revisiting the statistical literacy measures from the previous section.

3.1 Foundational Skills

The Skills for Life Survey (SfL2011), commissioned in 2011 by the Department for Business, Innovation and Skills, evaluated the basic skill levels of adults aged 16-65 years in England (Department for Business Innovation & Skills, 2012). Respondents were recruited to be representative of 16-65-year-olds in England based on the proportions within each age band, ethnic group, and gender, as well as the relative numbers who were disabled, employed, or out of the labour market.

The skills assessed in SfL2011 included literacy, numeracy, and ICT skills, each of which was identified in the previous section as foundational to achieving statistical literacy. Numeracy performance was assessed in 5,823 individuals, each of whom was subsequently assigned a skill level ranging from “Entry level 1 or below” to “Level 2+”. Of those surveyed, 49.1% were at Entry level 3 or below for numeracy; Entry level 3 is the national school curriculum equivalent of attainment at ages 9-11. Performance was higher overall in literacy, with just 14.9% of the 5,824 adults surveyed scoring at Entry level 3 or below. Three practical ICT skill areas were also assessed in over 2,220 respondents: word processing, email, and spreadsheets. Respondents performed worst in spreadsheets, with 66% scoring at Entry level 3 or below, compared with 59% for word processing and 40% for email.

Individual-level variation in skill level could be partly explained by demographic factors. For example, older adults performed considerably worse than younger adults in ICT, and females performed worse overall than males in numeracy. Other influential factors included first language, socioeconomic status, health, and ethnicity. The survey analysis also linked assessment performance to everyday activities. For example, respondents who rated their own numeracy as weak reported avoiding checking their bills and bank statements, and full-time workers had better numeracy than part-time workers.

The sample for this survey was large and designed to be representative of the population in England. The literacy and numeracy assessments had also been carried out in 2003, allowing group-level changes in skills to be observed across that period. Overall, the results indicated widely varying adult skill levels within the general population, with individual differences partly driven by demographic factors. A notable proportion of respondents scored at Entry level 3 or below in numeracy, which is likely to impact their ability to engage with statistical information in their everyday lives.

Foundational abilities were also assessed in the Programme for the International Assessment of Adult Competencies (PIAAC). This is an international survey carried out every ten years in over 40 countries, involving interviews with 5,000 adults aged 16-65 years in each participating country (Department for Business, Innovation & Skills, 2013). The skills assessed are literacy, numeracy, and problem solving in technology-rich environments. The survey has had two cycles of data collection so far, with the results of the second cycle due to be published in 2024. Results from the first cycle, published in 2012, showed that a substantial percentage of surveyed adults in most countries performed at the lowest levels of numeracy (8-32%) and literacy (5-28%). In many of the countries surveyed, a high proportion of participants had little experience with ICT, and only 5.1% of adults, on average across countries, reached the highest skill level for problem solving in technology-rich environments.

The mean literacy proficiency score of respondents in England was not significantly different from the average of the other participating countries, whereas the mean numeracy score was significantly below average. Around 10% of respondents in England reported a lack of experience or a lack of basic skills with computers. Consistent with the results of SfL2011, there is great variability within the general population in the foundational skills required to achieve statistical literacy. This highlights the importance of tailoring statistical communication to different skill levels to reach a wider audience, which is discussed further in later sections of the report.

More recent evidence on numeracy skills in the general population was collected by the Financial Conduct Authority in its 2020 Financial Lives Survey (Financial Conduct Authority, 2021). This nationally representative survey is conducted approximately every two years, with the results of the first wave published in 2017. The 2020 survey involved around 16,000 interviews and, as part of investigations into factors related to consumer vulnerability, collected data on numeracy skills related to financial concepts. The results indicated that 34% of participants had poor or low levels of numeracy in this area. The assessment was based on just three multiple-choice questions (Figure 2), with low numeracy indicated by answering no more than one question correctly. Nevertheless, the survey provides up-to-date evidence that the numeracy skills required in everyday life may be lacking within the general population, contributing to widespread consumer vulnerability. Action to improve communication directed at consumers with varying skill levels has recently been taken by the Plain Numbers Project (2021), which is discussed further in the “Action” section of this report.

Numeracy questions related to financial concepts:

Num1. Suppose you put £100 into a savings account with a guaranteed interest rate of 2% per year. There are no fees or tax to pay. You don’t make any further payments into this account and you don’t withdraw any money. How much would be in the account at the end of the first year, once the interest payment is made?

Num2. And how much would be in the account at the end of five years (remembering that there are no fees or deductions)?

  1. More than £110;
  2. Exactly £110;
  3. Less than £110;
  4. It is impossible to tell from the information given;
  5. Don’t know

Num3. If the inflation rate is 5% and the interest rate you get on your savings is 3%, will your savings have more, less or the same amount of buying power in a year’s time?

  1. More;
  2. The same;
  3. Less;
  4. Don’t know

Figure 2. Numeracy questions used in the 2020 Financial Lives Survey (Financial Conduct Authority, 2021)
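
For reference, the intended answers can be checked with a short calculation. The snippet below is our own Python illustration, not part of the survey materials; it works through each question in turn.

    # Worked answers to the three numeracy questions above (illustrative only).
    year_one = 100 * 1.02          # Num1: £100 plus 2% interest = £102.00
    year_five = 100 * 1.02 ** 5    # Num2: compounding gives ~£110.41, i.e. "More than £110"
    real_change = 1.03 / 1.05 - 1  # Num3: about -1.9%, so buying power falls ("Less")
    print(f"Num1: £{year_one:.2f}  Num2: £{year_five:.2f}  Num3: {real_change:.1%}")

The compounding in Num2 is the step most easily missed: simple interest would give exactly £110 after five years, whereas compound interest gives slightly more.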

Research on numeracy was also conducted by Ipsos MORI, in partnership with National Numeracy and the Policy Institute at King’s College London, to mark National Numeracy Day in 2019 (National Numeracy, 2019). This involved an online survey completed by 2,007 adults aged 16-75 years in the UK, which included five multiple-choice numeracy questions designed to be roughly equivalent to a GCSE maths paper. The data from this survey were weighted to reflect the national population profile (a technique sketched below). The results indicated that 20% of the population answered 4 or 5 of the questions correctly, which was described as roughly equivalent to a GCSE pass grade. The authors state that these results are in line with the numeracy findings from the aforementioned SfL2011 (Department for Business Innovation & Skills, 2012). Those in older age groups also tended to score higher than younger people.
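
Weighting of this kind typically rescales each respondent’s contribution so that the weighted sample matches known population shares. A minimal post-stratification sketch follows; the age bands, shares, and counts are hypothetical and do not reproduce the survey’s actual weighting scheme.

    # Minimal post-stratification sketch with hypothetical figures; the survey's
    # actual weighting variables and method are not reproduced here.
    population_share = {"16-34": 0.32, "35-54": 0.35, "55-75": 0.33}  # assumed shares
    sample_counts = {"16-34": 520, "35-54": 780, "55-75": 707}        # assumed counts
    n = sum(sample_counts.values())  # 2,007 respondents in this illustration
    weights = {g: population_share[g] / (sample_counts[g] / n) for g in sample_counts}
    print(weights)

Each respondent in group g then counts weights[g] times in weighted estimates, so over-represented groups are down-weighted and under-represented groups up-weighted.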

3.2 Statistical Knowledge

One survey which did incorporate items around statistical information is the OECD’s Programme for International Student Assessment (PISA). This survey has had multiple cycles, the first in 2000. The 2012 cycle involved data from around 510,000 students aged 15-16 years across 65 countries, and is the most recent cycle to include data on statistical knowledge in the United Kingdom. The results indicated that the United Kingdom performed around the average in mathematics compared with the other participating countries (OECD, 2012).

In particular, the assessment area “uncertainty and data” tests knowledge of statistics and probability. This content category is described as capturing the ability to detect and summarise messages within data, as well as understanding the impact of variability. Of the four mathematics content categories assessed (“uncertainty and data”, “space and shape”, “quantity”, and “change and relationships”), the mean performance of respondents in England was highest for “uncertainty and data”. It was concluded that pupils in England performed relatively strongly on the assessment questions related to probability and statistics. Across the wider United Kingdom, all four countries performed higher on the “uncertainty and data” subscale than on their overall mathematics score, while Wales scored lower than the other UK countries on all four mathematical subscales. Although this survey tested pupils aged 15-16 years, it could be argued that the results are indicative of a large proportion of adults’ statistical knowledge, as many do not complete further mathematics-based education beyond this point; this review does not, however, consider evidence on the retention of knowledge over time. OSR evaluated reports of PISA data against the Code of Practice for Statistics in 2021. The findings included several recommendations, predominantly around providing more information about potential limitations and sources of bias in the data, which are expected to be addressed in future reports of PISA findings.

Public understanding of statistics was also captured in a 2020 report published by the Economic Statistics Centre of Excellence (ESCoE), titled “Public Understanding of Economics and Economic Statistics”. As the title indicates, this work focused on economic statistics in particular. The research, involving 12 focus groups and an online survey of UK adults, explored understanding in areas such as inflation, unemployment, and gross domestic product (GDP) (Runge & Hudson, 2020). Overall, the results indicated that public understanding is greatest in areas perceived to have higher personal relevance, such as inflation and interest rates, rather than GDP. Analysis of demographic differences suggested that older male participants of higher educational attainment and socioeconomic status were more knowledgeable, confident, and interested in economic statistics. Economists highlighted the importance of engaging with the public to gain insight that would improve their methods of communication; this was addressed in a follow-up study discussed in the next section of the report (Runge & Killick, 2021).

The report outlining this work also included a literature review on public understanding of economic statistics. The review distinguished between ‘top-down’ approaches to studying public understanding, whereby understanding of economic concepts is tested against the definitions applied by economists, and ‘bottom-up’ approaches, where participants are asked open-ended questions to gain insight into how the public describes its own understanding of economic concepts. The ‘top-down’ approach may fail to capture understanding held by the public that does not fit the set definitions used. One of the main findings of the bottom-up literature reviewed was that people often understood the economy through metaphors (e.g. as a machine). The use of metaphors may lead the public to simplify and misunderstand complex topics, and people may become erroneously confident in their understanding by assuming that the economy will follow the behaviour of the metaphor they use (Runge & Hudson, 2020).

3.3 Contextual Knowledge

Understanding and interpreting statistics involves not only skill-based knowledge, such as the components captured above, but also wider contextual knowledge to put statistics into perspective. This type of knowledge was assessed in a large-scale study of Europeans’ knowledge of official economic statistics (Vicente & López, 2017). Data were collected as part of Eurobarometer 83.3, carried out on behalf of the European Commission, covering around 28,000 individuals aged 15 years and over in 28 EU Member States.

Respondents were asked questions about economic statistics in their own country (e.g., “Do you think that, in your country, the inflation rate in 2014 was higher, lower or equal to the rate in 2013?”). Around a quarter of people in the United Kingdom responded “don’t know” for each of the official figures covered: the growth rate, the inflation rate, and the unemployment rate. This suggests that a substantial proportion of the population may lack the contextual knowledge needed to put economic statistics into perspective.

The results of this study highlight the importance of including relevant contextual information in statistical communication to engage a wide-ranging audience with different knowledge levels.

3.4 Statistical Literacy

This section of the report has so far captured evidence on how researchers have attempted to measure the general public’s knowledge of topics identified in the first section of the report as linked to statistical literacy. As previously mentioned, researchers have also developed tools to measure statistical literacy as a whole. This section summarises instances in which these tools have been applied to non-specialist audiences.

The Statistical Literacy Indicator, introduced in the previous section, was applied to evaluate the use of statistics in national newspapers (PARIS21, 2021). The results for the United Kingdom indicated that 18.8% of newspaper articles engaged with statistics at Level 1 (non-critical) in 2020, higher than in previous years (from 2016 onwards), when the percentage was typically between 10% and 13%. Only 0.3% of articles contained text classed as Level 2 (critical) engagement with statistics, lower than in previous years, when the percentage ranged from 1.1% to 1.5%. A similar pattern was observed for Level 3 (critical mathematical), where the percentage fell to 0% in 2020, having ranged from 0.9% to 1.9% in previous years.

Overall, the results suggest that mentions of statistics in newspaper articles were more frequent in 2020, at the beginning of the COVID-19 pandemic, while the use of critical language related to statistics did not rise from previous years and remained at a very low level overall. However, these results are described as preliminary, pending checking by analysts at National Statistical Offices (NSOs).
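
To give a sense of how text might be screened for the three engagement levels, the sketch below uses simple keyword matching with invented term lists; the indicator’s actual methodology, including the NSO analyst checks mentioned above, is more sophisticated and is not reproduced here.

    # Hypothetical keyword-based classifier for the three engagement levels;
    # the term lists are invented for illustration, not taken from PARIS21.
    LEVEL_TERMS = {
        3: ["margin of error", "confidence interval", "statistically significant"],
        2: ["sample size", "methodology", "source of the data", "bias"],
        1: ["per cent", "percent", "average", "rate", "survey"],
    }

    def classify(text):
        """Return the highest engagement level whose terms appear in the text, else 0."""
        lowered = text.lower()
        for level in (3, 2, 1):
            if any(term in lowered for term in LEVEL_TERMS[level]):
                return level
        return 0

    print(classify("A survey of 2,000 adults found 40 per cent agreed."))       # -> 1
    print(classify("But the small sample size makes the estimate uncertain."))  # -> 2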

When establishing the REALI assessment, data were collected from 671 students at the graduate or undergraduate level (Sabbag et al., 2018). As previously mentioned, this assessment distinguishes between statistical literacy and statistical reasoning. The mean scores out of 20 for the statistical literacy and statistical reasoning subscales were 13.15 and 11.01, respectively. The knowledge of students at college or university is unlikely to be representative of the general population; these data were collected to evaluate the questionnaire in students, the target audience for this tool, and so meet that aim but are less applicable to the wider population.

This limitation also applies to the data from the BLIS assessment (Ziegler & Garfield, 2018), which was administered to 940 college students who scored on average 21.41 out of 36. These data were collected to evaluate the assessment for students, and the results do not apply to the wider public. Neither the REALI nor the BLIS assessment has, as far as was identified in this review, been applied outside a student population. Based on the conclusions of the previous report section, it is likely that different areas of interest would arise when assessing statistical literacy in adults, as these two education-based assessments appear to focus predominantly on knowledge of statistical concepts.
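
Because the REALI subscales and BLIS use different maximum scores, rescaling the reported means to proportions correct makes them directly comparable; the snippet below simply reworks the figures quoted above.

    # Mean raw scores rescaled to proportion correct for comparability.
    scores = {
        "REALI statistical literacy": (13.15, 20),
        "REALI statistical reasoning": (11.01, 20),
        "BLIS": (21.41, 36),
    }
    for name, (mean, maximum) in scores.items():
        print(f"{name}: {mean / maximum:.0%}")  # approx. 66%, 55%, and 59%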

Statistical literacy was also assessed in an international survey conducted by the W. M. Keck Statistical Literacy Project in 2002 (Schield, 2006b; described further in the “Action” section). This survey focused on the skill of reading graphs and tables of rates and percentages in 191 participants comprising US college students, college teachers (worldwide), and data analysts in the US and South Africa. The results indicated high error rates. For example, on a question comparing two percentages, error rates were high and similar among students (82%) and college teachers (81%), while data analysts performed better (60%). Although described as a statistical literacy survey, it focused only on the ability to read graphs and tables of rates and percentages, and it was conducted in a relatively small sample twenty years ago. The findings should therefore be interpreted with caution and not generalised to the current population.

3.5 Other knowledge

The previous section of this report (“Definitions”) highlighted that a wide range of concepts is linked to statistical literacy, particularly when considering official statistics. These included the ability to access statistical reports and wider knowledge of the statistical system.

Research commissioned by the UK Statistics Authority on public confidence in official statistics was published in 2022 (Butt et al., 2022). The results are based on a survey, conducted by the National Centre for Social Research (NatCen), of adults aged 18 years and over, recruited to be representative of adults in England, Wales and Scotland. This survey has been conducted regularly since 2004. The results of the 2021 survey showed that 75% of respondents (excluding “don’t know” and non-responses) had heard of the Office for National Statistics (ONS) and 36% reported using ONS statistics. Furthermore, 64% of respondents thought official statistics were easy to find and 67% thought they were easy to understand. People were also asked how well they knew the UK Statistics Authority (the Authority) and the Office for Statistics Regulation (OSR): 48% of respondents had heard of the Authority (2% knew it well) and 41% had heard of OSR (2% knew it well). Overall, the survey indicates some awareness of statistical bodies in the UK, and that the majority of respondents agreed that official statistics are easy to find and understand, although a large minority disagreed. People with higher levels of education were more likely to agree, as were people aged 35-44, whereas people aged over 65 were more likely to disagree.

3.6 Summary

This section of the review has shown great variability among the general public in the skills linked to statistical literacy. Studies often found that skill level was influenced substantially by demographic factors such as age, gender, and education. More up-to-date, large-scale surveys are now required to provide ongoing insight into the knowledge level of the population. No evidence was found that aimed to capture the multidimensional nature of statistical literacy; this could be due to the breadth of statistical literacy, which makes it difficult to define, as identified in the previous section of this report (“Definitions”). Overall, this section highlights the importance of generating statistical information that can be understood by audiences with varying skill levels. The factors to consider when developing statistical communication, to achieve greater understanding and engagement, are explored in the next section of this report.
