4. Facilitators and barriers
Having established that official statistics have important and widely used applications across the policy lifecycle, we now turn to some of the facilitators of, and barriers to, their use in policy that have been discussed in the literature. Many of the papers reviewed gave fascinating insight into a range of factors that may either facilitate the use of official statistics within UK policy or act as potential barriers to their use. These factors may be broadly categorised under two themes:
- People
  - Capacity and capability: whether policy team members can effectively analyse, use and interpret official statistics.
  - Collaboration: whether people who have access and capability collaborate with people who benefit from the insights that statistics provide.
- Production
  - User engagement: whether user needs are identified, and an effort is made to promote awareness of what statistics are available.
  - Relevance: whether official statistics address relevant topics and are measured in a way that users require.
  - Frequency and timeliness: whether official statistics are produced at appropriate intervals and in a timely manner.
  - Accessibility: whether users can easily find and view statistics, and the extent to which they can relate to them.
Many of the factors described in this section of the literature review are intertwined. For example, capacity and capability barriers can be addressed with collaboration; collaboration can be a route to improved user engagement; and understanding user need through engagement can support meeting user need. As such, despite being presented as discrete sections, similar themes are repeated across multiple areas.
4.1 People
In this literature review, people factors are characteristics and actions tied to the individuals, teams and organisations involved in producing and using official statistics, which can influence their use in policy. These factors have been brought together to demonstrate the importance of people in ensuring statistics meet their potential to inform policy.
4.1.1 Capacity and capability
The capacity for policy team members to effectively analyse, use and interpret official statistics is seen across the literature as an important factor in statistics being used in policy. This factor can act as a facilitator or barrier, depending on the alignment between the required and available capacity and capability. While well-resourced and appropriately skilled team members are seen within the literature to facilitate the use of official statistics in policy, a lack of capacity or capability in interpreting statistics is highlighted as a clear barrier. It is important that policy teams readily understand some of the implications and limitations of official statistics and the methods used to produce them.
The issue of capacity and capability was frequently discussed in the papers assessed as part of this literature review. In one such paper, Killick et al. (2016) explored the associations between census data and policymaking. Although capacity and capability were not the focus of the paper, it provided useful insights from interviews with a selection of policymakers in Scotland. The authors noted from these interviews that statistics from the census were not used very much in policy. One of the potential reasons for this suggested by the interviewees was a lack of well-developed information skills enabling individuals to effectively extract value from information sources within policymaking. As this has also been observed previously (for example, by Forbes and Keegan, 2016; Greyson, Cunningham and Morgan, 2012; Foreman and Thomson, 2009; and Forbes, 2009), the authors suggested that exploring the statistical capability of policymakers would be a fruitful avenue for further research as it has received little research attention to date.
Further evidence of the importance of capacity and capability was demonstrated by Oliver and de Vocht (2017), who surveyed health policymakers about their use of evidence. The authors reported a lack of capacity for analysis, especially in relation to local data. For example, the policymakers wanted assistance with more sophisticated local data analysis to better understand how to use local data to model inter-group care pathways and the interplay between these and other local factors. Indeed, 31% of the policymakers interviewed discussed a need for improved interpretation of existing data, expressing a clear need for reliable experts to explain the importance of reports and interpret data to enhance understanding. These comments reinforce the conclusion that a lack of capacity and capability for analysis among those developing policy may be a barrier to the use of data in this process. One approach to resolving this is explored in the next section of our literature review, on collaboration.
This section has considered the effects of capacity and capability in using official statistics on public policy. Well-resourced and appropriately skilled staff facilitated the use of official statistics in policy, whereas a lack of capacity and capability to interpret statistics acted as a barrier to this.
4.1.2 Collaboration
Another factor that can either facilitate or act as a barrier to the use of official statistics in policy is the level of collaboration between people who have access and capability and people who might benefit from the insights that statistics provide. Backer (2003) broadly defined collaboration as bringing together multiple organisations to accomplish some form of systems change. This view is enhanced by Spoth and Greenberg (2011), who suggested that collaboration can be useful for focusing attention on an issue and weaving the efforts of separate organisations together.
Strong collaboration can break down barriers related to capability and capacity. This is seen in the paper by Oliver and de Vocht (2017), where policymakers requested assistance from analysts to support sophisticated analysis that they did not have the capability to perform. Collaboration between policymakers and analysts could facilitate such analysis. This is echoed by Holt (2008), who expressed hope that, in future, statisticians would be empowered to offer professional advice to policymakers, making them aware of some of the pitfalls of using poorly chosen statistics as performance indicators and concentrating unduly on them.
Collaboration between analysts and politicians could address capability issues. A need for this collaboration is highlighted by Allin (2017), who proposed that there may be limited statistical support available for politicians. As policymakers put the will of politicians into practice, if official statistics are to be used in policy, it is crucial for politicians to also have appropriate support. Despite reflecting on an overall lack of statistical support, Allin (2017) highlighted the Research and Information Service within the UK Parliament as a useful existing support system, reinforcing the view that statistics use is facilitated when statistical experts work closely with those at the heart of policymaking.
Collaboration is beneficial beyond addressing capability and capacity deficits; OSR, in its report on analytical leadership (2024), also recognised the value-adding properties of organisational collaboration for analytical leadership. The authors suggested that collaboration can help government to identify the key analytical questions for policy and bring the data and evidence together to help answer them. OSR outlined a range of case studies in support of this assertion. For instance, Public Health Scotland (PHS, 2023), a new public health body comprising multiple professions, was conceived during the COVID-19 pandemic. By working alongside data managers, epidemiologists, academia, policy and Scottish Government officials, PHS statisticians could foster a common understanding of the value of PHS statistics and data. This collaborative approach resulted in valuable, high-quality and coherent data and statistics that proved to be central to informing government decisions (such as providing clear insights about the interpretation of changes to the measurement of COVID-19 infections).
Collaboration therefore facilitates the use of statistics in policy through multiple avenues. In considering what underpins successful collaboration, appropriate funding has been deemed a significant factor, such as in another case study outlined by OSR (2024). ONS, the Department of Health and Social Care and the Department for Work and Pensions linked census data with primary care and welfare benefits data to identify deprivation and ethnic differences in benefit recipients before and during the COVID-19 pandemic. This work was funded by HM Treasury’s Shared Outcomes Fund, which was created to incentivise government departments to work collaboratively across challenging policy domains (see HM Treasury, 2021). This example demonstrates how appropriate funding can support collaboration, which subsequently facilitates statistics use in policy.
Successful collaboration was also described by Jenkins (2017), in a paper which observed collaboration between ONS staff, representatives of central government departments, academics and think tanks during the National Wellbeing Measurement Programme. Jenkins argued that panel meetings, circulated documents and formal or informal communications enabled interactions and ultimately helped effectively analyse and interpret statistical information through the lens of policy. This example highlights practical activities which underpin successful collaboration in ensuring relevant analysis and therefore may facilitate the use of official statistics in policy.
4.2 Production
After exploring factors related to people and how these can act as facilitators of or barriers to statistics being used in policy, this literature review now moves to describing elements of the statistical production process that influence success in this area.
4.2.1 User engagement
User engagement that facilitates statistics use in policy can be broad, including both engagement with direct statistics users and engagement with the public, who are affected by policies. User engagement could inform policy by:
- identifying what users’ needs are, so they can be met with statistics
- making people aware of what statistics are available, so they can be used in policy
- building confidence in the statistics
The significance of user engagement was outlined in Lehtonen’s (2013) paper, which attributed the limited uptake of a range of environmental sustainability indicators (ESIs) to poor collaboration between academics, advocacy groups and regional policymakers. Lehtonen suggested that this lack of collaboration and partnership meant that user needs were poorly understood. This sentiment was echoed by the UK Statistics Authority’s assessment of the ESIs (UKSA, 2009), which specifically highlighted the lack of user engagement as a major obstacle preventing the ESIs from acquiring the status of accredited official statistics (previously called National Statistics). Many interviewees emphasised that the ESIs were only a relatively minor part of the evidence base underpinning energy policy.
While only providing a brief outline of how user engagement can act as a barrier or a facilitator, this section demonstrates its importance. Effective user engagement is also integral to the additional production factors discussed in more detail below.
4.2.2 Relevance
To facilitate official statistics’ use and impact on policy, they need to be relevant to their users. In other words, official statistics producers should ensure the statistics are produced on relevant topics and that they are measured in the way that users require. As noted by Lehtonen (2013), user needs may be poorly understood when user engagement is flawed (see section 4.2.1), which may lead to a lack of consideration of official statistics by policymakers. This highlights the importance of properly understanding and responding to user need.
4.2.2.1 Ensuring statistics are produced on relevant topics
User need remains unmet when statistics are not detailed enough to provide useful insight, or when they do not exist at all on the topics where users have a need for them. For example, Hodder and Mustchin (2024) discussed the increasing levels of strike activity in the UK and how policymakers usually turn to official data on labour disputes to help clarify the situation. However, the authors suggested that a lack of relevant official statistics leads to a reliance on union-kept strike databases, which vary in their completeness. Hodder and Mustchin suggested that, whether they come from official or non-official sources, more data are needed to assess the incidence, extent and causes of strikes in order to make effective changes to policy. Similarly, many responses from end users who participated in the Welsh Government (2014) consultation on its statistics described how they would like data enabling them to evaluate specific policies, but that such data simply do not exist, thus undermining their ability to monitor policy concerns. These examples demonstrate that a lack of relevant statistics can inhibit their potential to inform any stage of the policy process.
The converse is also true: just as user needs without corresponding statistics hinder statistics’ impact on policy, so too do statistics which are produced with no corresponding user need. This is illustrated by Corlet, Druckman and Cattaneo (2020), who described limited evidence of the widespread use of the indicators produced by the Measuring National Wellbeing (MNW) programme in driving UK policy. The authors argued that the MNW programme was explicitly focused on measuring wellbeing, with no clear commitments being made about how the measures would be used and by whom. The authors pointed to only a handful of concrete examples of the use of the MNW indicators to assess a specific policy problem (such as for the assessment of airport schemes – PricewaterhouseCoopers, PwC, 2014). As such, if statistics are not produced with the goal of being relevant to policy users, this is unlikely to be an area in which they are used.
4.2.2.2 Ensuring statistics measure concepts in the way that users require
Statistics may also fail to meet user need if the concept they measure does not align with user expectations. This is seen in the literature on the Measuring National Wellbeing (MNW) programme. Oman (2020) described how, following consultation on the first iteration of the MNW, many responses referred to the arts specifically (ONS, 2012), with the most commonly requested additions being measures to reflect arts, culture and sport (Self and Randell, 2013). However, the second iteration of the MNW continued to exclude a measure for arts and culture. Oman concluded that ONS statistics did not fully capture what matters for wellbeing based on the broader debate, and that this affected the evidence base for policy, as only a selective evidence base was used to shape policy discussions.
Furthermore, although ONS talked about engagement with policy departments during the development of the programme (see Matheson, 2011), Corlet et al. (2020) noted that the contents of this engagement went undocumented in the ONS archives, meaning the wider community did not have access to these insights. This example demonstrates how not meeting user needs in terms of the concept collected (in this case not capturing wellbeing as broadly as users felt it should be captured) may undermine the value that statistics can provide to the policies they inform and that methodological transparency is essential in this process.
4.2.3 Frequency and timeliness
Even if statistics are produced on topics of interest, this does not necessarily mean they meet user needs. The frequency with which statistics are released is also important. Allin (2021) cited issues with using the census for policymaking, as decision makers usually need census-type information much more frequently than every 10 years to fully understand the changing nature of the UK population and migration to and from the UK. These are often ‘hot’ political issues for which data need to be published more frequently to be useful and relevant in a continuously changing society. Infrequently released statistics can mean that, although statistics exist, they are unable to accurately inform policies, as the insights they convey may be outdated.
Further evidence of how unmet needs act as a barrier comes from the Welsh Government (2014), which published a consultation with end users of its official statistics, many of whom were involved in policy. Many responses pointed to the importance of the timeliness of these data for policy decisions. This evidence showed that statistics users felt more timely insights could better support policy decisions. For example, timely economic modelling was proposed as a way to enable better-informed policy decisions and more effective economic interventions. While users may not always prioritise timeliness, when it is an identified need, it has the potential to significantly affect how impactful statistics are for policy.
Offering a slightly different perspective on the importance of timeliness, some authors outside the UK have considered the use of additional data sources to help improve timeliness. For instance, several papers (Ahas, 2010; Saluveer and Ahas, 2010; Tiru, 2014) have outlined the use of mobile phone network data in various official statistics offices throughout the European Union. De Jonge et al. (2012), for example, showed that it was possible to use such data to draw relatively timely conclusions about the locations and movement flows of the population within a small area. This shows how, in some situations, new data sources may contribute to timely data and therefore help better meet user needs.
4.2.4 Accessibility
A final consideration concerns how accessible statistics need to be, and to whom. As explored in the section on capacity and capability, there are instances where statistics and data in their current form are not accessible for non-analysts (such as those developing policy) to use. Several papers outline the importance of technology to present data in a more accessible format, which may facilitate the use of official statistics in policy activities.
The use of geospatial analysis was outlined by MacDonald et al. (2023) as a method for optimising coverage of COVID-19 test sites. Digital dashboards presenting data in an accessible format are also becoming more widespread (such as Public Health Scotland’s dashboard showing daily COVID-19 cases in Scotland – see PHS, 2023). Hill O’Connor et al. (2023, p.8) also highlighted the importance of using digital dashboards to track changes and assess progress. In this study, one interviewee described data dashboards as the ‘new bible’ within Greater Manchester, finding real value in having a dashboard of important metrics available.
Overall, evidence considered in this section of our literature review shows how there are a range of user needs, which, if not identified and addressed, can compromise the potential for official statistics to benefit policy.