Key decision points

This section discusses approaches to public involvement and engagement at four key points in statistics production.


What statistics should producers prioritise?

Involving the public as you are thinking about developing new statistics or refining existing statistics can help to:

  • better meet the needs of people who will ultimately use or be impacted by your statistics
  • support better participation in surveys or other primary data collection
  • ensure that proposed data collection, reuse of data, or use of administrative and operational data is publicly acceptable
  • demonstrate trustworthiness through transparency
  • ensure the delivery of value for money by maximising the value of statistics and reducing the costs of collection errors, data cleaning and respondent follow-up

You should proactively engage: statistics users; those the statistics are about and whose data is used; and those impacted by the use of the statistics. Below is a list of some of the kinds of questions you might ask and methods you could use to ask them.

Example questions:

  • What do you see as the benefits of having these statistics? How might you use them?
  • What would be lost if these statistics weren’t published? What would you use instead, if you didn’t have them?
  • How comfortable or uncomfortable are you about these statistics being published? What do you see as the risks around how these statistics might be used or misused?
  • What problems do you see (if any) with reusing this (e.g. administrative) data to create these statistics?
  • What would make this survey quicker and easier for people to complete?
  • In your opinion, are the benefits of these statistics worth the costs and risks? Is this something the public sector should be doing?

Example methods:

  • Standing advisory panels that include general public members with lived experience
  • Post-survey questions or conversation with participants during data collection
  • Opinion polling on the reuse of data to create specific statistics
  • Citizens’ jury or other deliberative approach with witnesses arguing from different perspectives about the utility of statistics
  • Public consultation, following consultation principles, to understand views around the value of new or existing statistics

Making the following information available will support engagement:

  • Who are you? What are you trying to achieve as an organisation from this engagement, how, and what is the likely outcome?
  • If you are planning a new collection: What statistics are you aiming to create? Why? Who made the decision to create them? Who are you hoping will use them?
  • If you are refining existing statistics: What statistics are you refining? Why? Who made the decision to refine them? What do you already know about who uses them and how are you hoping that will change?
  • What evidence do you have that these statistics are useful? How do they change things?
  • What does it cost to provide these statistics? What value do they create?
  • What data is or will be used to create these statistics? Where is that data coming from and how will it be collected?
  • Who else will have access to this data? Will it be used for other purposes?
  • What options do people have for opting in or out of the use of data about them? What are the consequences if people opt in or opt out, both for them and more broadly?
  • How will these statistics be shared? When, where and at what frequency?

 

Case study 2

The Department for Work and Pensions is developing a new poverty measure named ‘Below Average Resources’ (BAR), based on the approach proposed by the Social Metrics Commission (SMC). As part of developing the new Official Statistics in Development, a user consultation ran for 12 weeks, from 18 January to 11 April 2024.

The consultation elicited 35 responses. These showed support for the value added by the Below Average Resources measure alongside existing poverty measures, but differing views on how to develop the framework in practice.

Source:

Response to the consultation on the Below Average Resources measure

 

How do people want to be represented in your statistics?

People have a right to be represented in statistics with accuracy and care. The terms and categories we use in statistics influence how people are seen and thought about. Statistics producers may be unaware of the nuances of different models, ways of thinking, and language, particularly when it comes to under-represented groups. Involving the people who belong to those groups in how they are represented in statistics helps to:

  • support better participation in surveys or other primary data collection
  • improve data accuracy
  • better reflect the diversity of the population
  • better meet the needs of people who will ultimately use or be affected by your statistics
  • demonstrate trustworthiness through respect and inclusion

You should build on existing harmonised standards and guidance developed by the Government Statistical Service, but may also need to proactively engage under-represented and (or) disproportionately affected groups reflected in your statistics. Below is a list of some of the kinds of questions you might ask and methods you could use to ask them.

Example questions:

  • How can we best collect data about your community or demographic to ensure people like you are accurately represented in official statistics?
  • How do you want to be represented? For example, which model of disability, or which ethnicity groupings?
  • How comfortable or uncomfortable would you be with answering these survey questions? What forms of these questions would you not respond to?
  • For which aspects of your personal identity would an open text box be more appropriate?
  • How should your community or demographic be described? What other information (if any) do you think should be provided alongside these statistics to give context?
  • What tailored reports or outputs (if any) would be useful for your community?

Example methods:

  • Standing advisory panels including organisations representing particular under-represented and (or) disproportionately affected communities
  • Focus groups with members of under-represented and (or) disproportionately affected groups
  • Co-creation or participatory design workshops to develop standard terminology
  • Community researchers conducting peer-to-peer research
  • Cognitive interviews with people in target respondent groups

When deciding what to ask members of the public, you must consider what is feasible and appropriate in your situation. For example, if you do not have the processing capability or subject matter expertise to analyse free-text data, you should not ask participants where they would like to express themselves through an open text box. Similarly, if you need to collect disability data in line with the Equality Act 2010, you may need to use the relevant harmonisation guidance for measuring disability. This could prevent you from representing some people in line with the disability model that they prefer. Therefore, in this situation, you should be clear about the constraints you are working within to avoid misleading the public. Understanding their preferences may still be valuable for how you describe their data; however, you should align the specific questions you ask with the scope of their influence. Further detail on being honest about the scope of public influence is described in tips for effective involvement and engagement.

Making the following information available will support engagement:

  • What models, groupings or language are you currently using to represent people with different identity characteristics? Why do you use them?
  • What do your statistics say about different groups? What tailored reports do you currently provide?
  • What are survey or data-collection response rates from different groups? What methods are you using to reach those groups? Are you using user-centric design approaches?
  • What operational, technical, legal and organisational considerations will influence how you can collect and report data about the characteristics and (or) communities of interest?

 

Case study 3

The Race Equality Foundation and the Office for National Statistics were commissioned by the Wellcome Trust to look at different aspects of the recording of ethnicity in healthcare.

Forty-one participants from Black, Asian and minority ethnic communities were recruited via voluntary and community sector organisations in three geographical areas, with each participant taking part in three focus group sessions. Half the sessions included a community language interpreter and the other half were held in English.

The research also included 46 healthcare workers from primary, secondary and community health settings, including both clinical and non-clinical staff, who participated in focus group sessions or one-to-one interviews.

The research found that communities want to be actively involved in creating data collection processes.

Source:

Improving the recording of ethnicity in health datasets

Analytical Learning Points for Ethnicity Data in Health Administrative Data Sources

 

Case study 4

The Office for National Statistics conducted extensive stakeholder engagement, research and testing over more than three years to inform the development of the Census 2021 questionnaires. One of the areas they looked at was questions about health and unpaid care.

The questions were refined through a combination of focus groups; informal interviews; in-depth interviews (for example, cognitive interviews, paired depth interviews); peer reviews; and user experience (UX) research. These were supplemented by testing the questions with sample participants.

The research led to changes in the usage of particular words (such as “conditions” rather than “problems”), the phrasing of questions, the options that people could choose as responses (such as when answering how many hours they worked as unpaid carers), and the overall flow of questions in the census.

Sources:

Health and unpaid care question development for Census 2021
Question and questionnaire development overview for Census 2021

 

What would help people trust us with their data?

Understanding the social acceptability of different approaches to collecting, storing, using and sharing data helps to:

  • demonstrate trustworthiness through protecting people’s privacy and agency
  • improve participation in surveys or other primary data collection
  • ensure the delivery of value for money by maximising the value of statistics

You should proactively engage the people from whom you collect data directly and those whose administrative data is used to produce statistics. You may want to particularly focus on hearing from seldom-heard groups. Below is a list of some of the kinds of questions you might ask and methods you could use to ask them.

Example questions:

  • What uses of your data are you currently aware of, or assume happen?
  • How would you like data to be used? What motivates you to share data?
  • What harmful uses or misuses of data do you want to avoid? What motivates you to not share data?
  • What do you need to know about how data is being protected? What would restore your confidence if there was a data breach or accidental share of data?
  • What do you need to know to feel more informed about how data is being used? How can we best share that information?
  • What information do you feel you need to make decisions about your data? What principles should apply to those decisions?
  • Are there any organisations you trust to hold us to account?

Example methods:

  • General public membership on ethics panels making decisions about data reuse
  • Focus groups or workshops with particular communities to develop evidence about public acceptability
  • Online surveys to understand public attitudes to different uses of data
  • Mini-public deliberations to create principles for what data can and cannot be used for
  • Co-design of website pages providing transparency information

Making the following information available will support engagement:

  • What data are you using? Where does it come from?
  • How are you keeping data safe and secure? What data governance processes are in place to manage the ethical use of data?
  • What techniques are you using to anonymise data? What are the risks of re-identification and how are those managed?
  • Who has access to the data you hold? How are decisions made about how it is shared?
  • Have any data breaches or misuses happened in the past? What happened as a consequence?
  • Which organisations monitor and provide independent oversight over the ways you collect, store, use and share data?

 

Case study 5

The UK National Data Guardian (NDG) and Connected Health Cities commissioned Citizens’ Juries c.i.c. (a community interest company), working in partnership with the Jefferson Center, to explore when people would normally expect their confidential patient information to be used and shared and when they would expect it to be kept private.

Eighteen people took part in the three-day “citizens’ jury” to test a number of scenarios where patient data could be shared and to judge whether it was reasonable for a patient to expect the information to be shared or to expect privacy. The jury members heard from, and asked questions of, expert witnesses, and worked in groups on the jury questions. They reached conclusions together and were polled on their individual views to identify criteria for sharing or not sharing data.

Source:

Talking with citizens about expectations for data sharing and privacy
Patient and Public Involvement – Connected Health Cities

 

How should we make statistics usable?

Involving the public as you are designing statistical reports and outputs helps to:

  • better meet the needs of people who will ultimately use your statistics
  • prevent the misinterpretation or misapplication of statistics
  • ensure that people can critically examine statistics and relate them to their own lives
  • expand the diversity of those who can effectively use statistics
  • ensure the delivery of value for money by maximising the value of statistics

You should proactively engage existing and potential statistics users of different types, including researchers, journalists, policymakers and the general public. Below is a list of some of the kinds of questions you might ask and methods you could use to ask them, though there are many other user research methods that may be applicable.

Example questions:

  • Where do you look for statistics? Are you able to find what you are looking for easily?
  • What statistics do you use? What other statistics or tables do you need or want? What would these support?
  • What level of detail is helpful in these statistics? What level of timeliness and comparability?
  • What other information (if any) do you combine with these statistics?
  • How do you prefer to engage with statistics? Do you look at datasets? Read the reports? Use our dissemination tools? Do you do it alone or in collaboration with others?
  • What language (in statistical releases) makes sense to you? Are there any aspects or terms that are particularly confusing or unclear?
  • What are the barriers to you using official statistics?
  • What makes you trust or distrust these statistics?

Example methods:

  • Data diaries recording interactions with statistics over a period of time
  • Cognitive interviews, shadowing, and (or) data-based observations of people using statistics
  • Surveys of statistics users
  • User-centric design and co-design workshops with statistics users
  • Ongoing advisory panels including representatives of statistics users
  • Standard government consultations: publishing plans and seeking input on them

Making the following information available will support engagement:

  • Where are your statistics and research results published? How do people find them?
  • What statistical outputs do you make available? What do you already know about how much they are used?
  • How frequently do you publish statistics and how timely are they?
  • What breakdowns do you currently make available?
  • What are alternative or supplementary sources of information that complement these statistics?

Case study 6

In 2013, concerns about child heart surgery statistics led to the temporary closure of Leeds General Infirmary’s paediatric heart unit. Christina Pagel from University College London and Sir David Spiegelhalter from the University of Cambridge worked with the charity Sense about Science and experimental psychologist Tim Rakow from King’s College London to create a website that would better explain the risk adjustment method known as PRAiS (Partial Risk Adjustment in Surgery), to avoid its misinterpretation in the future.

They involved families throughout the project to ensure that they used the right terminology and explained statistical concepts in ways people would understand. The resulting website, Understanding Children’s Heart Surgery Outcomes, does not compare hospitals to each other but to each hospital’s predicted range, which is determined by the complexity of the procedures it undertakes, among other factors.

The researchers highlight that the accessibility of statistics is crucial, and that incorporating feedback from the public is necessary for building something useful.

Source:

Parent-led tool opens up NHS children’s heart surgery data to families

Content to consider for your public involvement and engagement plan

Your public involvement and engagement plan should outline the specific topics around which you intend to conduct public involvement and engagement activities. For each of these topics, it should answer the following questions. For each question, you should describe both what you are currently doing and what you intend to do in the future.

  • What questions are you seeking to answer through involving the public? What existing information do you have about those questions?
  • Which people, communities, groups or organisations do you want to hear from? How are you identifying or reaching them?
  • What methods are you using to involve the public around those questions? Are you using different methods of engagement for different groups?
  • What information are you sharing to help inform discussions around these questions?
  • How will the results of this public involvement and engagement change what you do? How are you bringing other decision makers into the process so they best understand public sentiment?
  • How will you feed back the changes you’ve made to the people and groups you engage so that they understand how their engagement has made a difference?
