Transcripts of the workshop recordings were coded by a qualitative researcher from Kohlrabi Consulting. Based on these codes, two researchers from Kohlrabi Consulting identified key themes and subthemes, along with supporting quotes and examples for context. Themes were validated by workshop facilitators and representatives from ADR UK and OSR; workshop facilitators also contributed to analysis.
The preliminary analysis was presented to the Project Advisory Group ahead of the follow-up workshop to identify areas requiring further clarification. Participants in the follow-up workshops reviewed initial themes and explored outstanding questions. The findings of the follow-up workshop were coded and incorporated into the analysis by researchers, allowing for both interrogation and validation of results.
Strengths and Limitations of the Project Design
The recruitment design for this project aimed to break down barriers to participation, including accessibility of instructions, affordability, feeling welcomed, and flexibility for those who had caring responsibilities. Many participants fed back that they were pleased to have the opportunity to contribute to this topic.
The deliberative design of the workshop enabled participants to incrementally build up their own understanding of the project themes, starting from little or no prior understanding. It also provided a space for discussion-based learning. The workshop evaluation forms suggested participants felt adequately informed to participate confidently in the discussions. Participants particularly valued hearing directly from representatives of ADR UK and OSR, whose explainer presentations aimed to familiarise participants with the existing processes and protocols associated with public good use of data for research and statistics.
Despite the strengths of this project, there were also some limitations. The sample of participants chosen for this project does not represent the demographic breakdown of the UK public (this was not a condition of recruitment), and a small sample cannot represent the entire British public. Further, because each workshop lasted one day, the discussion did not explore all possible relevant topics. For example, there was little opportunity to explore research ethics and integrity, or to probe further when participants referred generally to ‘organisations’ without clarifying what they meant. However, the findings offer novel insights into a topic which required exploration, and provide a resource for further developing the understanding of public good.
Despite efforts to replicate the workshop experience for each nation and online, a significant change was made following the first workshop in London. Representatives from ADR UK and OSR remained in the room for the duration of that workshop. Upon reflection, it was decided that they would leave the room after delivering their explainer presentations for subsequent workshops. This was to allay any potential concerns of participants possibly feeling uncomfortable articulating distrust or criticism towards either ADR UK or OSR.
Equipping members of the public with the understanding required for deliberative discussion may carry the risk of silencing participants’ instinctual responses to the material presented. To balance informed deliberation with more instinctual perspectives, each workshop began with two activities exploring participants’ interpretations of ‘public good’ and their experiences of data collection and use, before the ADR UK and OSR explainer presentations. This structure enabled facilitators to track if and how participants’ views on public good changed as they learned more about it in the context of the use of data for research and statistics.
Similarly, although group discussions help to build understanding, they carry the risk that some participants’ views go unheard. Repeating questions in different formats throughout the workshop, including written responses, and calling on each participant to speak helped researchers draw out individual perspectives.
Interpretations of ‘Public Good’
Whilst ‘public good’, ‘public interest’, and ‘public benefit’ are phrases sometimes used interchangeably in the literature (Cowan & Humpherson, 2020; Waind, 2020), participants shared different interpretations of these phrases suggesting that they largely considered ‘public good’ to mean something different to ‘public interest’ and ‘public benefit’.
Analysis of the language participants used when speaking about ‘public good’, compared with ‘public interest’ and ‘public benefit’, suggested that ‘public good’ meant something wholly good; ‘public interest’ was often understood as something interesting to the public, but not necessarily something good; and ‘public benefit’ was often considered a tangible benefit, but not always good for the wider public.
Participants associated words such as ‘advancing’, ‘improving’, and ‘knowledge’ with ‘public good’, which seemed to indicate an understanding of public good as sometimes theoretical, or as involving incremental change. The form that ‘public good’ could take was discussed within the context of achieving ‘the greatest good’ (see Real-World Needs), which indicated that theoretical or actual changes needed to be wholly positive. By contrast, the terms ‘public benefit’ and ‘public interest’ seemed to focus on current situations rather than theoretical or future ones: participants expressed that ‘public benefit’ was something tangible, but not necessarily good or achieving the greatest good. ‘Public good’, however, they associated with positive impact, regardless of whether it materialised immediately or in the future (see Real-World Needs).
Words such as ‘research’ or ‘data’ also carried slightly different meanings for some participants, depending on personal histories or knowledge. This discussion suggests that the public may attach different meanings to commonly used phrases related to data and statistics, and that those communicating about data and statistics should not assume knowledge or shared understanding among the public.