Findings (Expanded)
The following section details the insights derived from this research, supported by anonymised verbatim quotes from workshop participants to illustrate our findings.
1. Public Involvement
Members of the public want to be involved in making decisions about whether public good is being served
Participants repeatedly returned to the question of who decides what constitutes public good.
Things that are interpreting public good, or if it reaches into whether it’s public good or not, that should be decided by the public.
Workshop participant
The participants felt that, practically, it was not feasible for the majority of the public to be formally involved in deciding what public good is within the context of the use of data for research and statistics. Several participants adamantly expressed that the public already have a voice in interpreting what public good is: that of the democratically-elected government, who use it on their behalf, though some participants echoed a strong distrust of politicians. This supported discussions about a public role in the guardianship of data and statistics: if the outcomes of their use would affect the society they lived in, then, as citizens, they wanted their say.
The public good should not be politicised or defined by politicians. I don’t trust MPs; I would rather it was someone working in a coffee shop.
Workshop participant
If they do de-identify it, I’m fine with that. But it’s not okay to just allow people to use data in whatever way they want and for me to say it’s got nothing to do with me. Its use is affecting my life and the lives of my children.
Workshop participant
A shared understanding of the use of data for research and statistics for public good was identified as something which would help people feel part of tacit public involvement and agreement. Across their different backgrounds, the participants established that they had very little knowledge about the use of data for research and statistics before these workshops.
[I want] the conditions for everybody to have this conversation of what underpins the notion of public good. Every citizen should have the ability to make the judgement.
Workshop participant
Participants saw responsibility for building public understanding as belonging to everyone involved in handling data or statistics, but one group attracted particular focus. Participants suggested that the public services which collect data, such as GPs, hospitals, and schools, have a responsibility to communicate the purpose of data collection and how the data might be used. Many participants articulated that transparent communication ‘on the ground’ would enable the public to feel respected, and to feel tacitly part of the discussion on whether to use data for public good. Participants also wondered whether willingness to disclose personal information might increase if services were transparent about their motivations for collecting data considered more sensitive, such as ethnicity and religion. Participants felt that no public service should assume the trust of the public.
The follow-up workshop made clear that participants wanted at least some members of the public to have been consulted in decisions concerning the public good use of data for research or statistics, as they will best understand public perspectives. Participants specified that this should be undertaken on a case-by-case basis, over and above the legal framework for data sharing and use. Although there was general support for the Digital Economy Act (DEA) framework, most participants expressed that they did not want public good decisions left only to the government, due to concern about politicised or potentially self-serving interpretations of public good.
It’s important for people to speak for themselves. The government might not know what is public good. The people, where the data is coming from, should be able to say something, and not just the government.
Workshop participant
Practical Applications Suggested by Participants
Participants expressed a preference for members of the public to be included on public panels, which aim to inform decisions about whether a use of data serves the public good. It was important to participants that panels should include representatives from the groups likely to be affected by the data use, and make efforts to include wider perspectives to potentially mitigate unintended effects (see Minimise Harm).
In addition to involvement in decision-making, participants wanted data organisations to listen to the public perspective in a variety of settings, using inclusive methods, across the country. For example, they wanted organisations which produce statistics to proactively seek the views of the public in interpreting statistical patterns, and organisations which enable access to public sector data for research to proactively seek public views in developing and prioritising their areas of research interest. Similarly, they wanted researchers from public institutions using such data to meaningfully engage the public with their research.
Participants involved in the follow-up workshop responded positively to being engaged more than once in the project; an ongoing relationship between project partners and participants was viewed as meaningful.
2. Real-World Needs
Research and statistics should aim to address real-world needs, whether that concerns an issue that may impact future generations or one impacting a small number of people
Participants tackled the question of which public is being served by the use of data for research and statistics. In the first round of workshops, there appeared to be several prioritisations:
- the needs of the largest majority versus the needs of fewer people with ‘higher needs’,
- the current needs of people as opposed to the needs of a future public,
- the needs of people in the United Kingdom compared to people across the globe.
The common principle was the idea of a public good use of data for research and statistics being for ‘the greatest good’ or the most good possible within a given context. This was still interpreted in a number of ways.
Many participants understood ‘the public’ to mean the majority of people. For them, the greatest good was the greatest number of needs being met at once. This was seen as fair because of a trust that the outcomes of data and statistics use would eventually benefit everyone, making society a better place. One participant called this “a domino effect”.
Others felt as strongly that public good use of data and statistics should be aimed at the highest need, regardless of who or how many might benefit. This was particularly driven by concern about social inequity. However, participants were unable to reach consensus on this, with some insisting that it was unfair to intend for a use of data or statistics to serve part of society rather than the general public. Emphasis on inequity ran parallel to a desire for equality: even participants passionate about addressing inequity felt that benefits for people who are better off are still compatible with public good.
Data use for ‘the greatest good’ corresponded with some participants’ belief that ‘the public’ does not have to refer to the ‘current public’. The use of data for research and statistics could aim to improve the lives of future generations or, through international data sharing, the lives of people across the world.
However, there was no consensus on this point. A few participants disagreed and suggested the use of data for research and statistics should be limited to covering the interests of the people whose data was being studied. This was partly related to the idea of ownership of data because it was suggested that people should see a return on their ‘donation’ of data about them (see Clear Communication).
Everybody should be able to benefit from it. If just a part of people is benefiting from it and the rest are still stagnant, it’s not okay, that is not a fair use of data. It’s one-sided.
Workshop participant
Discussions concerning the principle of the greatest good in the follow-up workshop clarified that there was not a set number of people who should be served by the use of data or statistics. Participants agreed that the value of a use of data or statistics could be assessed by need, rather than by the number of people who would benefit.
Regarding the topic of equitable data use, participants expressed that society is already unequal and inequitable, and that public good use of data for research or statistics should aim to address these issues. Participants also felt that data should not be released for research purposes if it could fuel inequities or inequalities. For example, some uses of data for research may fit a common interpretation of public good if the application leads to tangible improvements for some; however, participants felt that if a consequence of such research were to widen inequality gaps, it would not be considered a public good use of data.
Society is unfairly stacked. Things like your postcode, your socioeconomic demographic, or the school you went to can affect your life. You have to recognise this and notions of “public good” should reflect that.
Workshop participant
They should have an understanding of the impact of the projects being proposed; a good custodian of data wouldn’t release data to a project if it is going to fuel inequality.
Workshop participant
Participants communicated that decision-makers granting the use of data for research should have an understanding of the impacts of the project proposed, akin to how local councils complete an impact assessment for their activities (see Minimise Harm). Participants believed such knowledge would help guide decision-makers and enable fair distribution of data.
The follow-up workshop reiterated that participants were happy for public good use of data for research and statistics to encompass more intangible uses, such as research for understanding, or challenging or validating established evidence, within the context of achieving the greatest good. Participants felt that society would benefit from greater understanding or better evidence for decision-making (see Minimise Harm). Participants spoke about not wanting to get in the way of science by impeding advances in knowledge.
Wisdom and maturity are when you plant a tree knowing very well that you might not be there to enjoy its shade. The point is there are future generations that might enjoy the benefits of that research and data.
Workshop participant
Practical Applications Suggested by Participants
It was very important to participants that publicly-funded research and statistics using administrative data should aim to address a real-world need, which could include an issue that may impact future generations or one impacting a small number of people.
Although participants stressed that no one person should have more of a right to benefit from data for research and statistics than another, addressing social inequality and inequity ranked highly among participants’ interpretations of public good. In the follow-up workshop, participants suggested that in practice, addressing social inequalities and inequities could be prioritised in the process of interpreting whether a use of data for research or statistics is in the public good.
It is important to note that although participants explicitly expressed distrust of the decision-making process concerning the sharing and use of data, they desired a better understanding of how and why decisions are made. It was important to them that this process be publicly communicated and that equity be at the core of decision-making.
3. Clear Communication
To serve the public good, there should be proactive, clear, and accessible public-facing communication about the use of data and statistics (to better communicate how evidence informs decision-making)
Participants were not initially concerned with the precise form that the public good use of data for research and statistics would take; they expressed wanting to experience tangible changes. Local service provision, national policymaking, and research with clear applications, such as clinical health research, were examples participants listed of uses of data and statistics which generate tangible benefits. Most prominently, communicating improvements to quality of life was noted as a good use of data for research or statistics.
The participants’ desire to experience tangible changes was fuelled by a sense of disproportionality between the amount of personal data collected about people (ethnicity and religious beliefs were particularly noteworthy) and a perceived lack of impact from the use of that data to improve people’s lives. To illustrate their frustration, participants gave examples of statistics related to services, such as those on shortages of NHS and social care staff, or on ethnic profiling by police, which were perceived not to be used as evidence to improve services.
I feel like a robot, you know? I feel with the council tax. I feel with my wage is split in half. I feel like I’m giving, I’m not receiving. Working class always struggling. And that’s us. That’s the public.
Workshop participant
Participants argued for decision-makers to publicly communicate that they are proactively using statistics on fundamental societal issues, such as poverty, health and social care staffing, and education. Over the course of the workshops, there was a continuous perception that lack of political will or funding was limiting the potential uses of data to serve the public good (see Minimise Harm).
Because on the news they’ll say, 500 black people out of so many get stopped. But knowing that information hasn’t stopped the police from doing that.
Workshop participant
When we give our data, we don’t ever find what happens with the results. And it’s so important, you know, that we, as the public are made aware. Implementations within policy, or strategy or law even need to be shared for those people who have given that data over.
Workshop participant
Perceived missed opportunities to serve public good, or miscommunication of an activity, were identified as a cause of disengagement between the public and those working with data and statistics; participants understood this as eventually resulting in less public good being served. Participants spoke of hearing about statistics in the newspapers, which they understood as a call to action, and of sharing their data every time they attended a public service, which they found burdensome. The perception that no action followed the collection of this data for research or the production of statistics was understood as the motivation for some choosing not to disclose their data to public services. Despite these concerns, several participants recognised that change is a slow process which might be occurring unbeknownst to them.
I’m giving information, but nothing is being done according to my needs, the service provision doesn’t meet any of my needs. Then tomorrow you come again to ask for information and we don’t want to [give it] because we don’t know what you’ll do with the information. Then we become hard to reach communities and the cycle just perpetuates.
Workshop participant
I think a lot more could be done in a sort of more productive way, in a more positive way, with the data, to inform policy and, and to sort of provide a direction in regards to, you know, what sort of areas could be improved with that data.
Workshop participant
It’s important to say to people okay we’ve collected this data and this is what we’ve discovered and this is what needs to be done. ‘We are working on it but we need so much time or money to do this, but this is what the end result will look like. It’s not going to happen today because there is a process, but we are aware that this needs to be done.’ So, feedback.
Workshop participant
Recognising the nuances in this discussion, participants wanted members of the public to be able to understand how data was being used for public good. Proactive communication of the decision-making process for the use of data for research and statistics, and of the potential uses those decisions had considered, was seen as important: for instance, why some projects are granted data access over others (see Public Involvement). This was viewed as promoting accountability and transparency.
I wasn’t aware until this workshop about the whole process of decision-making about data. I would like to see how the actual decision is made, what it is measured against, if there was a way that the public could see that, and how you actually find that information. If there’s more openness around the process then hopefully more trust could be built.
Workshop participant
Participants expressed a desire for more transparent and accessible public-facing communication. Participants indicated an interest in information translated into a range of languages, communicated via a range of channels, and packaged in a range of formats that accommodate people with different sight, hearing and learning needs. For participants, this meant prioritising offline methods of information sharing, such as physical information leaflets at places where data is collected (for example, job centres or GP practices), each with contact information welcoming questions or feedback from the public about how their data might be used for research purposes. They suggested that online notices or webpages on service websites should also be made more accessible.
People need to be able to understand it. I think we need to change how we’re talking to people. It needs to be in a language that the community understands. And not just in language, but for other needs for people that are blind, for people who have dyslexia. I don’t see anything set up that way.
Workshop participant
Participants expressed that greater understanding of data and statistics would enable them to make decisions about their lives, and were interested in increasing their own usage of statistics. Participants wanted to be able to use statistics without fear of confusion, misinterpretation, or politically-biased narration. They felt this would empower them both to better understand the use of statistics in decision-making and to maximise their own use of statistics for personal decisions.
I think unless you understand statistics, you wouldn’t understand that actually [information was derived] through statistics. And they just give that headline without saying what was excluded from it. Then the public is being misled so it’s not public good.
Workshop participant
Practical Applications Suggested by Participants
Participants wanted statistics and research using administrative data to be communicated in a clear and accessible way to the public. As far as possible, participants felt that the public would want to hear what changes to evidence, understanding, decision-making or policy implementation have resulted from the use of data for research or statistics.
Participants acknowledged that greater awareness of data safeguards, and an understanding of why their data is collected, may yield more public support for sharing their data for research. In these discussions, several participants asked “why do they want to know that?”, querying why certain data is collected.
Participants perceived a sense of accountability associated with public awareness of the use of data for research and its practices, as it enables the public to form informed opinions. There was a strong appetite for those working with data and statistics to broaden their reach, from offline communication such as information leaflets or posters in public services, to explainer videos and digital infographics displayed prominently on websites.
Participants wanted to better understand the entire process of using data for research or statistics, including why data is being collected, the problem being addressed with the data, timelines of intended activity, and potential (or actual) results. Participants felt that members of the public would like to be able to determine for themselves whether a use of data for research or statistics had served the public good, and that producers should enable this to the best of their ability. This meant having the knowledge to form an informed opinion. For instance, some participants spoke about how they felt confident making informed decisions during the pandemic based on their understanding of Covid-19 statistics.
Participants also suggested that organisations working with statistics should be vocal in mainstream news and communication channels in their critical appraisal of statistics, to help individuals make their own assessments.
4. Minimise Harm
Public good means data collected for research and statistics should minimise harm
It’s also important to consider the tone of communicating. When you are speaking to communities facing multiple disadvantages, living in deprived areas, you must be very careful about the tone of your language, avoiding stigmatising use of the data if the picture it paints is grim, is very important.
Workshop participant
Many participants communicated concern about potential negative unintended effects that a use of data for research, or an interpretation of statistics, could have on interconnected areas of society, either immediately or over time. Some participants, despite de-identification, saw public sector administrative data as belonging to them and therefore wanted a say in how it is used. Even those who considered de-identified data as “just numbers” felt a personal responsibility that data about them should not contribute to something harmful, akin to other moral life choices such as reducing one’s carbon footprint.
Irrespective of whether my name is attached to it, I hate to think that my data is contributing to something that harms someone. If it’s not being used for good, then I’m part of that bad. I want to feel ethically good about what I take part in.
Workshop participant
Participants shared genuine fears that well-intentioned uses of data aimed at particular groups may inadvertently result in negative impacts through stereotyping and discrimination.
If you take a case of 100 numbers, you see Somali, Muslim, you see the good ones aren’t mentioned. You’ve got the good and the bad in everything, but you just see the bad.
Workshop participant
This potential harm appeared to be a greater concern than data breaches or data loss. The Five Safes framework, a set of best practice principles developed by the Office for National Statistics to facilitate responsible sharing and use of data, includes checking that individuals cannot be identified in outputs generated from data analysis (UK Data Service). The Five Safes framework was included in the ADR UK explainer session. Participants reacted positively towards this but wanted a further step to be included regarding responsible use of language. Specifically, more sensitivity around labelling was strongly recommended.
I detect some harmful consequences from the language used in studies. If, for instance, those labelled as deprived and poor areas attract that negative media attention connected to the populations living there in, you know, labelling them as poor, you could subject them to stigmatisation and turn them into targets of attacks.
Workshop participant
Participants were frustrated with experiences of stereotypes and assumptions about others. This related to their discomfort about missing data, as they felt decisions may be made about some people based on assumptions rather than on evidence (see Best Practice Safeguarding). Concern about inaccuracies underpinned a growing recognition of the public good of using data for research or statistics to validate or challenge accepted evidence. This was thought to be of value in helping research and statistics be more truthful and representative of the public.
Some participants stressed that they did not want quantitative data alone to decide a course of action, as they feared it may be misinterpreted. They emphasised the importance of contextualising data with public engagement; they stated a preference for consulting groups likely to be impacted by the data, to validate quantitative data, rather than designing a policy, service, or piece of infrastructure based on research from data and statistics alone. This might involve, for example, asking local people if they will use a service that the data suggests they might benefit from, or asking people with lived experience of an issue why statistics are showing a particular association, rather than making assumptions.
And then I think for it to be interpreted by a diverse group of people from different backgrounds and different political opinions and different types of jobs. And from there you can decide where to act.
Workshop participant
These points related to earlier concerns about missing data (see Best Practice Safeguarding), and the desire for data and statistics to be used for the greatest good (see Real-World Needs). Without people being included in the interpretation of statistics that are used for evidence-based decision-making, those people do not have a public voice, and decisions might be made that do not positively impact their needs. One participant felt that if data or statistics are used for the greatest good, that good will eventually reach everyone by improving society as a whole. Others felt that this was not enough; that some people were not being impacted at all. This relates to the earlier finding on improving the representativeness of statistics, research, and decision-making.
There’s a massive amount of missing data, for example on the traveller community, you haven’t got any responses at all, we don’t know what their NHS experiences are like. The risk of that is that decision-making is made without you.
Workshop participant
In terms of more visible harms from data use, such as data breaches or data loss, the majority of participants felt the Five Safes framework allayed their concerns. The only improvement suggested was greater accountability for data protection, or greater public awareness of existing accountability frameworks. In the follow-up workshop, participants wanted to be reassured that each step of the Five Safes was practised as laid out in the framework. This conversation returned again to independent regulators. It was also suggested that transparent, publicly-available whistle-blowing policies and named leads would reassure the public that organisations using administrative data were taking protection from harm seriously.
There’s still a worry on who checks that they’re being practiced. In other sectors there’s always a clear whistle-blowing process. To show that they are aware that there is a possibility for misuse, that they have got a process for if something happens. The organisations show that they have it at the back of their minds, a platform of what to do.
Workshop participant
Unintended consequences from interpretations of statistics were identified as another risk. Two participants spoke of people they knew changing their behaviour in a way that harmed their well-being after learning about some newly published statistics. For example, one participant knew young people who had begun to carry knives for protection after reading that statistics had shown knife-carrying had increased. Participants suggested that, before publication, statistical outputs should be sense-checked by a diverse group of people as one way of mitigating potential risks.
The statistician doesn’t have that lived experience. We are then assuming that this person has gone out of their way to read up on the challenges of these different groups of people.
Workshop participant
Although some participants were reassured after learning about the role of the Office for Statistics Regulation (OSR) in regulating statistics for the public good, many still felt that some public figures were misusing statistics without any repercussions. Some participants stated that those who misuse statistics should be fined, or face greater public notoriety for their wrongdoing. There was no defined suggestion for action, but rather expressions of frustration about being misled by public figures.
Practical Applications Suggested by Participants
Participants concluded that potential harms of data use should be anticipated before access to data for research or statistics is authorised, perhaps by consulting a variety of groups, or any groups relevant to the data or statistics. Participants wondered whether that information could be fed into the public good test that is part of the Five Safes process, adopted by many organisations responsible for making data available for use in research and statistics.
Participants were also extremely concerned about the potential damage of research or statistics outputs if they were misinterpreted. Participants wondered whether members of the public with relevant lived experience could be consulted as part of the interpretation and publication of statistics. This would help data users avoid language or interpretations that might fuel stigma or discrimination, and anticipate any other potential harms. A separate use for this suggestion was to validate the interpretation of the statistics with the population they concern. This might involve, for example, asking local people if they will use a service that the data suggests they might benefit from, or asking people with lived experience of an issue why the data is showing a particular association, rather than making assumptions.
In addition to the Five Safes framework, the participants argued for accountability when harm is caused. Explicit suggestions were that organisations should make whistle-blowing policies publicly available, set out clear consequences for data misuse, and name the people who are ultimately accountable for any breach of the Five Safes. This is also in the context that, whilst the UK Statistics Authority is responsible for the application of the Digital Economy Act as a legal gateway to access data for research and statistics, there is no such body responsible for the overall application of the (much more widely used) Five Safes framework across the UK, though individual checks exist across the stages of the Five Safes framework.
5. Best Practice Safeguarding
Universal application of best practice safeguarding principles to ensure secure access to data should help people feel confident to disclose data
Participants expressed a hope that data for research and statistics would supply the best evidence to all decision-makers, including members of the public. Three sub-themes emerged from these discussions: data safeguards, missing data, and missed use of data.
Data Safeguards
Despite varying degrees of scepticism around data sharing and use, with some considering de-identified data as ‘just numbers’ and others as ‘my data’, there was consensus that robust safeguards should be in place to protect data and individuals’ privacy. The Five Safes framework was shared with participants as an example of how some organisations work with data.
The Five Safes were reassuring to me…For me it was a solid thing, I wouldn’t be afraid of where the data is going after that.
Workshop participant
As detailed in the earlier themes (see Real-World Needs and Clear Communication), participants thought that if public services openly communicated the reasons why data was used, then they would be more likely to disclose their information. Many participants also articulated that they would feel more comfortable disclosing data knowing a security framework such as the Five Safes was being practised.
Missing Data
Maximising evidence was seen as vital to data and statistics being used responsibly. Many participants related missing data at community level, and poor research and statistics, to incorrect understanding and decision-making. Even those who did not like sharing their information wanted to improve what one participant referred to as “the real data”. Although some participants referred to their own discomfort or fatigue with disclosing data to public services, there was a sense that more data could be better used to serve public good. A number of participants perceived that patterns could not be examined to uncover unintended consequences, and plans could not be made for the future, if data was not “integrated” between services (see Real-World Needs).
If you have good data you should be able to see all the links, the dynamics. For example, with competing needs, if you have data about cyclists and about road users, you should be able to meet all the demands.
Workshop participant
Participants expressed a range of discomfort with disclosing some types of data which relate to how a person identifies themselves. Some felt disinclined to provide their data as a result of negative treatment by public services, or because services had not explained why they needed the data, further underlining the findings in Clear Communication. Participants with teenage children spoke of their teenagers not comprehending why they should provide their data, and so instead providing incorrect data or leaving sections of a form blank. A significant proportion of participants expressed concern that demographic data could be used to discriminate, or could feed into the stereotyping of certain groups (see Real-World Needs). However, other participants insisted that the existence of structural discrimination meant that data collection is very important.
They’re unnecessary. You’re asking what religion I am, where I’m from, am I really British?
Workshop participant
My 16-year-old daughter came to me saying why do they want to know if I’m straight, if I’m Catholic, is it because I’m gay are they going to give me the job? She couldn’t understand it no matter how I tried to explain it to her. She just thought it was prying. Should I put my weight down? She was getting angry about it. It doesn’t affect how I serve coffee.
Workshop participant
A common assertion was that, if poor representation within data was the result of people failing to disclose personal information, then researchers should make more effort to investigate why some people choose not to disclose their data. Participants expressed that engaging with these people could help their voices be heard in decision-making, and progress understanding of how to address missing data.
I would be more comfortable filling in [a form if I knew why] I’m being asked for that information. It would be helpful for organisations like ADR UK to have programmes, short clips, videos, about data and how they use it. And adverts online. Posters. So that everybody everywhere knows it’s actually useful for me to give my data. It’s put into people’s minds on a daily basis in different ways.
Workshop participant
I’m not disclosing. Instead of looking to make up the numbers, go and ask why people aren’t disclosing.
Workshop participant
In the follow-up workshop, participants explored the use of synthetic datasets (see Appendix A Glossary of terms) to compensate for missing data. These were viewed as protecting the public, as they did not contain information on personal identities, while being valuable to science in supporting analysis and researcher development. Participants stated that the key to synthetic data serving the public good was that it should not be used in place of real data or actual evidence, whether for service provision, the advancement or correction of knowledge, or policy and decision-making. Participants also emphasised that it should not take away from the need to address the reasons behind missing data.
As reported in the Clear Communication finding, in the follow-up workshop participants articulated that they were happy with validating or challenging existing evidence being a public good use of data for research or statistics. This was felt to fit the participants’ motivation for truth and honesty in statistics: attempting, through research, to develop better-quality and more up-to-date estimations of the facts. Some participants were quick to wonder whether updating statistics to include more people’s data would make societal knowledge more representative of society.
Missed Use
The missed use of data, or the deliberate withholding of data that could be used for research and evidence-based decision-making (Morrow, 2020), was viewed by many participants as harmful. There was concern that the public good could be undermined by a data access process that was too lengthy, or too restrictive as a result of public service organisations or data custodians refusing or delaying access to data. Participants stated that data sitting unused in storage was not a good use of resources. Although participants did not want robust data security compromised, they wondered if any part of the process could be sped up for priority issues.
There’s no point in finding out how many children have been put in child poverty a year and a half from now. We need to focus on the data that’s really needed at that time and, and try and push it through to justify spending on the communities.
Workshop participant
Some participants suggested that well-resourced private organisations might be able to maximise what could be achieved on behalf of the public sector. Among this group were people who believed private organisations to be more transparent and more efficient than public sector organisations. The caveats were that adequate restrictions and regulations would have to be in place, and that public organisations would need to be transparent about the involvement of the private sector and able to demonstrate public good values.
If I give my data, I want to see results, I want to see positive results, that’s what I expect, that’s fine. When it is monetary value, or normal interest, it’s fair enough, as long as it’s positive.
Workshop participant
In contrast, other participants were vehemently opposed to private organisations getting involved with public sector data, as they anticipated the profit motive would take precedence over truth.
A number of technical issues in data for research and statistics were explored to understand how they fit into discussions concerning public good. Participants perceived that maintaining datasets (in a secure way), rather than deleting them after use, and re-using datasets to validate research in different populations, served the public good. However, they argued that organisations keeping the data should be a trustworthy public or publicly-funded body. According to participants, private organisations should therefore not be allowed to keep datasets without explicit consent.
Practical Applications Suggested by Participants
For ‘good’ to be truly realised as the participants understood it, it was felt that a best practice framework, such as the Five Safes, should be universally applied so that the public know that publicly collected data is being used in a way they can trust.
As detailed earlier, greater awareness of data safeguards, and an understanding of why their data is collected, may yield more support for the public to share their data for research (see Clear Communication). To further improve the representativeness of research and decision-making, researchers should make more effort to consult people who are less likely to disclose their information. This would help unheard voices inform the evidence being created, and progress understanding of how to address missing data.
Data custodians could explore how they could safely share and link more data to create evidence for public good use. This would enable policy and decision-making that centres on people’s needs, and it was hoped it would address the sense of disproportionality between the volume of data collected from people and the benefits experienced in everyday life.