The Power of Public Engagement in Shaping Our Work 

In our latest blog post, OSR Research Officer Nick discusses how public engagement shapes our work, specifically in relation to our research project about how individual members of the public use official statistics to make decisions that relate to their personal lives. This OSR research is delivered by the Policy Institute at King’s College London and the Behavioural Insights Team and will be published in January (as discussed in a previous blog, “How do we use statistics in everyday life?”). 

Here at the Office for Statistics Regulation (OSR), we are increasingly recognising that involving the public in our work is crucial for ensuring statistics truly serve the public good. Because of that, we have begun exploring public engagement.  

Public engagement is important and impactful  

For us, public engagement is the process of involving and collaborating with those outside of government and statistics to share information, gather input and foster meaningful dialogue on issues that affect them. Public engagement is about more than just asking people for their opinions. It’s about making sure that the work we do is grounded in real experiences and needs. By involving the public, we can build on our own technical expertise to make sure our work is meaningful and useful to the very people we aim to serve. 

OSR has begun engaging the public on our research 

Recently, we spoke to a small group of people to find out what they thought of different aspects of our current research project exploring how people use statistics to make personal decisions. These discussions have shown us just how valuable public engagement can be in shaping our research to be more relevant and acceptable to the public. 

In our public engagement sessions, we invited members of the public to share their thoughts on different aspects of our research proposal. We met with them virtually multiple times over the course of the project, so that we could tell them how their advice had helped us and continue to seek their views. This included discussing how they felt about the life decisions we wanted to explore in our research, such as choosing a school for a child. We also asked for their feedback on survey questions we plan to use in the next stage of our research. While this was a light-touch approach to public engagement (a small group that met online for only an hour each time), we still got a lot from these discussions.  

We found this approach very valuable 

Public engagement is a powerful tool that enriches our work and fosters a collaborative spirit between OSR and the public. While this is relatively new for OSR, our recent sessions have demonstrated the real value of this approach. From this experience of public engagement, we have three main reflections: 

  1. People wanted to be part of these discussions 
  2. Public contributors are full of useful suggestions 
  3. It is important to seek diverse perspectives 

People wanted to be part of these discussions  

Discussions about statistics and research have the potential to be dry. However, when we spoke to members of the public, they were enthusiastic about the research and appreciated the opportunity to contribute. For example, one of our public contributors said: 

“If someone said to me, you are going to be involved in some statistics research, it seems like whether I will be able to do that? As a lay member of the public, maybe I don’t know?… but it is easy… I would like to be involved in future opportunities.”

This quote shows how important it is to give people the opportunity to be involved in these types of discussions. 

Public contributors are full of useful suggestions 

Attendees provided valuable feedback on various ways to improve our research, which has already informed several key decisions in our project. For example, we originally planned to ask people whether or not they used official statistics in choosing which school to send their child to. However, public contributors pointed out that not everyone actually gets to choose a school for their child: 

“You might be able to have a parent that will look and say this is the school I want my child to go to based on their needs, based on the statistics. But the actual chances of them getting that school is very slim. So there may be people that feel that, yes, I can look the statistics, but then I have an inability to choose.” 

Because of this advice, we changed the decision to be about which school to apply to, rather than which school to send a child to. This change made our research more relevant to members of the public. 

It is important to seek diverse perspectives  

Engaging with a diverse group of attendees allowed us to gather a wide range of viewpoints. For example, we heard from people whose first language was not English on how the statistics in the research could best be presented in a way that they could understand:  

“[Statistics are] in the English and I have language barrier, so how can I like utilise that… In the bar diagram it’s ‘OK this bar is high and this bar is low’ so it’s easy to understand.” 

This perspective led us to share visual aids with participants in the first stage of our research, rather than solely presenting a traditional statistical bulletin made up of prose. Doing so made our research more accessible to a broader audience and allowed a wider range of participants to engage meaningfully. 

We plan to use public engagement more going forward 

The success of these public engagement sessions has reinforced our commitment to involving the public in OSR’s work. As part of this commitment, we are now part of the Public Engagement in Data Research Initiative (PEDRI). In addition, future research projects at OSR will build on the feedback received in this project, and we hope to undertake public engagement in key projects beyond research as well. In doing so, we will be aligning with our view that serving the public good means treating official statistics as public assets; this involves allowing the public to understand and engage with our work. Through public engagement about our work at OSR, we can ensure it is more transparent, trustworthy and responsive to the needs of the public. 

If you would like to learn more or even become a public contributor yourself, please contact us at research.function@statistics.gov.uk. We look forward to hearing from you.  

 

Just three words – the birth of a code

In our latest blog, Penny Babb, OSR’s Head of Policy and Standards, speaks about her experience of working with the Kenya National Bureau of Statistics to develop the Kenya Statistics Code of Practice.

I am a self-confessed code-geek. I love talking about the Code of Practice for Statistics! I may be a tad boring at parties, but I have a passion for the Code that is hard to miss (and I hope is infectious).  

What better, then, than to get together with a group of people, away (mostly) from the distractions of work, to talk about codes for a week! There truly was no escape for my audience as we met on the shores of Lake Naivasha in Kenya.  

It was such a privilege to work with the Director General and statisticians of the Kenya National Bureau of Statistics (KNBS) to guide them through the preparation of their own code of practice. This collaboration was part of the strategic partnership between the UK Statistics Authority (UKSA) and KNBS. It wasn’t about the UK Code, but an opportunity to share our experiences and to support KNBS as they wrote their first code – finding the message that they wished to share with organisations and analysts across the statistical system to inspire good practice. KNBS is committed to ensuring official statistics remain a valuable resource for all stakeholders in the statistical system in Kenya and to demonstrating international best practice. It sees the code as central to achieving these goals.  

One of the lessons I took from being involved in developing the second edition of the UK Code was the power of explaining it simply, in just three words. And as we have seen in our own recent review of the Code, our three words – trustworthiness, quality and value – are extremely effective in helping people who produce and use official statistics to understand what matters. My goal was to help my Kenyan colleagues find their own three (or more!) words. 

The dedication and skill of the team of statisticians and leaders at KNBS was evident and central to completing their task of writing their code. The Kenyan code had to be one that was meaningful and inspiring in their context – not just to staff in KNBS, the national statistical institute for Kenya, but for all other organisations in their statistical system.  

In two week-long workshops, Caroline Wangeci (the then ONS strategic adviser to KNBS) and I led sessions to develop the Kenyan code of practice. Our codes are both grounded in the UN Fundamental Principles of Official Statistics, but each speaks to specific needs within our countries. 

The commitment and dedication of the Kenyan team was inspiring to witness and humbling to support. Writing a code of practice to set standards for statistical production and dissemination is challenging. But doing so in concentrated bursts of effort, away from day-to-day responsibilities, gave the team focused time to think through the key values and messages needed to inspire those involved in producing official statistics. After spending a day diving deep into code matters, many of the participants then spent further hours catching up on their work, ensuring the smooth running of their statistical teams.  

KNBS has published the Kenya Statistics Code of Practice (KeSCoP). It is a remarkable achievement in its clarity and mission. As the Code highlights: 

“KeSCoP is anchored on three pillars, namely, Quality, Trustworthiness and Progressiveness. Each pillar contains values and commitments that producers of statistics should commit to when producing and disseminating statistics.” [page 1, KeSCoP] 

KeSCoP is drawn from the principles and requirements of the Kenya Statistical Quality Assurance Framework (KeSQAF). The Code helps drive continuous improvement in statistical practice.  

“Compliance with KeSCoP gives producers of official statistics and the users confidence that published statistics are of high quality, have public trust and are produced by institutions and people that are progressive.” [page 2, KeSCoP] 

It echoes our UK Code but in a truly Kenyan way – they have defined their concepts and practices in ways that speak to their application of the UN Fundamental Principles of Official Statistics.  

Quality has values for:

  • relevance
  • timeliness and punctuality
  • accuracy
  • coverage/comprehensiveness
  • methodological soundness
  • professional skills and competencies
  • data sources

Trustworthiness has values for:

  • impartiality and equal access
  • confidentiality
  • accountability and transparency
  • integrity and credibility

Progressiveness has values for:

  • innovation
  • continuous improvement
  • user focus
  • adequate funding

I love the clarity of each value and commitment: they provide a clear steer for any producer on what it looks like to deliver official statistics to the required standard, with associated indicators or measures of success.  

We have much that we can learn from KeSCoP as we refresh the UK Code and seek to ensure its clear communication and accessibility. 

To find out more about UKSA’s international development work, see ONS’s published strategic plan for 2022–2025. 

 

The importance of separation: Ed Humpherson addresses questions raised by the Lievesley review

In our latest blog, Director General for OSR, Ed Humpherson, speaks about how OSR’s separation from ONS is communicated.

Since Professor Denise Lievesley published her review of the UK Statistics Authority, most attention has been on the Statistics Assembly. The Lievesley review considered the work of the UK Statistics Authority, and its first recommendation was that the Authority should hold a Statistics Assembly every three years. The Assembly should elicit and explore the priorities for the statistical system with a wide range of users and stakeholders. And the first Assembly will present a fantastic opportunity to have a wide-ranging conversation about statistics and enable a discussion about the strengths and limitations of the current statistical landscape in the United Kingdom. So, it’s not surprising that this recommendation has generated a lot of interest. 

The Lievesley review raised other important issues. Some of these issues relate to OSR. In particular, she highlighted that OSR needs to improve how it communicates the separateness of its role from ONS.  

Separation matters to us. Indeed, when we look at official statistics, we start by considering the trustworthiness of the governance processes used by statistics producers – by which we mean the mechanisms in place to ensure and demonstrate that the statistics are free from the vested interests of the organisation that produces them and that they represent the best professional judgement of the statisticians. 

Similarly, it’s important that our decisions reflect our best professional judgement and that, in relation to ONS, we can make those judgements without giving any weight to ONS’s own organisational interests. 

We have several arrangements in place to secure our separation from ONS. But if people don’t appreciate or understand them, they are not working. And the Lievesley review made clear that we need to better explain the processes that assure this separation to our stakeholders and the public. That’s why today we are publishing a statement that sets out, in formal terms, the arrangements that underpin our separation from ONS. 

The key features of the statement are: 

  • a summary of the governance arrangements, under which we report our work to a separate Committee of the Authority Board made up only of non-executive members and me – that is, with no membership role for ONS staff. 
  • a summary of the arrangements for setting strategy and budget, which involve me and my team making our proposals directly to this Committee and to the Board, with no decision-making role for ONS staff. 
  • a confirmation of my own personal reporting lines, which mean that I do not report to the National Statistician in any way, but directly to the Authority’s Chair; and that I have regular meetings with the Chair without the National Statistician or any senior ONS staff attending. 

Please read the statement if you’d like to learn more about these arrangements. 

But I’ll close with Denise’s own words. The most important features of any governance set-up are budget and performance management. And on this, she was clear:  

“The existence of a small number of misunderstandings by users also appear to perpetuate, such as that the Director General for Regulation is line managed by the National Statistician (he is not) or that the National Statistician controls the budget of the OSR (he does not). Nor does the National Statistician attend Regulation Committee meetings.” 

I hope the statement we are publishing today helps provide some reassurance and address the issues of perception identified by Denise’s review.  

Embedding the habit of intelligent transparency

In our latest blog, Director General for OSR, Ed Humpherson, looks at the importance of intelligent transparency for Governments across the UK.

Intelligent transparency is one of the most important sets of principles that we uphold. When statistics and quantitative claims are used in public debate, they should enhance understanding of the topics being debated and not be used in a way that has the potential to mislead. To help those making use of data in public debate, we have set out our three underpinning principles of intelligent transparency. These principles demand that statistical claims and statements are based on data to which everyone has equal access, are clearly and transparently defined, and for which there is appropriate acknowledgement of any uncertainties and relevant context.

We have promoted intelligent transparency to the UK Government and the Governments of Scotland, Wales and Northern Ireland. And the Chair of the UK Statistics Authority, Sir Robert Chote, set it out clearly in his letter to political party leaders ahead of the 2024 general election. We have also made a number of interventions to support the principle of equal access.

 

Intelligent transparency in conference speeches

Equal access means that all statements involving statistics or data must be based on publicly available data – preferably the latest available official statistics. Claims should not be made based on data to which ministers have privileged access, as this prevents the claims from being scrutinised and undermines confidence in official statistics.

We recognise that conference speeches by Ministers in particular can be difficult, as noted in this blog we published in September. Ministers want to highlight policy successes to their party members, but they do not have the input of the civil servants who would normally ensure that their statements use statistics and data appropriately.

In this context, we were recently made aware of a statement made by the Prime Minister, Sir Keir Starmer, at the Labour Party Conference regarding immigration returns. The claim in question was that there has been “a 23 per cent increase in returns of people who have no right to be here, compared with last summer”. At the time the Prime Minister made this claim, there were no Home Office data or statistics available in the public domain for the relevant time period to support this statement.

 

The importance of publishing data used in the public domain

Following the statement made by the Prime Minister, we engaged with the Home Office and we welcome the ad-hoc statistical release published by the Home Office, which provides the underlying data that relate to this statement. In most cases we would want to see the release of unpublished data very soon after the statement itself. But we do understand that sometimes, as in this case, providing the underlying data in a usable format may take longer. We consider that the approach taken by the Home Office is in line with the principles of intelligent transparency. It is a good example of how to resolve a situation when unpublished information has found its way into a speech.

 

Working with Government to ensure best practice

We are using this example as an opportunity to re-emphasise the importance of intelligent transparency to a range of audiences, including Heads of Profession for Statistics, the wider Government Analysis Function, and the Government Communications Service. We have also discussed the issue with officials in Number 10, who have confirmed the importance they attach to appropriate use of data. We are happy to provide support and advice on the principles of intelligent transparency to departments’ analysts, communication professionals, special advisers, and any other colleagues who may benefit. To discuss this with us, please contact us via regulation@statistics.gov.uk.

For all Governments, it is important to avoid drawing on unpublished information in public statements. We are always happy to work with officials and advisers to embed good habits of intelligent transparency as fully as possible.

 

Improving mental health services in Northern Ireland: The Regional Mental Health Outcomes Framework

In our latest guest blog, Oscar Donnelly, Lead for the Mental Health Outcomes Framework, discusses the outcomes-based approach to collecting standardised mental health data and statistics using the new patient record system ‘Encompass’.

In 2021, OSR’s Review of Mental Health Statistics in Northern Ireland highlighted a scarcity of robust mental health data for Northern Ireland, revealing significant data gaps. The lack of data on the outcomes of mental health services, in particular, made it impossible to answer questions people may have, such as whether services are effective in the treatment and care they provide.

These concerns were amplified in a 2023 Northern Ireland Audit Office (NIAO) report that pointed to a need for improved data around mental health services. NIAO emphasised that establishing an appropriate framework for measuring service outcomes would help support decision-making and the monitoring of services’ effectiveness. It was clear to us that addressing these concerns would be an ambitious but necessary undertaking.

Our Ambition to Improve Mental Health Data

Northern Ireland’s Mental Health Strategy 2021-2031 recognises the deficit in information on mental health care outcomes, similarly highlighting a need to “Develop a regional Outcomes Framework in collaboration with service users and professionals, to underpin and drive service development and delivery.” The strategy indicates that a Mental Health Outcomes Framework (MHOF) would support the use of evidence as the foundation for decision-making. Further, it should establish a comparable data set across the five Health and Social Care Trusts to allow us to measure performance and determine how best to improve health and social care services for the Northern Ireland public. As a result, in 2021, I was asked by the Department of Health (DoH) to lead on the development of a Mental Health Outcomes Framework for Northern Ireland. This is the story of how we developed and are implementing the framework and reflects on our progress and priorities so far.

Developing the Framework

During early engagement, I was advised by a psychiatrist colleague that for a Mental Health Outcomes Framework to be accepted and used in services, it would need to be clinically relevant to staff and meaningful to service users. So, it became clear to me that the primary focus of such a framework should be to improve outcomes for service users by supporting evidence-based clinical practice. Collecting data on overall service activity and performance, although important, is secondary to this objective.

A regional steering group comprising a range of stakeholders from across professions, mental health services, organisations and sectors was established to oversee the work. This included people with lived experience as service users and carers and representatives of both the DoH Information and Analysis Directorate and the Planning and Performance Group regional commissioner. Further input and review were provided by academics from Queen’s University Belfast and a consultant clinical psychologist with expertise in mental health data and statistics, outcome measurement and analysis and, most importantly, the practical use of outcomes measures within services.

The steering group met monthly, with a smaller working group drawn from the steering group meeting weekly to drive the work forward.

Three stakeholder workshop events were held in producing the framework: two for mental health service users and carers and one for mental health professionals drawn regionally from across service types and providers. A reference group of professional staff was also established to advise on the selection of appropriate measures for different mental health services.

The Outcomes-based Accountability approach

The Northern Ireland Government uses an outcomes-based accountability (OBA) approach to outcomes measurement, which we adopted for the framework. OBA measures are built around three core questions, which we used to inform the basic structure:

How much did we do?

We completed an audit that identified the poor quality of currently available data on community mental health services for adults and older adults. We found that the deficiency in these data relates both to the data being generated through a range of independently developed Trust IT systems and, more problematically, the variability of basic mental health service structures across Trusts. As a result, the routine collection of regionally consistent data is highly challenging.  To address these issues, we developed a “basic metrics” template to improve the quality and consistency of data on mental health activity, applying a regional reporting template to structure the information inputted by each Trust.

How well did we do it?

We considered both service users’ and carers’ experience and reflected on what is meaningful to them when they access mental health services. These measures were co-produced with service users and carers and then tested in services. Separate questionnaires were developed for service users and for carers, with adapted versions for Child and Adolescent Mental Health Services (CAMHS) and for perinatal mental health services.

Is anyone better off?

Working with clinicians from across services and Trusts, we identified a range of appropriate and validated tools for measuring clinical outcomes. These included universal measures of emotional wellness to be used across all mental health services and a portfolio of condition and service specific measures. This portfolio reflected that people access mental health care for many reasons, so services should measure outcomes across a wide range of individuals presenting with varying needs. We identified a total of 47 measures of clinical outcomes for the MHOF.

Implementing the Framework

We engaged widely with international mental health outcomes programmes, particularly in Australia and Scotland, to learn lessons and maximise the success of the framework. This helped us identify that the implementation phase is where outcome measurement initiatives are most at risk of either failing or underachieving. The plans for implementation and operation need to inform each step of the development of a mental health outcomes framework.

Most importantly, implementation must be seen as more than just a technical exercise; one of the biggest challenges to success is ensuring that mental health practitioners and services value and routinely use the outcome measures in their day-to-day practice.

Ensuring that outcomes measures are accessible and easy to use in busy clinical settings is therefore critical. It is hugely opportune, then, that Northern Ireland is currently investing in a new regional electronic patient record system: Encompass. This system will replace the various electronic patient record systems across Trusts with one that creates a single digital care record for every person in Northern Ireland receiving care. In this way, Encompass will be an important enabler to ensure successful implementation.

Where are we now?

The co-produced regional MHOF was approved for implementation by the local health minister in October 2022.

We are currently in Stage 1 of implementing the framework. Our aim, during this phase, is to embed the framework measures during the build of the Encompass system and establish the capacity to report robust regional data on mental health service activity. We plan to complete this work by early 2026.

Work is progressing, with approximately 60% of measures having now been quality assured, licence arrangements finalised, and formats digitised onto the Encompass system. We are also engaging with mental health and informatics staff across Trusts to implement basic activity metrics.

A challenge around capacity has also emerged. As the Encompass system goes live in each Trust, Encompass systems staff are confronted with competing priorities and, naturally, they prioritise supporting the clinical functionality of the new system over other development work.

What’s next?

Stage 2, which we hope to start in 2025, will involve supporting mental health professionals and services in using the outcomes measures embedded in the Encompass system during Stage 1. This will require a phased programme of engagement with teams and clinicians and regional clinical networks to select appropriate measures for similar services across Trusts and to regionally operationalise these in a robust and consistent way. It will involve training and supporting teams and services across Trusts to integrate the use of outcomes measures within their clinical practice. It will also require engagement with service users to test and evidence its acceptability and to determine how best to support them.

In Stage 2, we will explore international examples of services that have successfully implemented outcome measurement into their clinical practices. Our aim is to embed the framework with selected services to support outcomes reporting by late 2026. The second phase will also include the data collection and reporting of service user and carer experience, alongside the development of a patient portal on Encompass.

Implementing and operating an ambitious framework such as this requires dedicated resourcing. As such, a dedicated regional programme lead will be recruited, with this role expected to be filled by winter 2024/25. It is also hoped that MHOF champions will be appointed in each of the five Health and Social Care Trusts. Though these roles are still awaiting funding, they will likely be critical to the success of the framework and its adoption by services.

Funding has been secured for a Regional Business Unit, which will work with the MHOF programme lead to ensure the development of OBA reporting capacity in the Encompass system. Dashboards will report routine measurements as standard. They will also have the capability to further analyse MHOF data against other demographic and clinical information, increasing our understanding of how we can best respond to the needs of service users.

The future of outcomes measurement

Implementing outcomes measurement in mental health services is a long-term strategic commitment that requires ongoing engagement, support and practice development.

The framework’s multi-faceted approach to improving our information on mental health services will enable services to develop and adapt in a changing landscape where resources are limited and demand is growing. It will help us assess the effectiveness of services so that we can maximise their benefits, address issues such as long waiting lists and improve the quality of care and outcomes.

We expect to begin collecting statistics from the Encompass system from Spring 2026, which should address the issues around comparable data.

For further information on this work, please contact Oscar Donnelly, Chair of the Mental Health Outcomes Framework: Oscar.Donnelly11@outlook.com.


Related Links:

Review of mental health statistics in Northern Ireland

Learning lessons from statistics: My experience as an intern at OSR

When I was placed in OSR for an eight-week internship, as part of the Civil Service summer internship scheme, I didn’t know what to expect. I study English literature at Oxford University, so statistics are not a part of my day-to-day life – or so I thought. OSR has shown me that statistics are the ‘lifeblood of democracy’, informing the decisions we make every day.

OSR casework and election lessons learned

OSR’s vision is simple – statistics should serve the public good. I quickly learned that for statistics to serve the public good, they must be communicated to that end. OSR’s work is rooted in communication. The pillars of the Code of Practice – Trustworthiness, Quality and Value – ensure the statistical story is told to users transparently.

While working with the casework team, I saw the importance of communication at ground level through the user complaints and queries that OSR receives. I shadowed cases across different subject domains to better understand how and when the team intervenes when statistical claims are made.

The ‘lessons learned’ work I did, which looked at how OSR’s casework and impact during the pre-election period might inform future interventions, was a highlight of my placement. I posed questions to the election response team and led the discussion on the pre-election casework. I also reviewed web analytics, comparing spikes in views of and engagement with our ‘what to look out for’ webpages – which addressed common statistical claims about education and the health service – with media articles published on those days, highlighting how our explainers aligned with public interest. The project recommended improvements to the casework process, such as creating a list of useful data sources and using explainer statements to address commonly contested claims beyond the election period.

In summary

Across all my projects, which also included contributing to the blog-writing process with the communications team and working on a post-election survey for Heads of Profession with the research function, I’ve encountered the core spirit of collaboration that underpins OSR. No more than 50 people work here, but our outputs convey the strength of this team. Here, everyone is trustworthy, valued and working to the highest quality; OSR embodies the very code it regulates.

Commenting on conference speeches

In our latest blog, Head of Casework Elise Rohan talks about claims made during political party conferences and our expectations of producers in this period…

Every autumn, political parties in the UK host their annual conferences in what is known as ‘party conference season’. We were recently asked about our approach to intervening in speeches and statements made in these conferences under our responsibility to protect the role of statistics in public debate.

As with any concerns raised with us, our approach is guided by our interventions policy. It sets out how we use our voice to stand up for statistics, reporting publicly where we consider there is a likelihood of the public being misled on an issue of significant public interest.

We recognise that party conferences, much like election periods, require careful judgement about when to intervene. We are not moderators of political debate, and we understand that it is part of the democratic process for political parties to draw on a wide range of sources, including statistics, to persuade potential voters. Our focus is on ensuring statistics are not being misrepresented in these statements and speaking up where we identify the potential for the public to be misled.

Ahead of the 2024 UK General Election, we carried out dedicated monitoring of party manifestos, debates, speeches, and interviews given by members of political parties. While we do not take this approach to monitoring statements made during party conference season, our expectations for producers during this period remain the same.

  • We do not expect producers to respond to, or publish an ad-hoc report for, general statements made at a party conference – for example, where politicians make generalised comparisons of political parties’ track records. This would be neither proportionate nor appropriate, given that conference speeches are political and should not involve statistics producers. 
  • However, in instances where a statement makes specific reference to statistics which aren’t in the public domain, we would expect producers to follow our intelligent transparency guidance for responding to unplanned releases of data.

For those encountering these statements, the most important safeguard against being misled is to develop the skills to critically challenge what you see – and to get in touch with us if you have concerns.

Data sharing and linkage for the public good: breaking down barriers

Following the publication of OSR’s report on enabling greater data sharing and linkage for research and statistics for the public good, Head of Development Helen highlights initiatives helping to overcome barriers to data sharing, access or linkage across government. These initiatives could benefit analysts and organisations within and beyond government.

Our findings from our report on enabling greater data sharing and linkage are clear: sharing and linking of datasets is still the exception, rather than the norm.

Barriers to progress persist. These include a lack of leadership in championing a cross-government approach to data sharing, uncertainty around funding and resourcing, a lack of clarity about data access processes, and nervousness about the safety of data sharing and the social licence for data sharing and linkage. We call for leadership, including political leadership, and an approach that starts by thinking about the benefits of data sharing and linkage to society and then supports the removal of barriers to deliver these benefits in a secure way. We have reiterated this sentiment more recently, in our statement on how Government can support the statistics system to be at its best now and in the future.

Examples of initiatives enabling data sharing and linkage

Among all these firm messages, the thing I want to make sure isn’t lost is that many teams and organisations are spearheading impressive initiatives that are helping to overcome barriers. These will benefit analysts and organisations within and beyond government who want to share, access or link data. Below I highlight initiatives we heard about while researching our report, grouped under the themes of our findings.

Public engagement and social licence

It is important to secure social acceptance of data sharing and linkage. Engaging the public to track attitudes towards data sharing and linkage remains vital, so that the social licence is understood and maintained. But there can be a lack of understanding about how to carry out public engagement in a meaningful way.

  • The Public Engagement in Data Research Initiative (PEDRI) is developing good practice guidelines to help researchers and other analysts conduct public engagement in data research and statistics.
  • The Public attitudes to data and AI tracker survey, published by the Department for Science, Innovation and Technology (DSIT), provides insight into how society views data use. The 2024 survey, due to be published in December, will include specific questions on public attitudes to data sharing and linkage.
  • The Administrative Data Research UK (ADR UK) Learning Hub contains useful resources on public engagement in practice and brings together information on skills and resources for those using administrative data and data linkage.

The public’s key concern regarding data use remains data security, and it is likely that concerns around security continue to influence many people’s attitudes towards data sharing and linkage.

Good leadership, and the skills and availability of staff

At every step of the pathway to share and link data, the people involved, and their skills and expertise, are key to a project’s success or failure. A lack of awareness, prioritisation or capability among those involved in decision making and development – including senior leaders, analysts and those in data governance roles – can create barriers.

  • Programmes such as the Digital Excellence Programme, which trains civil servants in data literacy and AI, can strengthen data literacy among leaders in government.
  • The Data Science Campus, part of the Office for National Statistics, has launched One Big Thing – a new learning initiative, which in 2023 was focused on strengthening civil servants’ data skills. This course is now available for all civil servants on Civil Service Learning.
  • The Data Linkage Champion Network provides a forum for civil servants of all grades and from across government to discuss and promote data linkage.

Hesitancy around data sharing and linkage due to potential overcaution remains a major barrier. The issue is not always that government departments disagree over whether to share data. It can also be that everyone agrees to a data share in principle, but it does not advance because of the complexities of the process.

  • The Central Digital and Data Office (CDDO) has established the Data Sharing Network of Experts (DSNE) to help departments deal with questions or differences of opinion around data sharing.

Non-technical processes that govern how data sharing and linkage happens across government

When an external researcher or government analyst wishes to access data, there are several steps to follow. First, they must know what data they wish to access and where the data are held. Second, they must establish the legal route to the data. Finally, they must gain access to the data. Each of these steps can pose barriers.

  • The UK Statistics Authority (UKSA) has published an online resource answering frequently asked questions about the Digital Economy Act (DEA, 2017) Research power and the Statistics and Registration Service Act (SRSA, 2007). It covers what data can be accessed via the Research power and SRSA, accredited processing environments and who can access data and for what purposes.
  • ADR UK’s online Learning Hub includes information on the DEA (2017) for researchers wishing to access administrative data under this legislation. It has a slide deck produced by the UKSA that explains what the DEA 2017 Research power allows for and contains a visual map of the data access journey.
  • The CDDO is developing a data marketplace to improve the discoverability of data within government. The marketplace allows those within government to find out what data are held and how data can be accessed.
  • ADR UK has created a searchable public metadata catalogue that contains information about the datasets held across the ADR UK partnership. It includes a webpage for each dataset with links to information on how to access the data, as well as a description of the dataset.
  • HDR UK has created a searchable public metadata catalogue which contains information from over 850 different health-related datasets across the UK.
  • The Pan-UK Data Governance Steering Group was established by the UK Health Data Research Alliance, convened by Health Data Research UK (HDR UK), to simplify and streamline data access governance processes. The Steering Group co-developed and published Transparency Standards with HDR UK’s Public Advisory Board (PAB) to guide good practice.
  • Research Data Scotland runs a Researcher Access Service for those wishing to access public data in Scotland. It publishes a data access overview describing the stages of data access – from discovering data to receiving access.

Technical specifics of datasets, and the infrastructure to support data sharing and linkage

Technical challenges can pose significant barriers to effective and efficient data sharing and linkage, including problems presented by the recording of accurate metadata.

  • CDDO is leading efforts to help departments identify their Essential Shared Data Assets (ESDAs) – data assets that are critical from a cross-government perspective. This includes the creation and maintenance of accurate metadata, which will improve the documentation of data held by government.

Alongside initiatives that could have widespread benefits for eroding barriers, our report highlights examples of specific analytical projects and programmes that have successfully used linked data to deliver impactful analysis. These case studies offer a window on what can be achieved when data are shared or made accessible.

We want to hear from you

OSR is always delighted to hear about and champion work that demonstrates or enables effective data sharing, access and/or linkage. If you have a case study or would like to discuss our work in this area, please get in touch: regulation@statistics.gov.uk.

What do you do with a degree in Philosophy?

Our exploration of misleadingness brought us into insightful conversations with philosopher and PhD student, Kyle Adams, from the University of Waterloo. Together, we revisited and refined our understanding of misleadingness. In our latest guest blog, Kyle shares his experiences and perspectives from the past year working with us on this intriguing topic.


My name is Kyle Adams, and for the last two (and a bit) years, I have been a graduate student in the University of Waterloo’s Department of Philosophy. Even after a few years of practice, it’s still strange for me to write it out explicitly like that. I, like most people, spent much of my life with a very particular image of what a philosopher is: an old man, probably with a big bushy beard, almost certainly somewhere in Ancient Athens.

I am a man with a beard, but I’m young, and my chances of travelling back three thousand years to chat with Socrates and Plato in the agora are probably not very good. My apparent distance from the popular idea of a philosopher invites a question that I must have heard a thousand times since starting my schooling: what do you do with a degree in philosophy? It’s a question that working with the Office for Statistics Regulation (OSR) has helped me to answer.

Asking Questions

Remember our imagined philosopher from the first paragraph? When he was walking around Ancient Athens, he was probably asking a lot of questions. An essential part of any philosophical method is to question what we are presented with. Sometimes this means the evidence of our senses, and sometimes it means the nature of some important quality—wisdom, or justice. At OSR, I asked a ton of questions. Why is this definition about using statistics instead of producing them? Can we even separate the use of statistics from their production, or are the two practices too interwoven for that? What counts as an audience for the purposes of a statement? What happens if different people believe conflicting things on the basis of shared evidence? How can we avoid going beyond our remit?

Some of these questions are more obviously philosophical than others, but they all count in my books. Sometimes, working with a philosopher can feel like talking to an especially curious child: we are both equally happy to keep asking ‘Why?’ until somebody cuts us off. But despite the occasional frustration (and believe me, philosophers sometimes feel it too, when we talk to each other), offering this series of explanations is helpful.

Every question you answer, especially if it makes you sit and reflect for a while, gives you a better understanding of what you think. If you find an answer, excellent! If you don’t, even better. Now, you’ve identified something that you don’t yet understand—this might be an unsupported belief, or something you just hadn’t thought about yet, or sometimes something you’ve taken for granted that could be improved. This practice of questioning is a skill that philosophers train and develop, but it’s something that everybody can employ in their own lives, both personally and professionally.

“Test Our Thinking”

Working with OSR has been a very rewarding experience for me. When my supervisor, Jenny Saul, first introduced me to Ed Humpherson and OSR, we talked about me consulting on the working definition of ‘misleading statistics’ they had developed. Over the last few years, Jenny, along with a handful of other philosophers of language, had been working with OSR to help develop a definition of what it means for a use of statistics to be misleading. This work resulted in a thinkpiece exploring various ways that it might be helpful to think of misleading, and later, a follow-up expanding on and refining the original ideas. These thinkpieces contained the definitions that Ed and Jenny thought it would be helpful to have me consult on. “We’d like someone to test our thinking,” is a phrase that Ed employed frequently, and it became, I think, the motto of this research placement.

As the weeks went by, it turned out that OSR was doing an awful lot of thinking for me to test. That one specific consultation turned into a multi-faceted research program: I had the opportunity to consult on specific cases of potential misleadingness, to collaborate on research into the meaning of ‘serving the public good’, and ultimately, to organize a workshop for a group of philosophers to discuss new challenges posed by shifting communication media, and they even let me write this blog!

Through all these avenues of research, the original question—what it means for a use of statistics to be misleading—remained a major focus for me. Through many gradual revisions, and lots of collaboration, we identified some areas where it seemed possible to improve upon the existing work. We looked at the work we wanted a definition of misleadingness to be able to do, and concluded that it should be action-guiding for OSR. We clarified some ambiguities about what kinds of beliefs OSR should be concerned about, and how best to identify and distinguish the audiences of statements.

We also spent a lot of time addressing a shortcoming of earlier versions of the definition—how to distinguish between a use of statistics that’s misleading, and one that is more straightforwardly incorrect. For instance, if I just make up a completely wrong statistic to tell you, I have done something more than merely mislead you. Both of these cases fall within OSR’s remit, but they require us to consider different factors, and approach regulatory responsibilities in different ways. The team at OSR has been wonderfully supportive and collaborative in diving into these problems, always welcoming my questions, even those about the pickiest philosophical technicalities.

OSR has been a fantastic place for me to put my work to the test in the real world. As often as I was testing OSR’s thinking, OSR was testing mine—many of my questions were met with answers that forced me to adapt my own thoughts on the matter at hand. My work with OSR has been hugely informative for my research moving forward, and has also been quite a lot of fun!

My thanks to OSR across the board, and especially Elise, Kirsty, and Ed, for welcoming a philosopher into your midst to poke and prod at your thinking and see what happens. You have given me a lot to think about, a thoroughly enjoyable research experience, and also a brand-new, and unexpected, answer to an old question. What do you do with a degree in philosophy? Apparently, if you’re lucky enough to find the right people, you work in statistics regulation!


Related reading:

What do we think about Misleadingness? A follow up thinkpiece (coming 09 September 2024)

Whose line is it anyway? Why the misleading presentation of statistics cannot be dismissed as just a matter of opinion

“Statistics may be used to provoke and to challenge the status quo; but Ofcom is entitled to insist – in the public interest – that they should not be misused so as to mislead”. Not our words but the words of a recent High Court Judgment following an intervention by Ofcom to challenge the presentation on GB News of vaccine statistics produced by the UK Health Security Agency (UKHSA).

When concerns are raised to OSR about the communication or presentation of statistics, we are often asked to comment on whether we judge something to be misleading. Misleadingness – how to define it and what it means in our context as a statistics regulator – is something that we routinely come back to.

Being able to intervene and publicly challenge the misuse of statistics is a crucial part of meeting our statutory objective of ‘promoting and safeguarding the production and publication of official statistics that serve the public good’ under the Statistics and Registration Service Act 2007. When statistics are misused, it damages public confidence in data and those communicating the messages.

As this High Court Judgment shows, we are not alone in tackling the misuse of data and misleading communication. The High Court supported Ofcom’s right to intervene and emphasised that the misleading presentation of statistics cannot be dismissed as just a matter of opinion.

“The purpose of both the caveat on effectiveness and the contextual statement was to sound a warning against the simple and undifferentiating comparison of groups. Yet, an undifferentiating comparison was undertaken on the Show”. – High Court Judgment

This sets a valuable formal legal precedent and echoes many of OSR’s messages about the importance of pre-empting the use of statistics in ways that have the potential to mislead. In our regulatory work, we often recommend that producers highlight information on quality, caveats and context alongside the statistics to support appropriate use. But it is just as important to do these things to prevent misuse of the data – as UKHSA had done in this case. 

“We present data on COVID-19 cases, hospitalisations and deaths by vaccination status. This raw data should not be used to estimate vaccine effectiveness as the data does not take into account inherent biases present such as differences in risk, behaviour and testing in the vaccinated and unvaccinated populations.” COVID-19 vaccine surveillance report, UKHSA

The Ofcom case demonstrates why our advice and the Code of Practice for Statistics should not be thought of as a tick box exercise. Preventing misuse of data is an essential part of protecting the value and impartiality of statistics which in turn serve the public good. When communicating quality, and uncertainty in particular, it can be tempting for statistics producers to fall back on generic wording such as ‘users should exercise caution’ but as my recent blog highlights, these statements don’t go far enough in supporting use and interpretation of the statistics.

It is much harder to react quickly to misuse and debunk it once it has happened, especially in an age of social media, so effort should be made to prevent it happening in the first place. We will continue to work on identifying best practice around preventing misuse and to support producers in communicating statistics through different means, so that they can be understood by audiences with varying levels of statistical knowledge. 

A good rule of thumb is to consider how a reasonable person would interpret the statement being made and ensure that this is not likely to be misleading in the absence of additional information.