Quality under challenge: Regulating statistics and data from the Labour Force Survey

In our latest blog, Head of Assessment, Siobhan, discusses the challenges of regulating statistics and data from the Labour Force Survey.

Many concerns have been raised about the quality of the Labour Force Survey (LFS) produced by the Office for National Statistics (ONS), about the statistics derived from it, and about the challenges ONS faces in delivering the online Transformed Labour Force Survey (TLFS) that will replace it.  

So, what are we, as the Office for Statistics Regulation (OSR), doing about it? Our key interventions include: 

  • removing the LFS’s status as accredited official statistics to signal quality concerns to users 
  • where there are quality concerns, removing the accreditation of statistics based on LFS data and associated Annual Population Survey (APS) data. To date, we have removed the accreditation from 14 statistical outputs 
  • setting requirements for ONS to improve its communication and engagement, and to consider the lessons that can be learnt from the LFS 
  • reviewing ONS’s work to develop an online replacement for the LFS, the TLFS 

What is the Labour Force Survey?

The LFS is the main household survey providing labour market information in the UK. It is ONS’s largest regular household survey, outside of the census, and has been running since the 1970s.  

Statistics calculated using the LFS (and the related Annual Population Survey, also produced by ONS) are vital to understanding the world we live in. These statistics are used to estimate the number of people employed, unemployed and economically inactive across the UK, in different parts of the UK, and for different groups, such as young people, older people and those with disabilities. These statistics also inform key economic decisions, such as interest rates set by the Bank of England and government tax and spending. 

De-accrediting statistics derived from the Labour Force Survey 

OSR’s primary regulatory role is to independently review statistics to ensure that they comply with the standards of Trustworthiness, Quality and Value in the Code of Practice for Statistics. When statistics meet these standards, we give them ‘accredited official statistics’ status; when they fall short, we can remove that accreditation. We may decide on this course of action for several reasons: concerns about the quality of the data sources used to produce the statistics, evidence that user needs are not being met, or substantial changes to data sources and methods that require us to review whether the quality of the data remains suitable for their intended use.  

As we highlighted in our 2020 assessment, the response rate for the LFS has been steadily declining, and we called on ONS to address this issue and share any relevant information with users. ONS continued to develop its plans for the TLFS, which aims to address the shortcomings of the LFS. 

The long-standing response rate challenges facing the LFS were exacerbated by the COVID-19 pandemic. These issues then became acute when the boost to enable pandemic operations was removed in July 2023. Following this, ONS had to suspend publication of its estimates of UK employment, unemployment and economic inactivity based on LFS data. 

We removed the accreditation from LFS-based estimates and datasets in November 2023. We have also removed the accreditation from other outputs based on data from the APS. The APS is based on responses to wave 1 and wave 5 of the LFS plus a boost sample. 

Monitoring ONS’s work to improve the LFS

When ONS reintroduced LFS-based labour market statistics in February 2024, we carried out a short review of these statistics. In August 2024, we carried out a follow-up review to check the progress made against the requirements set out in our initial report.  

We identified four outstanding requirements, which focus on: 

  • communicating updates on both the LFS and TLFS in one place that users can easily access 
  • improving communication around the uncertainty in the data and what this means for the use of these data 
  • publishing more detailed information about the principles and quality criteria that ONS will consider in making further LFS improvements and the transition to the TLFS 
  • publishing more detailed information about ONS’s plans for improving the LFS between now and the transition to the TLFS, and for transitioning to the TLFS  

We will continue to closely monitor ONS’s work to improve the LFS and plan to report on progress against these requirements next year. 

Reviewing ONS’s work to transform the LFS 

Over the last few years, ONS has been developing a different version of the LFS using an online-first multimode approach (that is, online first, followed by telephone and use of face-to-face interviewers to encourage households to participate). 

Recognising the significance of these statistics for government and economic decision-making, we have carried out regulatory work throughout their development rather than waiting to review the final statistical outputs. The aim of this work is to share early regulatory insights to help ONS ensure the new survey meets the standards of Trustworthiness, Quality and Value, in line with the Code of Practice for Statistics.  

We have carried out our review in phases, with each focused on the most relevant elements of the Code. The aim is to assess the statistics produced from the survey against all parts of the Code once the transformed survey is fully implemented.  

The first phase (which started in April 2022) focused on the design and development work ONS had planned before transitioning to the new survey approach. We published our initial findings in November 2022. In July 2023, we published an updated letter and progress report following phase two of our review. We are now entering the third phase of this work, which focuses on ONS’s engagement with users, its communication about its planned work and how it is assessing the quality and readiness of its transformed survey. 

We plan to publish the outcome of phase three of our review of ONS’s LFS transformation in early 2025. 

What’s next? 

A key theme throughout our regulatory work on the LFS and TLFS has been the need for improved communication and clarity, specifically around plans, sources of uncertainty and quality criteria for the transformed LFS. We also called on ONS to identify what lessons can be learnt from the LFS to more effectively and transparently manage and pre-empt quality issues in the future. So, we were pleased to see the comprehensive communication of ONS’s plans and activities in its letter to the Treasury Select Committee and in the update it published on 3 December. We also welcome its engagement through the stakeholder panel and the external methodological input it has sought from Professor Ray Chambers and Professor James Brown.  

We recognise there is still more to do. We will continue urging ONS to provide regular, clear, and comprehensive updates for users, as well as to seek challenge and input from key users and experts to ensure the future production of statistics that meet user needs. 

We are also working to understand to what extent response issues are impacting other household surveys used across the statistics landscape. We have asked ONS to consider whether the issues, concerns, and lessons it has learnt from the LFS apply to its other surveys. 

We will also carry out our own lessons learnt review, focusing on how we can best apply the Code of Practice as a tool to support transformation activities.

The Power of Conversation

In our latest blog, Head of OSR Ed Humpherson discusses our consultation on a revised Code of Practice, which is open for comments until 14 February 2025. Read more about the consultation and how to have your say here. 

I have been asking myself why it is only now that I am writing a blog on our consultation on a revised Code of Practice, several weeks after its launch.

The consultation is big news for OSR and for the UK statistical system: the Code is our foundational set of principles, our conceptual framework, our guiding light. And it’s not as if we are proposing some mere tidying-up measures, the sort of pruning and weeding that a good gardener does to maintain their garden. We are proposing some significant landscaping changes – particularly to the structure and presentation of the Code.

Perhaps the answer comes down to my observation that most endeavours in the world of statistical regulation depend on, and are enriched by, conversation. OSR’s best work – our most impactful reports and interventions – is effective because of our engagement and interaction with users of statistics, both expert and not, and with the people who produce the statistics.

To give two examples: first, our annual state of the statistics system report is not dreamt up by us in a meeting room; it builds on a whole series of conversations across the statistical system, both with users and producers. Second, our assessments of individual statistics draw heavily on engagement with users; take a look at our recent assessment of ONS’s Price Index of Private Rents to see this in action.

Launching a consultation is not an end point in itself. It is an invitation to other people to share their thoughts, reflections and criticisms.

Moreover, the part of the work I particularly enjoy is not the sense of achievement on the day of publication. It’s hearing all the subsequent reactions and comments, and joining in the discussions that ensue.

That’s why I was so happy last week to participate in a joint event between OSR and the Royal Statistical Society (RSS) to discuss the new proposed Code. We heard a range of interesting and thought-provoking reactions, such as those from Paul Allin, Honorary Officer of the RSS for National Statistics, on the importance of recognising the public role of statistics; from Ken Roy, an independent researcher and former head of profession in a government department, who highlighted that the Code is the glue that holds together the large and complex UK statistics system; and from Deana Leadbeter, Chair of the RSS Health Statistics User Group, who welcomed the ambition of a more digestible Code for a wider audience. And we had some excellent questions from the audience on topics ranging from the limits to trustworthiness (from a colleague in the Hungarian national statistical institute) to the importance of simplicity.

These productive conversations are why I’m looking forward to the debates and dialogues around the new Code in the coming months – including those with the Market Research Society and the Forum of Statistics User Groups.

I want to hear people’s reactions to the new Code, including their views on:

And I want to hear a wide range of other thoughts – not just about the things that we want to highlight, like those three bullets above – but the things we have missed.

This emphasis on engagement and conversation is not only a core value for OSR. It’s also central to the Code of Practice itself. The new Code that we are proposing sets even clearer and firmer requirements for statistics producers in how they should engage with their users and transparently communicate how they have produced their statistics, and what their statistics do (and don’t) mean.

So, back to the question at hand: why didn’t I write this blog until now? It’s this: for me, the day the consultation is published is not always the best day to publish a blog. Instead, it can be better to wait until we’ve started to hear some views. Or, to put it simply: communication shouldn’t be about broadcasting a fixed view. Instead, it’s all about the power of conversation.

Read more about the Code consultation and how to have your say here. 

[1] What is it with me and gardens? I used to do a presentation all about walled gardens – how official statistics can’t be a walled garden, pristine but closed off from the world. They need to be open and accessible. Now, as then, I reach for a garden metaphor. It can’t be that I use these gardening analogies because I myself am an adept and successful gardener. I mean, you should just look at my own garden to realise that.

 

The Power of Public Engagement in Shaping Our Work 

In our latest blog post, OSR Research Officer Nick discusses how public engagement shapes our work, specifically in relation to our research project about how individual members of the public use official statistics to make decisions that relate to their personal lives. This OSR research is delivered by the Policy Institute at King’s College London and the Behavioural Insights Team and will be published in January (as discussed in a previous blog, “How do we use statistics in everyday life?”). 

Here at the Office for Statistics Regulation (OSR), we are increasingly recognising that involving the public in our work is crucial for ensuring statistics truly serve the public good. Because of that, we have begun exploring public engagement.  

Public engagement is important and impactful  

For us, public engagement is the process of involving and collaborating with those outside of government and statistics to share information, gather input and foster meaningful dialogue on issues that affect them. Public engagement is about more than just asking people for their opinions. It’s about making sure that the work we do is grounded in real experiences and needs. By involving the public, we can build on our own technical expertise to make sure our work is meaningful and useful to the very people we aim to serve. 

OSR has begun engaging the public on our research 

Recently, we spoke to a small group of people to find out what they thought of different aspects of our current research project exploring how people use statistics to make personal decisions. These discussions have shown us just how valuable public engagement can be in shaping our research to be more relevant and acceptable to the public. 

In our public engagement sessions, we invited members of the public to share their thoughts on different aspects of our research proposal. We met with them virtually multiple times over the course of the project, so that we could tell them how their advice had helped us and continue to seek their views. This included discussing how they felt about the life decisions we wanted to explore in our research, such as choosing a school for a child. We also asked for their feedback on survey questions we plan to use in the next stage of our research. While this was a light-touch approach to public engagement (a small group that met online for only an hour each time), we still got a lot from these discussions.  

We found this approach very valuable 

Public engagement is a powerful tool that enriches our work and fosters a collaborative spirit between OSR and the public. While this is relatively new for OSR, our recent sessions have demonstrated the real value of this approach. From this experience of public engagement, we have three main reflections: 

  1. People wanted to be part of these discussions 
  2. Public contributors are full of useful suggestions 
  3. It is important to seek diverse perspectives 

People wanted to be part of these discussions  

Discussions about statistics and research have the potential to be dry. However, when we spoke to members of the public, they were enthusiastic about the research and appreciated the opportunity to contribute. For example, one of our public contributors said: 

“If someone said to me, you are going to be involved in some statistics research, it seems like whether I will be able to do that? As a lay member of  the public, maybe I don’t know?.. but it is easy… I would like to be involved in future opportunities.”

This quote shows how important it is to give people the opportunity to be involved in these types of discussions. 

Public contributors are full of useful suggestions 

Attendees provided valuable feedback on various ways to improve our research, which has already informed several key decisions in our project. For example, we originally planned to ask people about whether or not they used official statistics in choosing which school to send their child to. However, public contributors raised that not everyone actually gets to choose a school for their child: 

“You might be able to have a parent that will look and say this is the school I want my child to go to based on their needs, based on the statistics. But the actual chances of them getting that school is very slim. So there may be people that feel that, yes, I can look the statistics, but then I have an inability to choose.” 

Because of this advice, we changed the decision to be about which school to apply to, rather than which school to send a child to. This type of change helped our research be more relevant to members of the public. 

It is important to seek diverse perspectives  

Engaging with a diverse group of attendees allowed us to gather a wide range of viewpoints. For example, we heard from people whose first language was not English on how the statistics in the research could best be presented in a way that they could understand:  

“[Statistics are] in the English and I have language barrier, so how can I like utilise that… In the bar diagram it’s ‘OK this bar is high and this bar is low’ so it’s easy to understand.” 

This perspective led to sharing visual aids with participants in the first stage of our research rather than solely presenting a traditional statistical bulletin made up of prose. Doing so made our research more accessible to a broader audience, and allowed a wider range of participants to engage meaningfully. 

We plan to use public engagement more going forward 

The success of these public engagement sessions has reinforced our commitment to involving the public in OSR’s work. As part of this commitment, we are now part of the Public Engagement in Data Research Initiative (PEDRI). In addition, future research projects at OSR will build on the feedback received in this project, and we hope to undertake public engagement in key projects beyond research as well. In doing so, we will be aligning with our view that serving the public good means treating official statistics as public assets; this involves allowing the public to understand and engage with our work. Through public engagement about our work at OSR, we can ensure it is more transparent, trustworthy and responsive to the needs of the public. 

If you would like to learn more or even become a public contributor yourself, please contact us at research.function@statistics.gov.uk. We look forward to hearing from you.  

 

Just three words – the birth of a code

In our latest blog, Penny Babb, OSR’s Head of Policy and Standards, speaks about her experience with the Kenya National Bureau of Statistics in developing the Kenya Statistics Code of Practice

I am a self-confessed code-geek. I love talking about the Code of Practice for Statistics! I may be a tad boring at parties, but I have a passion for the Code that is hard to miss (and I hope is infectious).  

What better, then, than to get together with a group of people, away from the distractions of work (well, mostly), to talk about codes for a week! There truly was no escape for my audience as we met on the shores of Lake Naivasha in Kenya.  

It was such a privilege to work with the Director General and statisticians of the Kenya National Bureau of Statistics (KNBS), to guide them through the preparations of their own code of practice. This collaboration was part of the strategic partnership between the UK Statistics Authority (UKSA) and KNBS. It wasn’t about the UK Code, but an opportunity to share our experiences and to support KNBS as they wrote their first code – finding the message that they wished to share with their organisations and analysts across the statistical system to inspire good practice. KNBS is committed to ensuring official statistics remain a valuable resource for all stakeholders in the statistical system in Kenya and to demonstrating international best practice. It sees the code as being central to achieving these goals.  

One of the lessons I took from being involved in developing the second edition of the UK Code was the power of explaining it simply, in just three words. And as we have seen in our own recent review of the Code, our three words – trustworthiness, quality and value – are extremely effective in helping those who produce and use official statistics to understand what matters. My goal was to help my Kenyan colleagues find their own three (or more!) words. 

The dedication and skill of the team of statisticians and leaders at KNBS was evident and central to completing their task of writing their code. The Kenyan code had to be one that was meaningful and inspiring in their context – not just to staff in KNBS, the national statistical institute for Kenya, but for all other organisations in their statistical system.  

In two week-long workshops, Caroline Wangeci (the then ONS strategic adviser to KNBS) and I led sessions to develop the Kenyan code of practice. Our codes are both grounded on the UN Fundamental Principles for Official Statistics, but each speaks to specific needs within our countries. 

The commitment and dedication of the Kenyan team was inspiring to witness and humbling to support. Writing a code of practice to set standards for statistical production and dissemination is challenging. But doing so in concentrated bursts of effort, away from day-to-day responsibilities, gave the team focused time to think through the key values and messages needed to inspire those involved in producing official statistics. After spending a day diving deep into code matters, many of the participants then spent further hours catching up on their work, ensuring the smooth running of their statistical teams.  

KNBS has published the Kenya Statistics Code of Practice (KeSCoP). It is a remarkable achievement in its clarity and mission. As the Code highlights,  

“KeSCoP is anchored on three pillars, namely, Quality, Trustworthiness and Progressiveness.  Each pillar contains values and commitments that producers of statistics should commit to when producing and disseminating statistics.” [page 1, KeSCoP] 

KeSCoP is drawn from the principles and requirements of the Kenya Statistical Quality Assurance Framework (KeSQAF). The Code supports the drive for continuous improvement in statistical practice.  

“Compliance with KeSCoP gives producers of official statistics and the users confidence that published statistics are of high quality, have public trust and are produced by institutions and people that are progressive.” [page 2, KeSCoP] 

It echoes our UK Code but in a truly Kenyan way – they have defined their concepts and practices in ways that speak to their application of the UN Fundamental Principles for Official Statistics.  

Quality has values for:

  • relevance 
  • timeliness and punctuality 
  • accuracy 
  • coverage/comprehensiveness 
  • methodological soundness 
  • professional skills and competencies 
  • data sources 

Trustworthiness has values for:

  • impartiality and equal access 
  • confidentiality 
  • accountability and transparency 
  • integrity and credibility 

Progressiveness has values for:

  • innovation 
  • continuous improvement 
  • user focus 
  • adequate funding 

I love the clarity of each value and commitment, providing a clear steer for any producer on what it looks like to deliver official statistics to the required standard, with associated indicators or measures of success.  

We have much that we can learn from KeSCoP as we refresh the UK Code and seek to ensure its clear communication and accessibility. 

To find out more about UKSA’s international development work, see ONS’s published strategic plan for 2022–2025. 

 

The importance of separation: Ed Humpherson addresses questions raised by the Lievesley review

In our latest blog, Director General for OSR, Ed Humpherson, speaks about how OSR’s separation from ONS is communicated

Since Professor Denise Lievesley published her review of the UK Statistics Authority, most attention has been on the Statistics Assembly. The Lievesley review considered the work of the UK Statistics Authority, and its first recommendation was that the Authority should hold a Statistics Assembly every three years. The Assembly should elicit and explore the priorities for the statistical system with a wide range of users and stakeholders. And the first Assembly will present a fantastic opportunity to have a wide-ranging conversation about statistics and enable a discussion about the strengths and limitations of the current statistical landscape in the United Kingdom. So, it’s not surprising that this recommendation has generated a lot of interest. 

The Lievesley review raised other important issues. Some of these issues relate to OSR. In particular, she highlighted that OSR needs to improve how it communicates the separateness of its role from ONS.  

Separation matters to us. Indeed, when we look at official statistics, we start by considering the trustworthiness of the governance processes used by the statistics producers – by which we mean the mechanisms in place to ensure and demonstrate that the statistics are free from the vested interest of the organisation that produces them – that they are the best professional judgement of the statisticians. 

Similarly, it’s important that our decisions reflect our best professional judgement and that, in relation to ONS, we can make those judgements without giving any weight to ONS’s own organisational interests. 

We have several arrangements in place to secure our separation from ONS. But if people don’t appreciate or understand them, they are not working. And the Lievesley review made clear that we need to better explain the processes that assure this separation to our stakeholders and the public. That’s why today we are publishing a statement that sets out, in formal terms, the arrangements that underpin our separation from ONS. 

The key features of the statement are: 

  • a summary of the governance arrangements, under which we report our work to a separate Committee of the Authority Board made up only of non-executive members and me – that is, with no membership role for ONS staff. 
  • a summary of the arrangements for setting strategy and budget, which involve me and my team making our proposals directly to this Committee and to the Board, with no decision-making role for ONS staff 
  • a confirmation of my own personal reporting lines, which mean that I do not report to the National Statistician in any way, but directly to the Authority’s Chair; and that I have regular meetings with the Chair without the National Statistician or any senior ONS staff attending 

Please read the statement if you’d like to learn more about these arrangements. 

But I’ll close with Denise’s own words. The most important features of any governance set-up are budget and performance management, and on this she was clear:  

“The existence of a small number of misunderstandings by users also appear to perpetuate, such as that the Director General for Regulation is line managed by the National Statistician (he is not) or that the National Statistician controls the budget of the OSR (he does not). Nor does the National Statistician attend Regulation Committee meetings.” 

I hope the statement we are publishing today helps provide some reassurance and address the issues of perception identified by Denise’s review.  

Embedding the habit of intelligent transparency

In our latest blog, Director General for OSR, Ed Humpherson, looks at the importance of intelligent transparency for Governments across the UK.

Intelligent transparency is one of the most important sets of principles that we uphold. When statistics and quantitative claims are used in public debate, they should enhance understanding of the topics being debated and not be used in a way that has the potential to mislead. To help those making use of data in public debate, we have set out our three underpinning principles of intelligent transparency. These principles demand that statistical claims and statements are based on data to which everyone has equal access, are clearly and transparently defined, and are accompanied by appropriate acknowledgement of any uncertainties and relevant context.

We have promoted intelligent transparency to the UK Government and the Governments of Scotland, Wales and Northern Ireland. And the Chair of the UK Statistics Authority, Sir Robert Chote, set it out clearly in his letter to political party leaders ahead of the 2024 general election. We have also made a number of interventions to support the principle of equal access.

 

Intelligent transparency in conference speeches

Equal access means that all statements involving statistics or data must be based on publicly available data, preferably the latest available official statistics. Claims should not be made based on data to which ministers have privileged access, as this prevents the claims from being scrutinised and undermines confidence in official statistics.

We recognise that conference speeches by Ministers in particular can be difficult, as noted in this blog we published in September. Ministers want to highlight policy successes to their party members, but they do not have the input of the civil servants who would normally ensure that their statements use statistics and data appropriately.

In this context, we were recently made aware of a statement made by the Prime Minister, Sir Keir Starmer at the Labour Party Conference regarding immigration returns. The claim in question was that there has been “a 23 per cent increase in returns of people who have no right to be here, compared with last summer”. At the time the Prime Minister made this claim there was no Home Office data or statistics available in the public domain for the relevant time period to support this statement.

 

The importance of publishing data used in the public domain

Following the statement made by the Prime Minister, we engaged with the Home Office and we welcome the ad-hoc statistical release published by the Home Office, which provides the underlying data that relate to this statement. In most cases we would want to see the release of unpublished data very soon after the statement itself. But we do understand that sometimes, as in this case, providing the underlying data in a usable format may take longer. We consider that the approach taken by the Home Office is in line with the principles of intelligent transparency. It is a good example of how to resolve a situation when unpublished information has found its way into a speech.

 

Working with Government to ensure best practice

We are using this example as an opportunity to re-emphasise the importance of intelligent transparency to a range of audiences, including Heads of Profession for Statistics, the wider Government Analysis Function and the Government Communications Service. We have also discussed the issue with officials in Number 10, who have confirmed the importance they attach to appropriate use of data. We are happy to provide support and advice to departments on the principles of intelligent transparency for analysts, communication professionals, special advisers and any other colleagues who may benefit. To discuss this with us, please contact us via regulation@statistics.gov.uk.

For all Governments, it is important to avoid making statements in the public domain that rely on unpublished information. We are always happy to work with officials and advisers to embed good habits of intelligent transparency as fully as possible.

 

Improving mental health services in Northern Ireland: The Regional Mental Health Outcomes Framework

In our latest guest blog, Oscar Donnelly, Lead for the Mental Health Outcomes Framework, discusses the outcomes-based approach to collect standardised mental health data and statistics using the new patient record system ‘Encompass’

In 2021, OSR’s Review of Mental Health Statistics in Northern Ireland highlighted a scarcity of robust mental health data for Northern Ireland, revealing significant data gaps. The lack of data on the outcomes of mental health services, in particular, made it impossible to answer questions people may have, such as whether services are effective in the treatment and care they provide.

These concerns were amplified in a 2023 Northern Ireland Audit Office (NIAO) report that pointed to a need for improved data around mental health services. NIAO emphasised that establishing an appropriate framework for measuring service outcomes would help support decision-making and the monitoring of services’ effectiveness. It was clear to us that addressing these concerns would be an ambitious but necessary undertaking.

Our Ambition to Improve Mental Health Data

Northern Ireland’s Mental Health Strategy 2021-2031 recognises the deficit in information on mental health care outcomes, similarly highlighting a need to “Develop a regional Outcomes Framework in collaboration with service users and professionals, to underpin and drive service development and delivery.” The strategy indicates that a Mental Health Outcomes Framework (MHOF) would support the use of evidence as the foundation for decision-making. Further, it should establish a comparable data set across the five Health and Social Care Trusts to allow us to measure performance and determine how best to improve health and social care services for the Northern Ireland public. As a result, in 2021, I was asked by the Department of Health (DoH) to lead the development of a Mental Health Outcomes Framework for Northern Ireland. This blog tells the story of how we developed and are implementing the framework, and reflects on our progress and priorities so far.

Developing the Framework

During early engagement, I was advised by a psychiatrist colleague that for a Mental Health Outcomes Framework to be accepted and used in services, it would need to be clinically relevant to staff and meaningful to service users. So, it became clear to me that the primary focus of such a framework should be to improve outcomes for service users by supporting evidence-based clinical practice. Collecting data on overall service activity and performance, although important, is secondary to this objective.

A regional steering group comprising a range of stakeholders from across professions, mental health services, organisations and sectors was established to oversee the work. This included people with lived experience as service users and carers and representatives of both the DoH Information and Analysis Directorate and the Planning and Performance Group regional commissioner. Further input and review were provided by academics from Queen’s University Belfast and a consultant clinical psychologist with expertise in mental health data and statistics, outcome measurement and analysis and, most importantly, the practical use of outcomes measures within services.

The steering group met monthly, with a smaller working group drawn from the steering group meeting weekly to drive the work forward.

Three stakeholder workshop events were held in producing the framework: two for mental health service users and carers and one for mental health professionals drawn regionally from across service types and providers. A reference group of professional staff was also established to advise on the selection of appropriate measures for different mental health services.

The Outcomes-based Accountability approach

The Northern Ireland Government uses an outcomes-based accountability (OBA) approach to outcomes measurement, which we adopted for the framework. OBA measures are built around three core questions, which we used to inform the basic structure:

How much did we do?

We completed an audit that identified the poor quality of currently available data on community mental health services for adults and older adults. We found that the deficiency in these data relates both to the data being generated through a range of independently developed Trust IT systems and, more problematically, to the variability of basic mental health service structures across Trusts. As a result, the routine collection of regionally consistent data is highly challenging. To address these issues, we developed a “basic metrics” template to improve the quality and consistency of data on mental health activity, applying a regional reporting template to structure the information inputted by each Trust.

How well did we do it?

We considered both service users’ and carers’ experience and reflected on what is meaningful to them when they access mental health services. These measures were co-produced with service users and carers and then tested in services. Separate questionnaires were developed for service users and for carers, with adapted versions for each of the Child and Adolescent Mental Health Services (CAMHS) and perinatal mental health services.

Is anyone better off?

Working with clinicians from across services and Trusts, we identified a range of appropriate and validated tools for measuring clinical outcomes. These included universal measures of emotional wellness to be used across all mental health services and a portfolio of condition- and service-specific measures. This portfolio reflected that people access mental health care for many reasons, so services should measure outcomes across a wide range of individuals presenting with varying needs. We identified a total of 47 measures of clinical outcomes for the MHOF.

Implementing the Framework

We engaged widely with international mental health outcomes programmes, particularly in Australia and Scotland, to learn lessons and maximise the success of the framework. This helped us identify that the implementation phase is where outcome measurement initiatives are most at risk of either failing or underachieving. The plans for implementation and operation need to inform each step of the development of a mental health outcomes framework.

Most importantly, implementation must be seen as more than just a technical exercise; one of the biggest challenges to success is ensuring that mental health practitioners and services value and routinely use the outcome measures in their day-to-day practice.

Ensuring that outcomes measures are accessible and easy to use in busy clinical settings is therefore critical. It is hugely opportune, then, that Northern Ireland is currently investing in a new regional electronic patient record system: Encompass. This system will replace the various electronic patient record systems across Trusts with one that creates a single digital care record for every person in Northern Ireland receiving care. In this way, Encompass will be an important enabler to ensure successful implementation.

Where are we now?

The co-produced regional MHOF was approved for implementation by the local health minister in October 2022.

We are currently in Stage 1 of implementing the framework. Our aim, during this phase, is to embed the framework measures during the build of the Encompass system and establish the capacity to report robust regional data on mental health service activity. We plan to complete this work by early 2026.

Work is progressing, with approximately 60% of measures now quality assured, licence arrangements in place, and formats digitised onto the Encompass system. We are also engaging with mental health and informatics staff across Trusts to implement basic activity metrics.

A challenge around capacity has also emerged. As the Encompass system starts to go live in each Trust, Encompass systems staff will be confronted with competing priorities and will naturally prioritise supporting the clinical functionality of the new system over other development work.

What’s next?

Stage 2, which we hope to start in 2025, will involve supporting mental health professionals and services in using the outcomes measures embedded in the Encompass system during Stage 1. This will require a phased programme of engagement with teams and clinicians and regional clinical networks to select appropriate measures for similar services across Trusts and to regionally operationalise these in a robust and consistent way. It will involve training and supporting teams and services across Trusts to integrate the use of outcomes measures within their clinical practice. It will also require engagement with service users to test and evidence its acceptability and to determine how best to support them.

In Stage 2, we will explore international examples of services that have successfully implemented outcome measurement into their clinical practices. Our aim is to embed the framework with selected services to support outcomes reporting by late 2026. The second phase will also include the data collection and reporting of service user and carer experience, alongside the development of a patient portal on Encompass.

Implementing and operating an ambitious framework such as this requires dedicated resourcing. As such, a dedicated regional programme lead will be recruited, with this role expected to be filled by winter 2024/25. It is also hoped that MHOF champions will be appointed in each of the five Health and Social Care Trusts. Though these roles are still awaiting funding, they will likely be critical to the success of the framework and its adoption by services.

Funding has been secured for a Regional Business Unit, which will work with the MHOF programme lead to ensure the development of the OBA reporting capacity in the Encompass system. Dashboards will report routine measurements as standard. They will also have the capability to further analyse MHOF data against other demographic and clinical information, increasing our understanding of how we can best respond to the needs of service users.

The future of outcomes measurement

Implementing outcomes measurement in mental health services is a long-term strategic commitment that requires ongoing engagement, support and practice development.

The framework’s multi-faceted approach to improving our information on mental health services will enable services to develop and adapt in a changing landscape where resources are limited and demand is growing. It will help us assess the effectiveness of services so that we can maximise their benefits, address issues such as long waiting lists and improve the quality of care and outcomes.

We expect to begin collecting statistics from the Encompass system from Spring 2026, which should address the issues around comparable data.

For further information on this work, please contact Oscar Donnelly, Chair of the Mental Health Outcomes Framework: Oscar.Donnelly11@outlook.com.


Related Links:

Review of mental health statistics in Northern Ireland

Learning lessons from statistics: My experience as an intern at OSR

When I was placed in OSR for an eight-week internship, as part of the Civil Service summer internship scheme, I didn’t know what to expect. I study English literature at Oxford University, so statistics are not a part of my day-to-day life – or so I thought. OSR has shown me that statistics are the ‘lifeblood of democracy’, informing the decisions we make every day.

OSR casework and election lessons learned

OSR’s vision is simple – statistics should serve the public good. I quickly learned that for statistics to serve the public good, they must be communicated to that end. OSR’s work is rooted in communication. The pillars of the Code of Practice – Trustworthiness, Quality and Value – ensure the statistical story is told to users transparently.

While working with the casework team, I saw the importance of communication at ground level from the user complaints and queries that OSR receives. I shadowed cases across different subject domains to better understand how and when OSR intervenes when statistical claims are made.

The ‘lessons learned’ work I did, which looked at how OSR’s casework and impact during the pre-election period might inform future interventions, was a highlight of my placement. I posed questions to the election response team and led the discussion on the pre-election casework. I also reviewed web analytics and compared user view and engagement spikes on our ‘what to look out for’ webpages, which addressed common statistical claims on education or the health service, with media articles that came out on those days, highlighting the alignment of our explainers with public interest. The project recommended improvements to the casework process, such as creating a list of useful data sources and using explainer statements to address commonly contested claims beyond the election period.

In summary

Across all my projects, which also included contributing to the blog-writing process with the communications team and working on a post-election survey for Heads of Profession with the research function, I’ve encountered the core spirit of collaboration that underpins OSR. No more than 50 people work here, but our outputs convey the strength of this team. Here, everyone is trustworthy, valued and working to the highest quality; OSR embodies the very code it regulates.

Commenting on conference speeches

In our latest blog, Head of Casework, Elise Rohan, talks about claims made during political party conferences and our expectations of producers in this period…

Every autumn, political parties in the UK host their annual conferences in what is known as ‘party conference season’. We were recently asked about our approach to intervening in speeches and statements made in these conferences under our responsibility to protect the role of statistics in public debate.

As with any concerns raised with us, our approach is guided by our interventions policy. It sets out how we use our voice to stand up for statistics, reporting publicly where we consider there is a likelihood of the public being misled on an issue of significant public interest.

We recognise that party conferences, much like election periods, require careful judgement about when to intervene. We are not moderators of political debate, and we understand that it is part of the democratic process for political parties to draw on a wide range of sources, including statistics, to persuade potential voters. Our focus is on ensuring statistics are not being misrepresented in these statements and speaking up where we identify the potential for the public to be misled.

Ahead of the 2024 UK General Election, we carried out dedicated monitoring of party manifestos, debates, speeches, and interviews given by members of political parties. While we do not take this approach to monitoring statements made during party conference season, our expectations for producers during this period remain the same.

  • We do not expect producers to respond to, or publish an ad-hoc report for, general statements made at a party conference, for example where politicians make generalised comparisons of the track records of political parties. This would be neither proportionate nor appropriate, given that conference speeches are political and should not involve statistics producers.
  • However, in instances where a statement makes specific reference to statistics which aren’t in the public domain, we would expect producers to follow our intelligent transparency guidance for responding to unplanned releases of data.

For those seeing these statements, the most important protection against being misled is to develop the skills to critically challenge what you see, and to get in touch with us if you have concerns.

Data sharing and linkage for the public good: breaking down barriers

Following the publication of OSR’s report on enabling greater data sharing and linkage for research and statistics for the public good, Head of Development Helen highlights initiatives helping to overcome barriers to data sharing, access or linkage across government. These initiatives could benefit analysts and organisations within and beyond government.

The findings of our report on enabling greater data sharing and linkage are clear: sharing and linking datasets is still the exception, rather than the norm.

Barriers to progress persist. These include a lack of leadership in championing a cross-government approach to data sharing, uncertainty around funding and resourcing, a lack of clarity about data access processes, and nervousness about the safety of data sharing and social licence for data sharing and linkage. We call for leadership, including political leadership, and an approach that starts with thinking about the benefits of data sharing and linkage to society, then supporting the removal of barriers to deliver these benefits, in a secure way. We have reiterated this sentiment more recently, in our statement on how Government can support the statistics system to be at its best now and in the future.

Examples of initiatives enabling data sharing and linkage

Among all these firm messages, the thing I want to make sure isn’t lost is that many teams and organisations are spearheading impressive initiatives that are helping to overcome barriers. These will benefit analysts and organisations within and beyond government who want to share, access or link data. Below I highlight initiatives we heard about while researching our report, grouped under the themes of our findings.

Public engagement and social licence

It is important to obtain social acceptability for data sharing and linkage. Engaging the public to track attitudes towards data sharing and linkage remains vital, so that the social licence is understood and maintained. But there can be a lack of understanding about how to carry out public engagement in a meaningful way.

  • The Public Engagement in Data Research Initiative (PEDRI) is developing good practice guidelines to help researchers and other analysts conduct public engagement in data research and statistics.
  • The Public attitudes to data and AI tracker survey, published by the Department for Science, Innovation and Technology (DSIT), provides insight into how society views data use. The 2024 survey, due to be published in December, will include specific questions on public attitudes to data sharing and linkage.
  • The Administrative Data Research UK (ADR UK) Learning Hub contains useful resources on public engagement in practice and brings together information on skills and resources for those using administrative data and data linkage.

The public’s key concern regarding data use remains data security, and it is likely that many people’s attitudes towards data sharing and linkage continue to be influenced by this concern.

Good leadership, and the skills and availability of staff

At every step of the pathway to share and link data, the people involved, and their skills and expertise, are key to a project’s success or failure. Gaps in the awareness, priorities and capability of those involved in decision making and development, including senior leaders, analysts and those in data governance roles, can create barriers.

  • Programmes like the Digital Excellence Programme, which trains civil servants in data literacy and AI, can strengthen data literacy among leaders in government.
  • The Data Science Campus, part of the Office for National Statistics, has launched One Big Thing – a new learning initiative, which in 2023 was focused on strengthening civil servants’ data skills. This course is now available for all civil servants on Civil Service Learning.
  • The Data Linkage Champion Network provides a forum for civil servants of all grades and from across government to discuss and promote data linkage.

Hesitancy around data sharing and linkage due to potential overcaution remains a major barrier. The issue is not always that government departments disagree over whether to share data. It can also be that everyone agrees to a data share in principle, but it does not advance because of the complexities of the process.

  • The Central Digital and Data Office (CDDO) has established the Data Sharing Network of Experts (DSNE) to help departments deal with questions or differences of opinion around data sharing.

Non-technical processes that govern how data sharing and linkage happens across government

When an external researcher or government analyst wishes to access data, there are several steps to follow. First, they must identify the data they wish to access and where they are held. Second, they must establish the legal route to the data. Finally, they must gain access to the data. Each of these steps can pose barriers.

  • The UK Statistics Authority (UKSA) has published an online resource answering frequently asked questions about the Digital Economy Act (DEA, 2017) Research power and the Statistics and Registration Service Act (SRSA, 2007). It covers what data can be accessed via the Research power and SRSA, accredited processing environments and who can access data and for what purposes.
  • ADR UK’s online Learning Hub includes information on the DEA (2017) for researchers wishing to access administrative data under this legislation. It has a slide deck produced by the UKSA that explains what the DEA 2017 Research power allows for and contains a visual map of the data access journey.
  • The CDDO is developing a data marketplace to improve the discoverability of data within government. The marketplace allows those within government to find out what data are held and how data can be accessed.
  • ADR UK has created a searchable public metadata catalogue that contains information about the datasets held across the ADR UK partnership. It includes a webpage for each dataset with links to information on how to access the data, as well as a description of the dataset.
  • Health Data Research UK (HDR UK) has created a searchable public metadata catalogue which contains information from over 850 different health-related datasets across the UK.
  • The Pan-UK Data Governance Steering Group was established by the UK Health Data Research Alliance, convened by HDR UK, to simplify and streamline data access governance processes. The Steering Group co-developed and published Transparency Standards with HDR UK’s Public Advisory Board (PAB) to guide good practice.
  • Research Data Scotland runs a Researcher Access Service for those wishing to access public data in Scotland. It publishes a data access overview describing the stages of data access – from discovering data to receiving access.

Technical specifics of datasets, and the infrastructure to support data sharing and linkage

Technical challenges can pose significant barriers to effective and efficient data sharing and linkage, including problems presented by the recording of accurate metadata.

  • CDDO is leading efforts to help departments identify their Essential Shared Data Assets (ESDAs) – data assets that are critical from a cross-government perspective. This includes the creation and maintenance of accurate metadata, which will improve the documentation of data held by government.

Alongside initiatives that could have widespread benefits for eroding barriers, our report highlights examples of specific analytical projects and programmes that have successfully used linked data to deliver impactful analysis. These case studies offer a window on what can be achieved when data are shared or made accessible.

We want to hear from you

OSR is always delighted to hear about and champion work that demonstrates or enables effective data sharing, access and/or linkage. If you have a case study or would like to discuss our work in this area, please get in touch: regulation@statistics.gov.uk.