2023/24 casework themes

Securing greater impact through partnerships

The essence of what we do is forming judgements about the way statistics have been presented and used in the public domain. However, we recognise that the format of our public interventions (that is, formal correspondence published on our website) may sometimes limit the audiences who become aware of our judgements.

Members of the public remain the largest source of casework, accounting for 55% of cases in 2023/24. In these cases, the member of the public who raised the case learns of our judgement through our direct response to them. This year, however, we have seen an increase in cases referred to us by the media, from 5% in 2022/23 to 10% in 2023/24, and these cases automatically gain wider attention through the media outlet’s reporting of our judgement.

We have been exploring ways we might magnify our impact through working in partnership with others.

In last year’s annual review of casework, we noted how topics being discussed in mainstream and social media were influencing the topics raised with OSR. This year, we have sought to respond to this trend by building greater links with the media.

The case study below demonstrates the value of OSR working with the media to reach a wider audience on issues that are of high public interest – particularly where additional clarification is needed to understand the validity of a statement.

Case study – Asylum backlog figures

In January 2024, the then Prime Minister, Rishi Sunak MP, published a post on social media claiming that the asylum backlog had been cleared. This post quickly drew attention from the media and the public. Because we had previously engaged with journalists on the topic of asylum backlogs, they were quick to contact OSR and ask us to judge whether the statement was supported by the available evidence. The claim was also raised with us by members of the public and two members of Parliament – Alistair Carmichael MP and Stephen Kinnock MP.

Upon investigation, we were pleased to find that the Home Office had published an ad hoc release on the day of the announcement to support understanding. The data showed that, as of 28 December 2023, there were 4,537 cases remaining in the legacy backlog. These data were included in a press release which was clearer that the announcement related specifically to clearing the legacy asylum backlog, and which explained that, while all legacy cases had been reviewed, around 4,500 complex cases still required additional checks before a final decision could be made.

We concluded that the average member of the public was likely to interpret “cleared a backlog” – especially when presented without context on social media – as meaning that it had been eliminated entirely. We also recognised that there may have been good reasons for excluding complex cases from any commitment to eliminate the backlog over the timeframe the government chose. However, this argument was not made when the target was first announced, nor was it made clear in the subsequent announcement that the target had been met. This element of our judgement drew an important distinction between the Home Office press release and the Prime Minister’s social media post.

While we considered that the Home Office had taken actions to support intelligent transparency, the Prime Minister’s social media post highlighted the need for ministers to think carefully about how a reasonable person would interpret a quantitative claim of this sort and to consult the statistical professionals in their department. As well as responding publicly to Alistair Carmichael MP and Stephen Kinnock MP, we took the time to speak with the journalists who had raised their concerns with us and to explain our rationale for distinguishing between the press release and the social media post. As a result, the media coverage more accurately reflected our judgement and provided clarification to the wider public about the accuracy of the statement, including what the data did and did not show.

Over the last year, we have noted a shift in the types of cases that have been raised with us by media organisations. Cases raised by the media have previously focused on whether we as the statistics regulator deem a claim to be misleading. In 2023/24, we have seen an increase in cases raised by the media where a journalist has been unable to source data to verify a claim being made. This is in line with a broader trend in OSR’s casework, where cases raised relate to intelligent transparency rather than the accuracy of the figures.

The media are not the only partners we have worked with to achieve greater impact with our casework in 2023/24. We continue to work closely with independent fact-checking organisations, such as Full Fact, and with other regulators whose visions are aligned with ours and who support the good use of evidence in public debate. In some cases, the value of these partnerships lies in having multiple voices addressing the same issue from different angles. This can both raise awareness of the issue and empower members of the public to raise their own concerns about it. The case study below demonstrates the value of partnerships in highlighting the role of OSR’s interventions.

Case study – Comparisons of NHS performance across the UK

On 3 October 2023, the then Secretary of State for Health and Social Care, Steve Barclay MP, claimed ‘The number of patients in Wales escaping to seek treatment in England has increased by 40% in two years’. The comparability of statistics on NHS performance across the UK was something OSR had been exploring for some time, following similar comparisons made by other members of Parliament.

Full Fact published a fact check that concluded that the claim was based on unpublished data. Full Fact made us aware of its fact check and asked whether OSR was planning to take any action as a result of the claim. From our ongoing engagement with the relevant producers, we were aware that there were significant limitations in the comparability and coherence of these data, due to the devolution of policy and differences in operational delivery. Pushing for the publication of the data referred to by the then Secretary of State therefore risked misleading people, as the comparisons were not suitable.

Instead, we used this as an opportunity to highlight to Full Fact the work taking place across the UK statistical system (government departments, health departments and health bodies) to make it easier to understand the comparability of health data for England, Wales, Scotland and Northern Ireland. We supported ONS in publishing a blog post summarising the work to increase transparency about these data and the plans for their development. Working with Full Fact and the statistics producers in this way allowed OSR to take a proportionate response to the issue and ensured the priority was addressing the data gap rather than providing data to support an incorrect claim.

Taking a broad view of our remit to promote transparency across all data

While our statutory remit covers official statistics, we take a broad view of what falls within our scope. For many who encounter published numerical information, the distinction between official statistics and other data may not always be clear. As a result, we may consider issues involving data that could be perceived as official statistics, whether or not they are formally classified as such. When we consider cases that are outside our usual area of focus but nevertheless relate to the presentation of quantitative information in the public domain, our focus is on how the use of these numbers complies with our principles of intelligent transparency.

As awareness of OSR has increased, so has the range of organisations and types of data featured in the cases raised with us in 2023/24. In some cases, the concerns raised with us relate to organisations that produce official statistics, but where the data in question are not official statistics. We recognise that in these circumstances the distinction between official statistics and other data may not be clear, and we are willing to intervene to support public understanding even if the quantitative information in question is not classified as official statistics. Such cases could involve numbers based on evaluation or modelling exercises; commitments to spending or levels of resources allocated to a policy area; or forecasts of future outturns of fiscal or economic statistics. The case study below relates to concerns raised about the impact assessment report produced by the Welsh Government.

Case study – 20 mph limits in Wales

Following the implementation of the Welsh Government’s 20 mph speed limit, OSR received multiple concerns about the transparency of the sources supporting claims about the implementation – in particular, claims made in a promotional leaflet sent to households in Wales.

While the claims in the promotional material cited the sources for the figures quoted, we identified that improvements could have been made to aid transparency and better support users. The main source used for the claims was an Explanatory Memorandum to the Restricted Roads (20 mph Speed Limit) (Wales) Order 2022 from June 2022. The document did include some high-level information on how the increase in overall journey times was derived, and the uncertainty and limitations of the estimate were acknowledged. However, we judged that it would be challenging for a reader to unpick this detailed document to find and understand the data and calculations used to support this claim, and that it would not be reasonable to expect people who read the leaflet to do so.

To better support users in future, we wrote to the Welsh Government and recommended that it, in collaboration with Transport for Wales, should consider how it could make improvements to data and statistics on the implementation of the 20 mph speed limit, including considering the accessibility of data and supporting information.

We have also seen an increase in cases raised with OSR about the production or use of data by organisations outside government. These include issues relating to organisations that have voluntarily adopted the Code of Practice for Statistics, and concerns about local authority data that are of national significance. Where we have judged that it would be appropriate for us to comment and that we could add value, we have been willing to intervene in an advisory capacity. The case study below relates to an organisation that voluntarily applies the Code of Practice for Statistics but where the data in question were of high public interest.

Case study – ULEZ compliance

Munira Wilson MP raised concerns about the use of statistics on the Ultra Low Emission Zone (ULEZ), specifically the Mayor of London’s claim that ‘9 in 10 cars seen driving in outer London meet the ULEZ standard’.

While Transport for London (TfL) is not an official statistics producer, the Greater London Authority Group (which covers TfL) signed up to voluntarily apply the Code of Practice for Statistics in 2018. As such, TfL agreed to follow the principles of Trustworthiness, Quality and Value set out in the Code and to ensure that data were publicly available to support statistical claims, in line with our intelligent transparency principles.

We were unable to source underlying data to support the claims made in TfL’s press release, so we engaged with TfL and requested more information about its claim. TfL created and published an explainer page on its website setting out how it calculated the compliance rate. TfL was also keen to ensure that data are made available in future to support such claims, an improvement that we encourage and support.

We responded to Munira Wilson MP’s concerns, highlighting the explainer page and confirming that we were satisfied that the data collected by TfL supported the statement that 9 out of 10 cars seen driving on an average day in outer London met the ULEZ emissions standards.

Improving our casework service

Our casework volume remains higher than pre-pandemic levels. The volume of casework we receive continues to highlight the importance of the work we do to stand up for statistics and to address concerns about the quality, good practice and comprehensiveness of official statistics.

We have seen some improvements in our service delivery for casework compared with last year. Our median time to close a case decreased by nearly 50%, from 27 days in 2022/23 to 14 days in 2023/24. We have also started to analyse the number of topics raised in our casework as well as the number of individual pieces of correspondence. In 2023/24, there was little difference between the number of topics raised each month and the number of individual pieces of correspondence, which demonstrates the breadth of topics raised with OSR. We will continue to analyse these data to understand what they mean for our casework service.

In 2024/25 we want to improve our casework further by focusing on the perspective of the complainant. We are looking to improve the transparency of our interventions and to use different communication platforms to share our thinking and judgements. One such platform is LinkedIn, where we have started publishing a monthly newsletter about our casework to give better insight into the work we do at OSR. We are also launching a feedback survey for those who raise concerns, and we intend to report the findings from this survey in next year’s casework report.