Following the publication of our Analytical leadership: achieving better outcomes for citizens report in March 2024, we are running a series of blogs to highlight examples of strong analytical leadership in practice. Analytical leadership is a professional way of working with data, analysis or statistics that ensures the right data are available for effective policy and decision-making to improve the lives of citizens. Everyone in government can demonstrate analytical leadership, regardless of their profession or seniority, by drawing on the six enablers of analytical leadership and a ‘Think TQV’ approach.
Our latest blog in this series is from Robert Cann, Policy and Government Relations Manager at Full Fact. Robert highlights that the way statistics are presented is crucial to how they are interpreted and understood by the public. He finds that presenting statistics without relevant context or caveats, describing them incorrectly, or giving them too much weight can give an incomplete or misleading picture and misinform public debate.
Full Fact’s work has clear relevance to our analytical leadership findings, particularly the importance of being transparent about the evidence used; leading through the effective communication of evidence; and challenging evidence misrepresentation and misuse. ‘Demonstrating transparency and integrity’ in these ways has the potential to build public confidence in how analytical evidence is produced and used, and in the policies and wider decisions based on that evidence. It is also directly relevant to OSR’s work on Intelligent Transparency.
We’ve had a new government in place for nearly two months. Will we see an improvement in how the Labour Government’s ministers use data, compared with their predecessors? In this blog I look at how Full Fact will be monitoring their behaviour, and set out what we would like to see in terms of statistical best practice. But first, some background.
Full Fact is a team of independent fact checkers and campaigners who find, expose and counter the harms of bad information. We do this in the following ways:
- We fact check claims made by politicians and public institutions, and also look into viral content online;
- We then follow up on these fact checks to stop or reduce the spread of specific claims;
- We campaign for systems changes to help make bad information rarer and less harmful, and advocate for higher standards in public debate.
We were founded in 2010, and are the UK’s leading independent fact checking organisation.
The recent general election presented us with a distinct and intense period of political fact checking: over the course of the campaign we published an estimated 217 verdicts on new or repeated claims. But now our focus returns to the more day-to-day task of keeping a close eye on everything the new government says, in particular how it uses statistics. The way statistics are presented is a crucial part of how they are interpreted and understood by the public. If they are presented without context or caveats, described incorrectly, or given too much weight, this can give an incomplete or misleading picture, which can then misinform policy debate and development.
If trends continue in the same vein as under the previous government, we’re likely to be kept quite busy. When it comes to misuse of data, we tend to see three broad categories, each of which I’ll illustrate with an example fact check of the previous Prime Minister. Firstly, claims based on unpublished data: we found no evidence to support Rishi Sunak’s claim that there were 6,000 fewer people in the asylum backlog. Secondly, claims based on non-existent data: last summer, data didn’t back up the then PM’s claim that A&E waits were ‘the best in two years’. Finally, and perhaps most common, claims based on selective use of data: in November last year, Rishi Sunak selectively used data to claim that the UK was doubling aid for Palestinian civilians, failing to mention that in prior years it had been cut drastically, thereby rendering the claim of ‘doubling’ somewhat misleading.
So much for the bad. What does good look like? An excellent place to start is, of course, adhering to the Code of Practice for Statistics, overseen by the OSR. The Code is clear that government statements that refer to official statistics should contain a prominent link to the source data, and be accurate, clear and impartial.
We recognise that in some cases misleading use of data can happen in error. So after fact checking government statements, we usually choose to request that the error is corrected, or that further context is provided. Sometimes this leads to a welcome correction or clarification. For example, Rishi Sunak corrected the record after we pointed out to him that his claim about employment figures in the North Sea oil industry was incorrect.
Earlier this year we intervened after publishing a fact check on NHS England’s erroneous claim that 3.4 million children in England were “unprotected” against measles. It was important we took action: this 3.4 million figure had subsequently spread to multiple news reports, as well as to statements from ministers in the House of Commons and House of Lords. The figure was incorrect because it was actually an upper estimate of the number of children who might have missed at least one dose of the MMR vaccine. It included many children recorded as having had one MMR dose, who are considered “not fully protected” rather than “unprotected”, and it is likely to have included some children who hadn’t missed a dose but whose records weren’t up to date. The true number of children who are actually “unprotected” against measles is likely to be much lower.
Once we had published our fact check, we contacted online media outlets that repeated the claim. At the time of writing, The Mirror, The Independent, Mail Online, Metro and Associated Press have made corrections, and The Telegraph has added a clarification.
We also contacted NHS England, the OSR, and the government ministers (Maria Caulfield and Lord Markham) who had spoken in Parliament repeating this claim. NHS England published a detailed correction to its statement. Amendments were also made to Maria Caulfield and Lord Markham’s statements in the House of Commons and House of Lords respectively.
By taking action to correct the record in this way, both NHS England and government ministers acted in the spirit of the OSR’s Analytical Leadership enablers. In particular, enabler 2, which is to:
> Demonstrate transparency and integrity. This enabler demonstrates the need for analytical evidence to feed into policy and meet public needs in an orderly and transparent way; to demonstrate leadership through the effective communication of evidence; and the integrity to challenge evidence misrepresentation and misuse, including by correcting the public record.
We don’t always get such a great result from our interventions, though. When misleading statements are made repeatedly, even after the error has been pointed out, the repetition may well be in pursuit of political advantage. So it’s time systems were put in place to minimise the temptation to pull the wool over the public’s eyes in this way.
We are therefore making the following policy recommendations to ensure that government ministers and departments use official statistics and analysis in a way that is fair, accurate and transparent:
- The previous government’s Ministerial Code stated that ministers ‘need to be mindful’ of the UK Statistics Authority’s Code of Practice for Statistics when using official statistics. Keir Starmer is likely to publish his own version of the Ministerial Code soon. We want this to be strengthened to make clear that ministers must adhere to the principles of the Code of Practice for Statistics for all data they use to back up statements they make. Furthermore, we want the Ministerial Code itself to become statutory.
- Government ministers and departments should correct themselves when challenged and ensure that errors are not repeated. The OSR must continue to act swiftly and publicly to call out non-compliance.
- Permanent Secretaries and the Heads of Professions for Statistics should take the lead in fostering a culture of transparency and integrity within their departments and ensuring staff are properly trained so that they understand these expectations and have the skills to meet them.
- Parliament and Select Committees should take a more active role in scrutinising ministers and departments and holding them to account for the way they evidence their claims. To support this, each department’s annual report should set out any concerns raised publicly by the OSR, together with the department’s response.
We will be raising these issues during our regular dialogue with government ministers and parliamentarians and, where appropriate, via public campaigns.
For further reading about Full Fact’s work in this area, do check out Part 2 of the 2024 Full Fact Report, Trust and truth in the age of AI.
Robert Cann, Policy and Government Relations Manager
Blogs in this series:
- Fostering a robust government evaluation culture
- Transparency, integrity, and independence: the keys to improving Budget scrutiny and public understanding of risks to public finances
- Achieving linked data insights to improve lives: a leadership perspective
- Collaborative leadership: Drawing on our different strengths to answer important questions
- Ensuring that analytical leadership is fit for the future
- Demonstrating transparency and integrity to support public trust