In the last few weeks, we’ve made three comments on health statistics – one in England, about leaks of accident and emergency data; one in Scotland, on statistics on delayed discharges; and one on analysis at the UK level. They all show the importance of improving the public value of statistics.

On accident and emergency statistics, I wrote to the heads of key NHS bodies in England to express concern about recent leaks of data on performance.

Leaks of management information are the antithesis of what the Office for Statistics Regulation stands for: public confidence in trustworthy, high quality and high value information.

It’s really hard to be confident about the quality of leaked information because it almost always lacks context, description, or any guidance to users. On value, leaked information usually relates to a question of public interest, but it’s not in itself valuable, in the sense that it’s not clear how it relates to other information on the same topic. Its isolated nature undermines its value. And it’s hard for leaked information to demonstrate that it is trustworthy, because the anonymous nature of the “producer” of the information (the person who leaked it) means that motives can be ambiguous.

But leaks can highlight areas where there is concern about the public availability of information. And that was the constructive point of my letter: the NHS bodies could look into reducing the risk of leaks. One way of doing this would be to reduce the time lag between the collection of the information on accident and emergency performance, and its publication as official statistics. This lag is currently around 6 weeks – 6 weeks during which the performance information circulates around the health system but is not available publicly. Shorten this lag, I argue, and the risk of disorderly release of information may also reduce.

The comments on Scotland relate to the comparability of statistics across the UK. When NHS Scotland’s Information Services Division published its statistics on delayed discharge from NHS hospitals for February, the Cabinet Secretary for Health and Sport in the Scottish Government noted that these figures compared favourably with the equivalent statistics in England.

This is of course an entirely reasonable thing for an elected representative to do – to comment on comparative performance. The problem was that ISD’s publication lacked information to help users interpret the Scottish statistics in the UK context – it wasn’t clear that the Scotland figures are compiled on a different basis to the England figures. So the comparison is not on a like for like basis. The difference wasn’t stated alongside the equivalent statistics for England either. This clarification has now been provided by ISD, and NHS England have agreed to make clearer the differences between the figures in their own publication.

For us, it’s really important that there is better comparability of statistics across the UK. While there are differences in health policy that will lead to different metrics and areas of focus, it’s quite clear that there is public interest in looking at some issues – like delayed discharge – across the four UK health systems.

In this situation, good statistics should help people make sound comparisons. Yet, with health and care being a devolved matter, there are some constraints on the comparability of statistics across England, Wales, Scotland, and Northern Ireland. And it is difficult for users without specialist knowledge to know what is or is not comparable – delayed discharge data being a prime example. This is why we really welcome the recently published comparative work, led by the Scottish Government, in which statisticians have created a much more accessible picture of health care quality across the UK, pulling together data on acute care, avoidable hospital admissions, patient safety, and life expectancy/healthy life expectancy across all four UK countries.

Both these cases – the leaks and comparability – illustrate a broader point.

Health statistics in the UK should be much better. They should be more valuable; more coherent; in some cases more timely; and more comparable. If statistics do not allow society to get a clear picture in good time of what is going on, then they are failing to provide public value.