Health statistics

In the last few weeks, we’ve made three comments on health statistics – one in England, about leaks of accident and emergency data; one in Scotland, on statistics on delayed discharges; and one on analysis at the UK level. They all show the importance of improving the public value of statistics.

On accident and emergency statistics, I wrote to the heads of key NHS bodies in England to express concern about recent leaks of data on performance.

Leaks of management information are the antithesis of what the Office for Statistics Regulation stands for: public confidence in trustworthy, high quality and high value information.

It’s really hard to be confident about the quality of leaked information because it almost always lacks context, description, or any guidance to users. On value, leaked information usually relates to a question of public interest, but it’s not in itself valuable, in the sense that it’s not clear how it relates to other information on the same topic. Its isolated nature undermines its value. And it’s hard for leaked information to demonstrate that it is trustworthy, because the anonymous nature of the “producer” of the information (the person who leaked it) means that motives can be ambiguous.

But leaks can highlight areas where there is concern about the public availability of information. And that was the constructive point of my letter: the NHS bodies could look into reducing the risk of leaks. One way of doing this would be to reduce the time lag between the collection of the information on accident and emergency performance, and its publication as official statistics. This lag is currently around 6 weeks – 6 weeks during which the performance information circulates around the health system but is not available publicly. Shorten this lag, I argue, and the risk of disorderly release of information may also reduce.

The comments on Scotland relate to the comparability of statistics across the UK. When NHS Scotland’s Information Services Division published its statistics on delayed discharge from NHS hospitals for February, the Cabinet Secretary for Health and Sport in the Scottish Government noted that these figures compared positively to the equivalent statistics in England.

This is of course an entirely reasonable thing for an elected representative to do – to comment on comparative performance. The problem was that ISD’s publication lacked the information users needed to interpret the Scottish statistics in the UK context – it wasn’t clear that the Scottish figures are compiled on a different basis to the England figures. So the comparison is not on a like-for-like basis. The difference wasn’t stated alongside the equivalent statistics for England either. This clarification has now been provided by ISD, and NHS England have agreed to make clearer the differences between the figures in their own publication.

For us, it’s really important that there is better comparability of statistics across the UK. While there are differences in health policy that will lead to different metrics and areas of focus, it’s quite clear that there is public interest in looking at some issues – like delayed discharge – across the four UK health systems.

In this situation, good statistics should help people make sound comparisons. Yet, with health and care being a devolved matter, there are some constraints on the comparability of statistics across England, Wales, Scotland, and Northern Ireland. And it is difficult for non-expert users to know what is or is not comparable – with delayed discharge data as a prime example. This is why we really welcome the recently published comparative work, led by the Scottish Government, where statisticians have created a much more accessible picture of health care quality across the UK, pulling together data on acute care, avoidable hospital admissions, patient safety, and life expectancy/healthy life expectancy across all four UK countries.

Both these cases – the leaks and comparability – illustrate a broader point.

Health statistics in the UK should be much better. They should be more valuable; more coherent; in some cases more timely; and more comparable. If statistics do not allow society to get a clear picture in good time of what is going on, then they are failing to provide public value.

Migration statistics

A key aim of the Office for Statistics Regulation is to be more systemic. We want to focus not on individual sets of statistics in isolation, but to look at how they are used alongside other datasets. This reflects our ambition to be champions of relevant statistics in a changing world.

Migration statistics are an area that is ripe for this approach. We have been looking at migration statistics from several angles, reviewing different aspects of migration. We have assessed the National Insurance numbers for adult overseas nationals statistics produced by the Department for Work and Pensions. We are reviewing the ONS’s estimates of student migration in the International Passenger Survey. And we are also looking at the way in which the ONS’s Labour Force Survey estimates the number of non-UK participants in the UK labour market. We will publish the results of these reviews over the coming months, starting next Thursday (26th January) with our assessment of National Insurance numbers for adult overseas nationals.

But when we produce this work we will also reiterate a broader point. There are a range of migration-related datasets available across different Government departments, including those from HMRC, DWP, ONS and the Home Office. The key to a comprehensive picture lies in bringing these datasets together. We will therefore emphasise the crucial role that John Pullinger plays as National Statistician in ensuring that there is a joined-up approach across Government.

This will build on the letter I wrote to John last March emphasising the importance of a comprehensive, coherent picture of migration. As I said then “it is particularly important that the different sets of data are brought together in a coherent way, fully quality assured and published in an orderly manner, to paint as full a picture as possible of the patterns of migration”. In 2017, we will continue to encourage a coherent approach to one of the most important areas of statistics in contemporary public debate.

The Office for Statistics Regulation

Today the UK Statistics Authority’s regulatory function is implementing a series of changes to the way it operates, and from today we will be known as the ‘Office for Statistics Regulation’.

Statistics are a valuable public asset. But like any asset, they can be misused, neglected, or left to become obsolete. The prevention of these harms sits at the heart of the Authority’s strategy, and that’s why we are clarifying the Authority’s regulatory work through the establishment of a more clearly distinct Office for Statistics Regulation.

The purpose of the Office for Statistics Regulation is to enhance public confidence in the trustworthiness, quality and value of statistics. We will continue to set standards through the Code; and to uphold those standards, celebrating when they are met and challenging publicly when they are not. At heart, we want to be champions of relevant public statistics in a changing world.

So this is a significant change – a clear statement of the importance of regulation through the creation of a distinct and visible new Office for Statistics Regulation. But the change of name is not meaningful if it’s not accompanied by regulatory decisions that reflect our ambitions.

We have several important outputs in coming weeks. First, when we look forward, we see ever-growing interest in the role of operational and Big Data. So today we are publishing further work on administrative data. Second, we will publish the first of our new-format Assessment reports over the next few weeks. These reports focus much more on the key outcomes we want from statistics – that they are trustworthy, high quality and of high public value. Third, in mid-December we will publish the conclusion of our Code of Practice stocktake, which will set an ambition for the Code to be an enabler of, not a barrier to, the highest public value in statistics. And we’ve got important new work coming out, for example on National Insurance numbers for adult overseas nationals, and on improving the system of health and care statistics in England.

On the use of statistics, we’ve been busy this autumn, commenting on the use of statistics on grammar schools; on clinical standards in health; and on increases in health spending. I expect this activity to continue.

Our most powerful tools are not a set of formal sanctions or detailed rules, but three other tools: first, our public voice; second, the National Statistics designation as a brand indicating the highest standards; and third, the Code itself as a way of establishing norms of professional behaviour in the production and communication of statistics. I think this distinctive approach is important, and I’d like to write another blog about it in a few weeks’ time.

We will also strive for improvement in our own work. The work we are publishing today on administrative data consists of an honest review of the experience of statistical teams across government as they implemented our standard on administrative data. We find much to learn from, and much to respect, in the excellent work of the Government Statistical Service.

This striving for improvement underpins the move to an Office for Statistics Regulation. We are making this change to improve, because we recognise that in the past the Authority’s regulatory work has not always had the impact it should have had; and because we care deeply about the role of statistics as a core asset for government and for society.