Ethnicity facts and figures

Credible statistics that command trust are an essential public asset and the lifeblood of democratic debate. Statistics should be used to provide a window on our society and the economy. The value of data lies in its ability to help all in our society – from members of the public to businesses, charities, civil servants and Ministers – understand important issues and answer key questions.

In this context, the Cabinet Office's launch this week of the Ethnicity facts and figures website is a substantial achievement.

The website provides data from across Government departments on how outcomes from public services vary for people of different ethnicities. Some of this data has previously been published and some not. The website highlights many disparities in outcomes and treatment from public services. Specialists in particular areas – such as health, housing, and criminal justice – may have been aware of some of the data, but few will be familiar with all of it.

What makes the Ethnicity facts and figures website so valuable is that it draws together detailed information from across government and presents it accessibly, neutrally and dispassionately in one place – and all the data can be downloaded. What’s striking is that the website isn’t flashy in its use of visualisations and other data tools. It simply presents the data and describes it clearly and succinctly. In doing so, it gives visitors a clear picture.

This reflects the huge effort put into asking people what they want from the website – including members of the public, academics, central and local government, NGOs and open data experts.

This really is a model for how all statistics should be developed: find out what questions people across society want to answer, and figure out how best to present the data to them. It shows how Government departments could do much more to publish data with public users in mind – rather than simply publishing data in the way they always have done. Focusing on public users opens up the opportunity for innovative ways of presenting statistics.

So in my view this website is already starting to add value. But it’s also clearly still under development – more data will be added, and other refinements made as the site responds to the ways people are using it. And no doubt it will open new avenues for research and policy intervention: it makes information available so that the public can ask the question ‘why?’. That’s the first step to understanding.

And there’s one further thing to celebrate. Alongside the Ethnicity facts and figures website, the Cabinet Office has published a Statement of Compliance with the Code of Practice for Statistics.

Though the website draws on official statistics, it is not itself an official statistics publication (though it could become one in future) – for example, it didn’t follow the standard approach to publication that we expect of official statistics. Here the Statement of Compliance is really helpful as an exercise in transparency. It is clear about the judgements and processes that have gone into developing the website, and it recognises that the site doesn’t follow the Code’s publication protocols.

And the Statement draws strength from the draft Code’s three pillars – trustworthiness, quality and value – and explains how the work has been done using the pillars as a framework.

This is in effect the first example of what we call voluntary compliance – using the Code not as a statutory obligation but as a best practice guide.

On this voluntary approach, as in much else, the Ethnicity facts and figures website is an exemplar.

Health statistics

In the last few weeks, we’ve made three comments on health statistics – one in England, about leaks of accident and emergency data; one in Scotland, on statistics on delayed discharges; and one on analysis at the UK level. They all show the importance of improving the public value of statistics.

On accident and emergency statistics, I wrote to the heads of key NHS bodies in England to express concern about recent leaks of data on performance.

Leaks of management information are the antithesis of what the Office for Statistics Regulation stands for: public confidence in trustworthy, high quality and high value information.

It’s really hard to be confident about the quality of leaked information because it almost always lacks context, description, or any guidance for users. On value, leaked information usually relates to a question of public interest, but it is not in itself valuable, in the sense that it’s not clear how it relates to other information on the same topic. Its isolated nature undermines its value. And it’s hard for leaked information to demonstrate that it is trustworthy, because the anonymity of the “producer” of the information (the person who leaked it) means that motives can be ambiguous.

But leaks can highlight areas where there is concern about the public availability of information. And that was the constructive point of my letter: the NHS bodies could look at reducing the risk of leaks. One way of doing this would be to reduce the time lag between the collection of information on accident and emergency performance and its publication as official statistics. This lag is currently around six weeks – six weeks during which the performance information circulates around the health system but is not available publicly. Shorten this lag, I argued, and the risk of disorderly release of information should also fall.

The comments on Scotland relate to the comparability of statistics across the UK. When NHS Scotland’s Information Services Division published its statistics on delayed discharge from NHS hospitals for February, the Cabinet Secretary for Health and Sport in the Scottish Government noted that these figures compared favourably with the equivalent statistics in England.

This is of course an entirely reasonable thing for an elected representative to do – to comment on comparative performance. The problem was that ISD’s publication was missing the information users needed to interpret the Scottish statistics in the UK context – it wasn’t clear that the Scottish figures are compiled on a different basis from the English figures, so the comparison is not like for like. Nor was the difference stated alongside the equivalent statistics for England. ISD has now provided this clarification, and NHS England has agreed to make the differences between the figures clearer in its own publication.

For us, it’s really important that there is better comparability of statistics across the UK. While there are differences in health policy that will lead to different metrics and areas of focus, it’s quite clear that there is public interest in looking at some issues – like delayed discharge – across the four UK health systems.

In this situation, good statistics should help people make sound comparisons. Yet, with health and care being a devolved matter, there are some constraints on the comparability of statistics across England, Wales, Scotland, and Northern Ireland. And it is difficult for non-specialist users to know what is or is not comparable – delayed discharge data being a prime example. This is why we really welcome the recently published comparative work, led by the Scottish Government, in which statisticians have created a much more accessible picture of health care quality across the UK, pulling together data on acute care, avoidable hospital admissions, patient safety, and life expectancy/healthy life expectancy for all four UK countries.

Both these cases – the leaks and comparability – illustrate a broader point.

Health statistics in the UK should be much better. They should be more valuable, more coherent, in some cases more timely, and more comparable. If statistics do not allow society to get a clear picture in good time of what is going on, then they are failing to provide public value.