COVID-19 Testing Data

UKSA Chair Sir David Norgrove has written to Matt Hancock, Secretary of State for Health and Social Care, to reiterate concerns about the official data on testing and to highlight the importance of good data as the Test and Trace programme is taken forward.

Statistics published by government should shed light on key issues. They should enable the public to make informed decisions and hold the government to account. The public interest in data around COVID-19 is unquestionable; we have seen this come through our media and social media monitoring, as well as in the emails we have been receiving.

The government has made a commitment to improve the information available on COVID-19, including additional data on COVID-19 testing and testing capacity, which are now being published. It has also committed to providing greater clarity on data collection methods and their associated limitations, which we look forward to seeing.

However, as Sir David Norgrove said in his letter, the data still fall short of the expectations set out in the Code of Practice for Statistics.

In Sir David’s letter he sets out his view that the testing data should serve two purposes:

  1. To support understanding of the prevalence of COVID-19, including understanding more about where infections occur and who is infected.
  2. To manage the testing programme – and going forward the approach to test and trace. The data should allow government and the public to understand how effectively the programme is being managed.

The data currently published are not sufficiently clear or comprehensive to support these aims.

The Office for Statistics Regulation champions statistics that serve the public good and we will continue to work with officials in the Department of Health and Social Care as it works hard to develop these important data.

Adult social care statistics across Great Britain: the power and potential for change

Social care is in crisis across Great Britain. With responsibility for social care being a devolved matter, each country has its own system in place but all share common problems. Given the urgency of the crisis, which is exacerbated by the current Covid-19 pandemic, there is growing interest in what others are doing and what works and what doesn’t. While a fair bit of work has been done looking at the international picture, we know less about what is happening closer to home. What are the other British countries doing, what do they have in common and where do they diverge? What can we learn from each other’s successes and failures, and what solutions can we adopt?

At the Nuffield Trust, we have spent the last couple of months immersing ourselves in the different social care systems across the UK, trying to get underneath their peculiarities and identify common ground and points of learning. But what we found was a frustrating lack of high-quality, comparable data. Delving into the data landscape across the UK countries, we quickly came to realise the limitations of our undertaking and the urgent need for data improvements.

Within this context, we welcome the publication of the review of adult social care data across Great Britain by the Office for Statistics Regulation (OSR). The call for better data and greater collaboration across the countries is timely, and the individual country reports are an invaluable resource for understanding the strengths and weaknesses of social care data within each country and the underlying policy differences.

Why data matters

Lack of funding in social care does not only mean that services are unable to meet demand. Equally important is the under-investment in data and analytics. This is not just a frustrating problem for the small group of social care researchers and data enthusiasts. Poor data matters in very practical terms. It hinders the development of a strong evidence base to solve problems and guide effective policy making aimed at improving care delivery and the quality of life of service users and their families. It means that we cannot understand whether one country’s approach has been successful and how it might work in another place.

For instance, during the run up to the 2019 general election, some proposed the introduction of free personal care in England akin to the system established in Scotland. To assess whether this or other proposals would be a viable or desirable option for England, we need to look at data. It sheds light on the distinct contexts, the nature of the different social care systems and the impact of different policies and initiatives. It allows for comparisons and evaluation of best practice and it lends concrete evidence to lived experience and anecdotes. Sounds simple but such a comparison proved far from straightforward.

So what would help?

  1. Ease of access

Easy and straightforward access to relevant data is crucial to aid research, policy making and the public in making informed decisions. However, like the data users the OSR spoke to, we struggled to find relevant statistics, often having to navigate different websites and sources, with data in some cases deeply hidden. Portals that aim to collate all available data, such as the National Social Care Data Portal for Wales, were helpful. Seeing these realised across the countries would help to increase the use and profile of social care statistics.

  2. Comparability

Once data have been accessed, we need clear definitions and improved guidance on what the statistics entail, as well as detailed statistical commentary. Across all countries, data quality issues and missing returns are a problem, and their impact was not always clear.

Another core issue identified by the OSR report, and in our work, is the comparability of data across social care activities, time and regions. Wanting to compare key social care indicators across the countries, we often struggled to understand what exactly is captured. Even comparing headline figures such as the number of people receiving care in each country was far from straightforward. For instance, does access data include signposting to services as opposed to actual delivery of some form of social care activity? Does it include meals on wheels or are those reported separately?

  3. Consistency

In addition, to evaluate the impact of different policies and approaches we require consistent data collection and reporting over time, but instead we are faced with breaks in time series and changing definitions and processes. As highlighted by the OSR, where possible, data producers in all countries should follow harmonised principles, such as on informal care, which would aid comparability across regions and settings.

  4. Comprehensiveness

Linked to this, the data did not always meet our needs. We would have liked a better understanding of the characteristics of those receiving care, such as more granular age breakdowns, the level and nature of their needs, and the type of care and care setting. And we identified many gaps in the data that limited our ability to understand the intricacies of the different systems. Across the board, robust data on informal caregiving and self-funders are lacking, with estimates at best. Likewise, we have little understanding of undermet and unmet need, making it difficult to appreciate how well the different systems meet demand and to plan for the challenges ahead.

The way forward

The publication of the review of adult social care data across GB is very timely and welcome. It is an important advocate for change, calling for improvements to existing statistics, including data coherence, standardisation and quality, filling data gaps, and sharing of best practice. Given the Covid-19 crisis, the need for good data has become more apparent than ever, as has the crucial role of social care in our society. Implementing the report’s recommendations will aid researchers in developing solutions by creating a robust data landscape and facilitating cross-border cooperation and learning. This will ultimately improve care for all citizens across the UK.

We hope that the respective bodies will take the findings to heart: improving ease of access and data quality to facilitate greater use of high-quality data, standardising processes to ensure comparability across settings and countries, and filling the gaps in the data. Where resources are low and problems are shared, this is best done in collaboration.

This is a guest blog from Laura Schlepper (Research Analyst, Nuffield Trust).

The (almost) mental health ‘data revolution’

Mental health has been at the forefront of news and debate for a while now. This continued focus can only be a good thing for those needing better care, for the workforce in the field, and for the services provided.

In 2016, the NHS England ‘Five Year Forward View for Mental Health’ called for a data and transparency revolution. I loved this call to arms – it simply and passionately made the point that without data to monitor the health service system and hold it to account, mental health patients would be done a disservice.

So, that was then. Where are we now? A bit further along Revolution Road but not there yet.

We have more data collected and published providing insight on mental health services and the prevalence of mental health disorders, which is great. For example, in 2015 it was difficult to know how much NHS England spent on mental health services, but in 2018 we can find a figure for this in the Mental Health Five Year Forward View Dashboard. Statisticians have also been working hard to create a Mental Health Services Dataset, which can provide greater insight into the services provided for adults and children accessing mental health care. And I look forward to the publication of the National Study of Health and Wellbeing, which will provide much-needed updated prevalence estimates of mental health disorders for children and young people in England.

What more can be done to deliver the revolution? Well, we know that being transparent helps demonstrate trustworthiness. Trust in numbers isn’t built simply by publishing data; more is needed. The data need to be easy to find, accompanied by contextual information so that everyone can assess their accuracy, and presented so that everyone can understand and easily use them.

But, when looking at mental health data in England it is difficult to make sense of them.

This point was demonstrated after a Twitter debate between the Secretary of State for Health and Social Care and actor Ralf Little. Some claims were contradictory and it was difficult to determine why. Other claims couldn’t be verified because data weren’t published – either in the breakdown needed or at all. These difficulties were all brought to light by Full Fact’s work to communicate to the public a fair and accurate verification of the data used. But these difficulties indicate to me that statisticians need to do more to help people find, understand and use these important data. And by people, I mean everyone, because often it is the needs of experts that are catered for, while others’ needs are not.

Sir David Norgrove and I have had to intervene on several occasions in the last three months asking statisticians to make their data more accessible (as listed at the end of my blog), assessable, and useable (in the words of Baroness Onora O’Neill). Statisticians have responded well to our interventions, for example by providing new insight into the mental health workforce and NHS spending on mental health services. And these changes will hopefully enhance future public debates and discussions about mental health.

But, I would rather that improvements like these were made without our intervention. A data and transparency revolution requires much more to be done by the different organisations that publish mental health data to enable people to find, understand, and use such important data.


Health and Social Care Statistics

In February, I asked NHS England and NHS Digital to consider publishing more timely statistics on accident and emergency performance. In particular, I asked them to look at the time lag between collection and publication of the statistics.

I’m impressed by their response, which we’ve put on our website today.

They propose to reduce the time lag from 6 weeks to 2 weeks, along with changes to provide more timely data in other important areas, such as ambulance services.

This is great news for users of these statistics, and it fits with the Office for Statistics Regulation’s broader drive to improve the coherence, accessibility and value of health and social care statistics.

Over the past two years I have focused a lot on health and social care statistics. This reflects the underlying value of these statistics. They have the power to help save lives and to help make life more comfortable for those who need it – and of course it has ever been thus: iconic figures in the history of statistics like John Snow and Florence Nightingale crop up frequently (perhaps to the point of cliché…) in speeches and presentations about the power of statistics to change lives.

The Office for Statistics Regulation identified the need for improvements in the accessibility and coherence of, and insight provided by, health and social care statistics in England in 2015. Since then we’ve proceeded by dialogue: with producers, through a series of round tables, and with users, through our health summit in July 2016.

After our most recent round table, we’re more confident that producers of these statistics are taking ownership of the project to improve health and care statistics. We decided that this work no longer needed us to organise or chair discussions. Producers themselves will carry on this cross-departmental dialogue.

The changes to accident and emergency statistics announced today further reflect progress that producers are making.

My team is continuing to review health and care statistics to ensure user needs are being met, and we will also be investing time in holding a second ‘Health and Social Care Statistics Summit’ in the late autumn. If you would like to participate, or have any thoughts on areas where we should be doing more work – whether commending progress or seeking improvements – please let me know.

And of course it’s far too early to declare victory. Health and social care statistics, with their power to save and improve lives, will remain a key priority for me. I look forward to and expect many more cases of reform and change.

Health statistics

In the last few weeks, we’ve made three comments on health statistics – one in England, about leaks of accident and emergency data; one in Scotland, on statistics on delayed discharges; and one on analysis at the UK level. They all show the importance of improving the public value of statistics.

On accident and emergency statistics, I wrote to the heads of key NHS bodies in England to express concern about recent leaks of data on performance.

Leaks of management information are the antithesis of what the Office for Statistics Regulation stands for: public confidence in trustworthy, high quality and high value information.

It’s really hard to be confident about the quality of leaked information because it almost always lacks context, description, or any guidance for users. On value, leaked information usually relates to a question of public interest, but it’s not in itself valuable, in the sense that it’s not clear how it relates to other information on the same topic. Its separate, isolated nature undermines its value. And it’s hard for leaked information to demonstrate that it is trustworthy, because the anonymous nature of the “producer” of the information (the person who leaked it) means that motives can be ambiguous.

But leaks can highlight areas where there is concern about the public availability of information. And that was the constructive point of my letter: the NHS bodies could look into reducing the risk of leaks. One way of doing this would be to reduce the time lag between the collection of the information on accident and emergency performance, and its publication as official statistics. This lag is currently around 6 weeks – 6 weeks during which the performance information circulates around the health system but is not available publicly. Shorten this lag, I argue, and the risk of disorderly release of information may also reduce.

The comments on Scotland relate to the comparability of statistics across the UK. When NHS Scotland’s Information Services Division published its statistics on delayed discharge from NHS hospitals for February, the Cabinet Secretary for Health and Sport in the Scottish Government noted that these figures compared positively to the equivalent statistics in England.

This is of course an entirely reasonable thing for an elected representative to do – to comment on comparative performance. The problem was that ISD’s publication gave users no information on how to interpret the Scottish statistics in the UK context – it wasn’t clear that the Scottish figures are compiled on a different basis from the English figures, so the comparison is not like for like. The difference wasn’t stated alongside the equivalent statistics for England either. This clarification has now been provided by ISD, and NHS England have agreed to make the differences between the figures clearer in their own publication.

For us, it’s really important that there is better comparability of statistics across the UK. While there are differences in health policy that will lead to different metrics and areas of focus, it’s quite clear that there is public interest in looking at some issues – like delayed discharge – across the four UK health systems.

In this situation, good statistics should help people make sound comparisons. Yet, with health and care being a devolved matter, there are some constraints on the comparability of statistics across England, Wales, Scotland, and Northern Ireland. And it is difficult for users to know what is or is not comparable – with delayed discharge data a prime example. This is why we really welcome the recently published comparative work, led by the Scottish Government, in which statisticians have created a much more accessible picture of health care quality across the UK, pulling together data on acute care, avoidable hospital admissions, patient safety, and life expectancy/healthy life expectancy across all four UK countries.

Both these cases – the leaks and comparability – illustrate a broader point.

Health statistics in the UK should be much better. They should be more valuable; more coherent; in some cases more timely; and more comparable. If statistics do not allow society to get a clear picture in good time of what is going on, then they are failing to provide public value.